Google Removes Breadcrumbs from Mobile Search Results

To improve the user experience on mobile devices, Google has removed breadcrumbs from mobile search results.

Now, users on mobile devices will only see the domain name for a search result, alongside the title, description, and other elements.

Breadcrumbs were introduced to improve the user experience because they help users understand where a page sits within a website. However, the limited screen size of mobile devices often truncates breadcrumbs, which is why Google decided to deprecate this feature on mobile.

However, breadcrumbs will remain present in Google Search on desktop devices, and there is no need to change how breadcrumbs are implemented on your website.

This change has been rolled out to all regions where Google Search is available.

How to Identify the Number of Google Maps Listings a Brand Has in a Particular Location?

While doing local SEO for a brand, you may sometimes need to know how many Google Maps listings the brand has in a particular location.

Well, this is understandable, because the more Google Maps listings a brand has, the better the chances that its listings rank higher in Google Search for different queries.

So, without further ado, here is a trick to find this out. We will use Google search operators to do it easily.

  1. First, open Google Maps.
  2. Type the following search operator syntax: “brandname” “store” OR “dealer” OR “showroom” location:locationname
  3. Hit Enter.

Note: The keywords brandname and locationname are variables; change them as per your need. For example (with a hypothetical brand): “Acme Motors” “store” OR “dealer” OR “showroom” location:Mumbai

Studio Makes WordPress Local Development Simpler

I recently tried Studio by WordPress.com, and I have to say—it’s a developer’s dream come true. This lightweight, dependency-free tool makes local WordPress development a breeze. Whether you’re syncing with production sites or sharing demo sites with clients, Studio keeps everything smooth and efficient.

Here’s what I loved:

  1. Quick Setup: No Docker, no MySQL, just instant WordPress installations.
  2. Demo Sites: Share live snapshots of your projects for real-time feedback.
  3. One-Click WP Admin: Say goodbye to login headaches.
  4. AI-Powered Studio Assistant: Need help managing plugins or running commands? The Assistant has you covered.

It’s free, fast, and packed with features that save time. If you’re building with WordPress, Studio is worth checking out.

⭐ Verdict: A must-have for any WordPress developer looking to streamline their workflow.

👉 Try Studio Now

#WordPress #WebDev #TechTools

Does a URL Always Need to be in Lowercase in SEO?

While doing SEO for a website, many of us ask, “Does a URL always need to be in lowercase?”

Well, in a recent Bluesky post, John Mueller clarified that URLs don’t always need to be in lowercase.

Instead, the URL structure should be uniform: if your website consistently uses lowercase URLs, then keep using lowercase URLs, and vice versa.

John Mueller also clarified that one should check the casing of URLs present in the structured data to make a better decision about the URL structure.

Also, one should consider using kebab-case (lowercase words separated by hyphens) as well.
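The uniformity advice above is easy to check programmatically. Here is a minimal Python sketch (the URLs and the function name are hypothetical, for illustration only) that flags URLs whose path breaks an all-lowercase convention:

```python
from urllib.parse import urlparse

def find_mixed_case_paths(urls):
    """Return URLs whose path contains uppercase characters,
    so case inconsistencies across a site stand out."""
    flagged = []
    for url in urls:
        path = urlparse(url).path
        if path != path.lower():
            flagged.append(url)
    return flagged

# Hypothetical sample: one URL breaks the lowercase, kebab-case convention.
urls = [
    "https://example.com/blog/seo-tips",
    "https://example.com/blog/URL-Structure",
]
print(find_mixed_case_paths(urls))  # ['https://example.com/blog/URL-Structure']
```

Running something like this over a sitemap export is a quick way to confirm your URLs really are uniform before deciding on a convention.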

Link Parity Affects Crawl Budget

Google has updated one of its Search Central documentation pages related to crawl budget.

In the updated documentation, Google says that a lack of link parity between the desktop and mobile versions of a webpage (i.e., the mobile version missing links that the desktop version has) can affect the crawling of a website.

Do note that, back in September 2020, Google rolled out mobile-first indexing, which means Google predominantly uses the mobile version of a webpage for indexing and ranking.

Now, reinforcing mobile-first indexing further, Google says: “If your website uses separate HTML for mobile and desktop versions, then provide the same set of links on the mobile version and ensure that all these links are included in the sitemap for better discovery.”

Which means there should be link parity between the desktop and mobile versions of a webpage.
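One rough way to spot missing link parity is to compare the anchor links of the two HTML versions. Below is a minimal Python sketch using only the standard library; the HTML fragments and function names are hypothetical, for illustration only:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every <a> tag in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def link_diff(desktop_html, mobile_html):
    """Return links present on the desktop version but missing on mobile."""
    desktop, mobile = LinkCollector(), LinkCollector()
    desktop.feed(desktop_html)
    mobile.feed(mobile_html)
    return desktop.links - mobile.links

# Hypothetical fragments: the mobile version is missing the /pricing link.
desktop = '<a href="/about">About</a> <a href="/pricing">Pricing</a>'
mobile = '<a href="/about">About</a>'
print(link_diff(desktop, mobile))  # {'/pricing'}
```

A non-empty result means the mobile version is hiding links from Google, which is exactly the situation the updated documentation warns against.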

OpenAI Launches ChatGPT Search Engine

After beta testing SearchGPT (which is also called the ChatGPT search engine) for a few months, OpenAI has finally made it live in ChatGPT.

For those of you who don’t know, SearchGPT is a new search experience integrated into ChatGPT itself. It provides real-time, localized answers to the user’s query and gives proper attribution to the sources OpenAI has partnered with, or to websites that allow crawling by ChatGPT’s bots.

Here is a list of some of the biggest news organizations OpenAI has partnered with for sourcing information:

  1. Associated Press
  2. Axel Springer
  3. The Financial Times
  4. Reuters

Once the user enters a query, SearchGPT gives a response, and below each response there is a Sources button; clicking it reveals the list of sources from which ChatGPT extracted the information to build the response.

Apart from that, users can ask follow-up questions about the response provided.

This new search experience has been made available to all ChatGPT Plus and Team users, and it will soon be made available to free users.

How to Track Traffic Coming from ChatGPT?

Currently, ChatGPT has around 200 million weekly active users, and that number is growing day by day.

Also, ChatGPT gives attribution to the sources from which it extracts information, so it is possible for those sources to get traffic from ChatGPT as well.

Hence, tracking traffic coming from ChatGPT is essential for keeping your analytics reports clear (especially when you are optimizing your website for AI-powered search engines).

Conveniently, there is a UTM parameter you can use to track traffic coming from ChatGPT.

Now, suppose you are using GA4 to track the traffic coming to your website from different sources.

Then, you can simply filter on the UTM parameter “utm_source=chatgpt.com” to see the traffic coming to your website from chatgpt.com.
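If you also process raw landing-page URLs yourself (for example, from server logs), the same parameter can be detected with a few lines of Python. This is a hypothetical sketch, not part of GA4; the function name and URLs are made up for illustration:

```python
from urllib.parse import urlparse, parse_qs

def is_chatgpt_referral(landing_url):
    """Check whether a landing-page URL carries the
    utm_source=chatgpt.com parameter ChatGPT appends to cited links."""
    params = parse_qs(urlparse(landing_url).query)
    return params.get("utm_source") == ["chatgpt.com"]

# Hypothetical log entries:
print(is_chatgpt_referral("https://example.com/post?utm_source=chatgpt.com"))  # True
print(is_chatgpt_referral("https://example.com/post?utm_source=google"))       # False
```

Counting the URLs for which this returns True gives you a quick ChatGPT-traffic tally without touching your analytics setup.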

Google Search Removes Support for Sitelinks Search Box

Starting November 21, 2024, Google Search is removing support for the sitelinks search box.

Google Search introduced this feature ten years ago, but it has now noticed that usage has dropped, and hence it decided to remove the feature.

This change has been rolled out globally for all countries and languages, and it will not affect the ranking of other sitelinks visual elements.

Once Google Search stops showing the sitelinks search box, it will also be removed from Google Search Console reports.

Now, while the sitelinks search box has been removed from Google Search and its report from Google Search Console, you can still keep its structured data, since it remains valid structured data for your website and causes no issues if left in place.

Google Rolling Out AI-Organized Search Results in the US

To improve the user experience and ease of access to accurate information, Google has started rolling out AI-organized search results in the US. But what is so special about these results, and how will they differ from existing results? Let’s explore.

What are AI-Organized Search Results?

As the name hints, it means the search results are organized by artificial intelligence.

This new Google Search feature will give you personalized results based on your entered query.

For example, if you search for a keyword like “vegetarian pizza recipe”, you will get different results grouped by category/section. Each category showcases results from a different perspective: for example, a section dedicated to top vegetarian pizza recipes, another section for easy vegetarian dips, another titled “Explore by Ingredient”, and so on. Check the image below for a better understanding of this feature.

Google AI Organized Search Result Demo

Note: At the time of writing, this feature has been rolled out for queries related to recipes and meal inspiration, but it should not take long for Google to roll it out for other niche keywords as well.

Well, this is not an entirely new feature, as the Google Search team has been testing it since 2007. However, the feature is now rolling out, starting with the US for select niche keywords.

If you have any helpful information on this new topic, feel free to let me know in the comments down below.

How to Efficiently Crawl a Next.js Website in Screaming Frog?

While doing SEO for a website built on Next.js (or any other JavaScript technology), it’s important to crawl the website efficiently in Screaming Frog. But JavaScript websites are a little more complicated to crawl than websites built with technologies like PHP, plain HTML, and so on.

Below, I have shared a few configurations that you can use to efficiently crawl a Next.js website in Screaming Frog.

Note: You can use the configurations below for any other JavaScript website as well.

  1. JavaScript Rendering: To enable this option, visit Configuration > Spider > Rendering > JavaScript. Do note that JavaScript crawling is slower than text-only rendering.

  2. Crawl Linked Sitemaps: To enable this option, visit Configuration > Crawl Configuration > Crawl Linked XML Sitemaps.

  3. Auto-Discover XML Sitemaps via robots.txt: You can find this option at Configuration > Crawl Configuration. Enabling it will automatically fetch the website’s sitemap URL from the website’s robots.txt file.

  4. Crawl These Sitemaps: This option lets you feed the website’s sitemap manually. It is also located at Configuration > Crawl Configuration.

  5. Crawl and Store JavaScript Resource Links: While crawling a JavaScript-based website in Screaming Frog, make sure to enable the options to crawl and store JavaScript resource links. You can find them at Configuration > Crawl Configuration.

  6. User Agent: You can also change Screaming Frog’s crawling user agent to Googlebot (Smartphone) via Configuration > User Agent.

  7. Speed: As mentioned, crawling a JavaScript-based website is comparatively slower, but you can increase the crawl speed by tweaking this option. Visit Configuration > Speed, where you can increase the maximum number of threads and maximum URLs crawled per second.

If I forgot to mention any configuration, please let me know in the comments down below.

GSC Performance Report Filters are now Sticky

GSC performance report filters are now sticky, which means the filters will stick to the last settings where you left off.

In GSC, there is now a reset filters button; clicking it will reset all the applied filters.

In fact, if you have set filters for Search performance, Discover performance, or News performance (in the performance report), the filters will remain in place unless and until you reset them.

In SEO, GSC performance report filters play an important role, as they help narrow results down to what we actually want.

And now that the filters are sticky, getting back to the results with the filters you applied earlier has never been easier.

What will happen if we use both the Canonical and No-Index Tag on a Webpage?

Have you ever questioned “What will happen if we use both the Canonical and No-Index tag on the webpage?”.

To understand this, let me explain the two things:

  1. No-Index: It is a directive, meaning Google must obey it.
  2. Canonical: It is a link element that tells search engines which URL is the preferred version of a page. It is a strong hint, but the crawler may still ignore it and pick a different canonical (which is why we may sometimes get duplicate-content indexing issues in GSC).
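Sending both signals at once is easy to catch before a page ships. Here is a minimal Python sketch (the HTML snippet and the class/function names are hypothetical, for illustration only) that flags a page declaring both a robots noindex meta tag and a canonical link:

```python
from html.parser import HTMLParser

class HeadTagAuditor(HTMLParser):
    """Record whether a page declares a robots noindex meta tag
    and/or a rel=canonical link."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "robots" and "noindex" in a.get("content", ""):
            self.noindex = True
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def audit(html):
    parser = HeadTagAuditor()
    parser.feed(html)
    if parser.noindex and parser.canonical:
        return "conflict: noindex and canonical on the same page"
    return "ok"

# Hypothetical page sending both signals at once:
page = ('<meta name="robots" content="noindex">'
        '<link rel="canonical" href="https://example.com/original/">')
print(audit(page))  # conflict: noindex and canonical on the same page
```

Running a check like this across templates helps ensure each page sends one clear signal rather than a directive and a hint that pull in different directions.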

Read More →

Google Search has removed the Cache Search Operator

Deeming the feature redundant now that webpages load much more reliably, Google has removed the cache search operator.

The cache: operator no longer works on Google Search.

Google has basically replaced the cache search operator with an Internet Archive (Wayback Machine) link shown in a search result’s “About this result” section. You can read more about it in this blog post.

After removing the functionality of the cache search operator, Google Search has also removed its documentation.

Now, to see an older version of a webpage, users have to check the Internet Archive link present alongside each search result.

SearchGPT Optimization for Websites 2024

SearchGPT is an AI-powered search engine that combines the power of traditional search engines with the conversational abilities of large language models.

SearchGPT works on Retrieval-Augmented Generation (RAG), which is also used by Perplexity and Google AI Overviews.

Retrieval-Augmented Generation works by retrieving information from a knowledge source and integrating it into the LLM’s response (for enhanced accuracy).
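As a toy illustration of this retrieve-then-generate flow, here is a minimal Python sketch. The documents, the word-overlap retriever, and the prompt format are all simplified, hypothetical stand-ins for a real RAG pipeline (which would use embeddings and an actual LLM call):

```python
def retrieve(query, documents):
    """Return the document sharing the most words with the query
    (a crude stand-in for embedding-based similarity search)."""
    query_words = set(query.lower().split())
    return max(documents, key=lambda d: len(query_words & set(d.lower().split())))

def build_prompt(query, documents):
    """Ground the (hypothetical) LLM call in the retrieved document."""
    context = retrieve(query, documents)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

documents = [
    "SearchGPT cites its sources below each answer.",
    "Crawl budget depends on site health and popularity.",
]
print(build_prompt("How does SearchGPT handle sources?", documents))
```

The key idea for SEO is the retrieval step: content that is easy to retrieve and clearly written is what ends up in the context an AI search engine answers from.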

Read More →

Google Chrome Drops FID Support

In a recent tweet, Rick Viscomi (who works on Web Performance Developer Relations for the Chrome team) revealed that Google Chrome has ended support for First Input Delay (FID).

The Interaction to Next Paint (INP) metric has been introduced in place of FID.

Read More →

Spam Score in SEO: What it is and Should You Consider it?

Spam Score is a metric created by various third-party SEO tools that estimates how many spammy websites link to a particular website.

However, Spam Score isn’t an official metric used by Google for ranking (as confirmed by John Mueller in a Reddit post), and you can safely ignore it.

Read More →

Top 5 Best SEO Audit Tools 2024

Below are the top 5 SEO audit tools which I think still lead the pack in 2024.

  1. Google Search Console: Because here the data comes directly from the product we are optimizing the website for.
  2. Semrush: Its JavaScript rendering feature works better than Screaming Frog’s.
  3. Screaming Frog: It is cheaper than Semrush if you have to audit a big website.
  4. Ahrefs: It has audit features similar to Semrush but some limitations in other features, hence the 4th position.
  5. Moz: It is also good, but I don’t like the tool’s UI compared to Semrush and Ahrefs, hence the 5th position. That said, everyone knows Moz was the first to introduce the Domain Authority metric after Google’s PageRank.

Apart from the above tools, you can also check our free SEO tools.

What is Canonical Tag in SEO?

When we create a website, it may happen that two or more pages share the same structure and content with only slight differences. In such cases, when search engines crawl the website and find these pages, they may flag them as duplicate content, which ultimately hampers the website’s traffic. To prevent this issue, there is a tag called the canonical tag that you can define in the page code. It tells the search engine that the URL present in the canonical tag is the original URL, and that the search engine should consider indexing the canonical URL instead of the URL of the page the crawler is currently on.

Google may use og:title for Titles in Search Results

Google has recently updated its title links Search developer documentation.

In this documentation, Google has mentioned that it may use og:title for generating titles in search results.

Read More →

Do we Lose the Google Search Console Data after Domain Expiration?

Have you ever been curious to know what happens to the Google Search Console data after a domain expires?

Well, the data will remain present in Search Console, and the new domain owner can claim it after verifying the domain.

So, the GSC data transfers from one owner to another.