Googlebot in 2019 – How Does It Actually Work?

If you take a proactive approach to SEO, you’ll no doubt be familiar with the term ‘Googlebot’. If not, here’s its official definition as provided by Google:

“Googlebot is the generic name for Google’s web crawler. Googlebot is the general name for two different types of crawlers: a desktop crawler that simulates a user on the desktop, and a mobile crawler that simulates a user on a mobile device.”

In a purely matter-of-fact capacity, Google provides a fair amount of information on positive SEO practices. Nevertheless, they aren’t typically in the habit of providing the answers SEO specialists and webmasters really want.

As something of a break from the norm, Google’s Martin Splitt and Microsoft’s Suz Hinton recently sat down to discuss all things Googlebot in 2019: what it does, how it behaves, and the more general information marketers and webmasters need to know.

Detailed below, you’ll find a selection of key questions raised during the meeting, along with a summary of the answers provided by both parties:

 

What Is Googlebot (and What Isn’t It)?

Much of the confusion surrounding SEO in general stems from the fact that most people have no idea how the Googlebot works. Even if they’re familiar with the concept, they may not understand its mechanics, its functions and its objectives.

According to Splitt, Googlebot is simply a piece of software used to crawl websites, index web pages and help assign rankings. The system is designed to analyze web content, determine its value and decide who it should be presented to in the SERPs, with the most appropriate content for any given query prioritized and positioned more prominently.

However, Googlebot itself isn’t responsible for assigning rankings. Instead, it provides Google with the information required to make decisions and adjustments regarding SERP rankings. Googlebot essentially creates a ‘catalog’ of content by indexing web pages – SERP rankings are determined elsewhere.

 

Does Googlebot Behave Like a Web Browser?

Initially, Googlebot behaves in a similar way to a web browser. The software finds a website by way of a link, a sitemap submission or one of countless other detection methods. It’s also possible to request indexation or re-indexation directly if necessary. After that, the initial website visit proceeds much as it would with any conventional web browser.
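As a quick illustration of the sitemap route, a site can point crawlers at its sitemap from robots.txt. This is a minimal sketch, with `example.com` and the sitemap path standing in as placeholders:

```
# robots.txt served at the site root (example.com is a placeholder)
User-agent: *
Allow: /

# Tells crawlers, Googlebot included, where the XML sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

Sitemaps can also be submitted directly through Google Search Console, which is the more common route for requesting indexation or re-indexation.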

 

How Often Does Googlebot Crawl?

The frequency with which Googlebot crawls a site is calculated in accordance with the nature of the website and its content. A dynamic website that’s likely to be updated with relevant content on a regular basis will be crawled more regularly than a predominantly static website.

A typical working example is a news or current-events website, which is updated around the clock with important news updates. Google wants to ensure it brings relevant information to its audiences the moment it is published, which in turn means crawling such sites more regularly to update its index accordingly.

By contrast, a retail store that’s updated once every few weeks needn’t necessarily be crawled as often, just as an academic portal with relatively static content may not need to be crawled with any real regularity. Google also enforces measures to ensure spammy sites are crawled infrequently, which can have a negative impact on their SEO performance.

It’s also possible to instruct Google not to index certain pages of a website if the respective webmaster doesn’t want the information to appear in the search results.
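The standard way to do this is a robots meta tag in the page’s head (or the equivalent `X-Robots-Tag` HTTP header). A minimal sketch:

```html
<!-- Place in the <head> of any page that shouldn't appear in search results -->
<meta name="robots" content="noindex">

<!-- Or target Googlebot specifically -->
<meta name="googlebot" content="noindex">
```

Note that the page must remain crawlable for this to work: if robots.txt blocks the URL entirely, Googlebot never fetches the page and so never sees the noindex directive.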

 

What About Crawlers & JavaScript-Based Websites?

Webmasters and business owners operating JavaScript-based websites have long been concerned about Google’s ability (or lack thereof) to successfully crawl and index their pages. Because crawlers were traditionally incapable of executing JavaScript, they effectively saw blank pages when attempting to crawl JavaScript content.

Contrary to popular belief, this is no longer the case. Today’s Googlebot is designed to execute JavaScript and index a site’s content accordingly, albeit at a later point, once the pages have been passed through Google’s ‘web rendering service’ and rendered in a second browser.

Google therefore recommends the mindful, strategic use of JavaScript, as heavy reliance on it can significantly slow down indexation and the subsequent ranking process.

 

How Can You Tell if Googlebot Visits Your Website?

Interestingly, Google makes no attempt to hide its crawling activities from webmasters and online business owners. It’s actually pretty easy to identify a Googlebot visit, as outlined in Google’s official webmaster support pages:

“Your website will probably be crawled by both Googlebot Desktop and Googlebot Mobile. You can identify the subtype of Googlebot by looking at the user agent string in the request. However, both crawler types obey the same product token (user agent token) in robots.txt, and so you cannot selectively target either Googlebot mobile or Googlebot desktop using robots.txt.”
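To make that concrete, here is a minimal sketch of how a server-side log check might distinguish the two crawler subtypes by their user agent strings. The sample strings follow Google’s published formats, but bear in mind that user agents can be spoofed, so a production check should also verify the requesting IP (for example via reverse DNS lookup):

```python
# Sketch: classifying a request's user agent string as Googlebot
# desktop, Googlebot mobile, or neither. User agents are spoofable,
# so this is a first-pass filter, not proof the visitor is Google.

def classify_googlebot(user_agent: str) -> str:
    """Return 'mobile', 'desktop', or 'not googlebot'."""
    if "Googlebot" not in user_agent:
        return "not googlebot"
    # The mobile crawler presents itself as a smartphone browser
    if "Android" in user_agent and "Mobile" in user_agent:
        return "mobile"
    return "desktop"


desktop_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
mobile_ua = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile "
    "Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

print(classify_googlebot(desktop_ua))  # desktop
print(classify_googlebot(mobile_ua))   # mobile
```

Running a function like this over an access log’s user agent column is a quick way to see how often, and with which crawler, Google is visiting a site.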

 

What’s the Difference Between Mobile-First Indexing and Mobile Friendliness?

Addressing one of the most common questions among mobile-focused webmasters, Splitt explained that mobile-first indexing is about Google ‘discovering content using a mobile user agent and a mobile viewport’. It’s essentially a case of prioritizing mobile-friendly content to be added to Google’s mobile index, rather than desktop content that may not be as appropriate for mobile audiences.

Mobile-friendliness – aka mobile-readiness – simply refers to the quality and appropriateness of a website’s content and layout for mobile access. If every aspect of a webpage or website can be browsed and interacted with using a mainstream mobile device, it’s considered mobile-friendly by Google.

Google’s efforts to index mobile content have been stepped up over the past couple of years, in order to satisfy the expectations and requirements of fast-growing mobile audiences worldwide.

 

What Are the Most Critical Quality Indicators for Ranking?

Last but not least, Google once again took the opportunity to confirm that more than 200 signals (or ranking indicators) are used to assign value to any given website. Mobile-friendliness was cited as one of the most important quality indicators for websites in 2019, though it is by no means the only indicator taken into account.

Interestingly, Splitt also confirmed that the value assigned to each of these 200+ quality indicators is continuously moving and changing. This is one of the many measures making it increasingly difficult for black-hat SEO strategies to produce positive results.

Splitt eloquently summarized the moral of the entire story in one simple plea to the masses:

“Just build good content for the users and then you’ll be fine!”

 
