A guide to Google's latest updates to its crawlers and user-triggered fetchers documentation

Google recently updated its crawlers and user-triggered fetchers documentation to improve its organization and to provide more information about the products each crawler affects. The changes include breaking the single, lengthy document into several more focused pages and adding a robots.txt snippet for each crawler to demonstrate how its user agent token is used.

In a blog post, Google explained that the documentation had grown too long, limiting its ability to provide detailed information about each crawler and fetcher. The restructuring makes room for that detail and gives readers easier access to specific information.

The most notable addition is an “affected products” section for each crawler, which specifies which Google products are impacted by the preferences set for that crawler. For example, preferences set for the Googlebot crawler affect Google Search (including all Search features and Discover) as well as products such as Google Images, Google Video, and Google News. This information helps website owners and developers understand how their content is crawled and indexed by Google.

Additionally, Google added an “Example robots.txt group” section for each crawler, giving developers concrete examples of how to use each crawler's user agent token in a robots.txt file. This is particularly useful for website owners who want to manage how their site is crawled.
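To make that concrete, a group addressed to Googlebot might look like the sketch below. The directory path is a placeholder used for illustration, not taken from Google's documentation:

    # The /internal/ path is a placeholder for illustration only
    User-agent: Googlebot
    Disallow: /internal/
    Allow: /

The User-agent line names the crawler by its token, and the directives that follow apply only to requests made under that token.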

The documentation also covers crawlers and fetchers that do not impact any specific product, such as GoogleOther and its variants optimized for fetching image and video URLs (GoogleOther-Image and GoogleOther-Video). These generic crawlers may be used for internal research and development purposes.
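Site owners who prefer to keep these research crawlers off their servers entirely can address them by token. The snippet below is a sketch that assumes the tokens GoogleOther, GoogleOther-Image, and GoogleOther-Video; confirm the exact tokens against Google's current crawler list before relying on them:

    # Assumed tokens for Google's research-and-development crawlers
    User-agent: GoogleOther
    User-agent: GoogleOther-Image
    User-agent: GoogleOther-Video
    Disallow: /

Because, per the documentation, these fetchers do not feed a user-facing product, blocking them should not change how a site appears in Google Search.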

Understanding how each crawler affects different Google products is crucial for website owners and developers. Some crawlers, like Googlebot, have a significant impact on Google Search and its features, while others are specific to products such as Google News, Google Shopping, or Google Ads. By consulting the affected products sections, website owners can set crawler preferences deliberately and ensure their content is properly indexed and displayed in the relevant Google products.
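For instance, a site that wants to remain fully crawlable for Google Search while restricting a product-specific crawler can give that crawler its own group. The sketch below assumes Storebot-Google as the Google Shopping token and uses a placeholder path; the affected products sections are the place to confirm which token governs which product:

    # Hypothetical policy: keep Search open, keep the Shopping crawler out of checkout pages
    User-agent: Storebot-Google
    Disallow: /checkout/

    User-agent: Googlebot
    Allow: /

Each group applies only to the crawler named in its User-agent line, so the Storebot-Google rules do not affect how Googlebot crawls the site.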

The new robots.txt examples are also a useful resource for development teams, offering practical guidance on controlling how their sites are crawled and, in turn, how their content surfaces in Google's products.

In conclusion, Google's updates to its crawlers and user-triggered fetchers documentation give website owners and developers a clearer picture of how these crawlers relate to different Google products. The reorganized documentation, together with the new “affected products” and “Example robots.txt group” sections, makes the information easier to find and provides practical examples for managing crawling. By understanding and applying these updates, website owners can improve how their sites perform in Google Search and other relevant products.
