
Google withdraws Crawl Rate Limiter tool


Google Search Console's crawl rate limiting tool will be phased out on January 8, 2024. It was available for more than a decade, but improvements in Google's crawling logic and the availability of other tools for publishers have diminished its usefulness.

Googlebot adjusts its crawl rate based on how the server responds. For example, if the server consistently returns HTTP 500 status codes for a number of URLs, Googlebot slows down its crawling automatically and almost immediately. The crawl rate is likewise reduced automatically when response times increase significantly. The crawl rate limiting tool was far slower: it could take more than 24 hours for a change to the limits to take effect. In practice the tool was rarely used, and those who did use it often set the crawl rate to the minimum. With the tool's withdrawal, the minimum crawl rate will be lowered to a level comparable to the old crawl rate limits, effectively continuing to honor the settings of some site owners: if search interest is low, crawling will not waste the site's network bandwidth.

Advances in automatic crawl rate management and a desire to simplify the tool for users are the main reasons for its withdrawal. Google still offers a form for reporting unusual Googlebot activity for emergencies and unusual cases, but the fastest way to reduce the crawl rate is to instruct Googlebot through server responses, as described in the documentation (see the sketch below).

Crawl Rate Limiter was a tool offered by Google that allowed website owners to control how often Google's robots (such as Googlebot) visit their sites. Its purpose was to let site administrators manage the server load caused by crawling. When Googlebot visits a website, it "crawls" the site to understand its content and structure. This information is then used to update Google's index, which is key to displaying the site in Google's search results. However, if Googlebot visits a site too often or crawls too much content at once, it can strain the server hosting the site, which can slow the site down or even bring it down. Crawl Rate Limiter allowed administrators to limit how frequently Googlebot visits their sites, balancing Google's need to crawl against server stability and performance. Administrators could choose to increase, decrease, or keep the current frequency of Googlebot visits.
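The documentation linked below describes returning HTTP 500, 503, or 429 status codes to make Googlebot back off. The following is a minimal sketch of that idea using only Python's standard library; the is_overloaded() helper and its load-average threshold are hypothetical placeholders for a real monitoring signal (and os.getloadavg() is Unix-only), not part of Google's guidance.

```python
# Minimal sketch: return HTTP 503 with a Retry-After header while the
# server is overloaded, which prompts Googlebot to reduce its crawl rate
# almost immediately. Serve content normally otherwise.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


def is_overloaded() -> bool:
    # Hypothetical threshold: treat a 1-minute load average above the
    # CPU count as "overloaded". Substitute your own load signal here.
    return os.getloadavg()[0] > os.cpu_count()


class ThrottlingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if is_overloaded():
            # 503 (or 429) tells crawlers to back off; Retry-After hints
            # how many seconds to wait before trying again.
            self.send_response(503)
            self.send_header("Retry-After", "120")
            self.end_headers()
            return
        # Normal response when the server has capacity.
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<html><body>OK</body></html>")


if __name__ == "__main__":
    HTTPServer(("", 8000), ThrottlingHandler).serve_forever()
```

Note that Google advises using such error responses only temporarily; serving them for an extended period can cause URLs to be dropped from the index, so the throttle should clear as soon as the load subsides.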


https://developers.google.com/search/blog/2023/11/sc-crawl-limiter-byebye
https://developers.google.com/search/docs/crawling-indexing/reduce-crawl-rate






