Adding a crawl delay

#501832
  • Resolved — Mt Zion
    Rank Math free

    Hello, this is Sam from Mt Zion Church.

    We have been having SEO bots crawl our page at the pace of a few requests a second and it is ultimately slowing the performance of our site down. Our hosting service has suggested that we add a crawl delay of 3 seconds into our robots.txt file in order to limit their requests to once every three seconds. How do we do this? We have read your article about editing the robots.txt file, but we do not want to mess with that file on our own, especially when we do not know what the exact crawl delay code is.

    Sam
    Mt Zion Church

Viewing 1 reply (of 1 total)
  • Hello,

    Thanks for contacting us, and sorry for any inconvenience this may have caused.

    You may use the rule below to add a crawl delay:

    User-agent: *
    Crawl-delay: 3

    That rule will be honored by most bots/crawlers, with the notable exception of Googlebot.
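    For reference, robots.txt rules can also target individual crawlers rather than all of them at once. The sketch below is only an illustration: the specific bot name and the stricter delay are example values, and you should check each crawler's documentation for its exact user-agent token and whether it honors Crawl-delay at all.

    ```
    # Default: all bots wait 3 seconds between requests
    User-agent: *
    Crawl-delay: 3

    # Example only: a stricter delay for one specific bot
    User-agent: Bingbot
    Crawl-delay: 10
    ```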

    This is because, in 2019, Google announced that it would ignore unsupported directives (such as crawl-delay) in the robots.txt file, and it subsequently updated Google Search Console to give you control over crawl speed there instead.

    Because Google doesn’t support the crawl-delay directive, Google’s crawlers will simply ignore it. If you want to ask Google to crawl more slowly, you need to set the crawl rate in Google Search Console. Here’s a URL you may follow to learn more:
    https://support.google.com/webmasters/answer/48620
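    If you want to double-check that the rule is written correctly before uploading it, one way (a sketch, not something Rank Math requires) is Python’s standard-library robots.txt parser. The user-agent string "SomeBot" below is just a hypothetical example:

    ```python
    # Verify that the suggested robots.txt rule parses as intended,
    # using Python's standard-library parser (urllib.robotparser).
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.modified()  # mark the rules as "read" so queries are answered
    rp.parse([
        "User-agent: *",
        "Crawl-delay: 3",
    ])

    # Bots that honor the directive should wait 3 seconds between requests.
    # "SomeBot" is a hypothetical user-agent; the * group applies to it.
    print(rp.crawl_delay("SomeBot"))  # prints 3
    ```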

    Hope that helps, and please do not hesitate to let us know if you need our assistance with anything else.

    Thank you.

    Hello,

    Since we have not heard back from you in 15 days, we are assuming that you found a solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Adding a crawl delay’ is closed to new replies.