Robot.txt issue

#491244
  • Resolved Michelle McHann
    Rank Math free

    Hi,

    In my Google Search Console it looks like my pages are being blocked. I have checked the file in my WordPress settings, submitted my sitemap to Google and also contacted my hosting company (wpx) to see if they could help.

    My hosting company said: “As I can see, you are using Rank Math SEO. I cannot see a robots.txt file in the public_html directory, where the file is usually located; however, many SEO plugins create this kind of file in a different directory. I cannot see any related blocks on our end. What I can suggest is to request a recrawl for your website, or to contact Rank Math SEO for more insights on the issue and on whether there is anything we would be able to assist with.”
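
    Since WordPress and SEO plugins typically generate robots.txt dynamically rather than writing a physical file, fetching the URL directly shows what is actually being served. A minimal sketch in Python, with a placeholder domain:

    ```python
    # Minimal sketch: fetch the robots.txt a site actually serves, even when no
    # physical file exists on disk. "example.com" is a placeholder domain.
    import urllib.request

    url = "https://example.com/robots.txt"  # replace with your own domain
    with urllib.request.urlopen(url, timeout=10) as resp:
        print(resp.status)           # 200 means a robots.txt is being served
        print(resp.read().decode())  # the rules generated on the fly
    ```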

    I have included relevant screenshots: https://imgur.com/a/AKC8pku

    Can you please help with this issue? Thank you.

  • Md. Sakib Khandaker
    Rank Math agency

    Hello,

    Thanks for contacting us, and sorry for any inconvenience that might have been caused due to that.

    I’ve checked your site in various ways, and it seems your robots.txt file is fine and is not blocking Googlebot.

    Even though your robots.txt file appears to be fine, I’d still suggest testing it with Google’s robots.txt tester, as explained in the guide below. If everything checks out, you can try validating the fix in GSC to see whether the issue disappears from there.

    Here’s a guide on Indexed though blocked by robots.txt error on GSC:
    https://rankmath.com/kb/indexed-though-blocked-by-robots-txt-error/
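
    If you’d like to double-check locally as well, Python’s standard library can run roughly the same test as the tester; this is a minimal sketch, where the domain and page URL are placeholders rather than your actual site:

    ```python
    # Minimal sketch of the check a robots.txt tester performs, using only the
    # standard library. The domain and page URL below are placeholders.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("https://example.com/robots.txt")
    parser.read()  # download and parse the live robots.txt

    page = "https://example.com/some-page/"
    print(parser.can_fetch("Googlebot", page))  # False -> a rule blocks Googlebot
    print(parser.can_fetch("*", page))          # False -> all crawlers are blocked
    ```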

    However, while checking your site with the Rich Results Test, I can see an error:

    Failed: Hostload exceeded
    

    “Hostload exceeded” means your site might be at maximum capacity for Google’s crawling or inspection requests; Google can’t run the test until your traffic load drops. “Hostload exceeded” is effectively the same as exceeding the crawl rate limit.

    Note: this is Google’s estimated load, not the actual load, which Google cannot measure directly.
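
    One rough way to sanity-check this from your side is to time a few plain requests and watch for 429 or 503 responses, which tell well-behaved crawlers, including Googlebot, to slow down. A minimal sketch, again with a placeholder domain:

    ```python
    # Minimal sketch: time a few requests and watch the status codes. Persistent
    # 429/503 responses or multi-second latency suggest the host really is near
    # capacity. "example.com" is a placeholder domain.
    import time
    import urllib.error
    import urllib.request

    url = "https://example.com/"
    for _ in range(3):
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                status = resp.status
        except urllib.error.HTTPError as err:
            status = err.code  # a 429/503 here tells crawlers to slow down
        print(status, f"{time.monotonic() - start:.2f}s")
        time.sleep(1)
    ```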

    In this case, you may try checking what the current crawl limit is via the settings page:
    https://support.google.com/webmasters/answer/48620?hl=en

    If there’s a low limit set there, you may try increasing it to see if that works for you.

    Here’s a thread about a similar issue for your reference:
    https://support.google.com/webmasters/thread/170238750/

    Let us know how it goes. Looking forward to helping you.

    Thank you.

    Hello,

    Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Robot.txt issue’ is closed to new replies.