Robots.txt is blocking pages from being indexed

#928655
    Hello,

    Thank you for contacting us, and sorry for any inconvenience this may have caused.

    We checked your website but couldn’t find any disallow rule in your robots.txt file that would prevent Google from crawling your pages.
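    If you would like to double-check this yourself, Python’s standard-library robotparser can test a URL against your live robots.txt. Below is a minimal sketch; example.com and the sample page URL are placeholders, so replace them with your own domain and a page that Google reports as blocked:

        from urllib.robotparser import RobotFileParser

        # Fetch and parse the live robots.txt (example.com is a placeholder domain).
        rp = RobotFileParser()
        rp.set_url("https://example.com/robots.txt")
        rp.read()

        # True means Googlebot is allowed to crawl the page;
        # False means a Disallow rule in robots.txt is blocking it.
        page = "https://example.com/sample-page/"
        print(rp.can_fetch("Googlebot", page))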

    Since Google is still showing the error, please clear your website cache, including any server-level cache, and check again.

    Let us know how it goes. Looking forward to helping you.

    Jeff
    Rank Math free

    Thank you for the prompt response. I have cleared the website cache and the server-level cache again, and I will check later in the day whether anything has changed.
    Can you keep this ticket open, please?

    Hello,

    We can see that Google can now crawl your pages without any issues.

    However, if you still face any errors, please let us know here. We’ll keep this ticket open.

    Looking forward to helping you.

    Thank you.

    Jeff
    Rank Math free

    Thank you, all resolved now.

    Hello,

    We are glad to know that this issue has been resolved. Thank you for letting us know.

    This ticket will be closed now, but you can always open a new one if you have any other questions or concerns. We are here to help you with anything related to Rank Math.

    We appreciate your patience and cooperation throughout this process.

    Thank you for choosing Rank Math.


The ticket ‘Robots.txt is blocking pages from being indexed’ is closed to new replies.