My client’s site was recently incorrectly flagged as hosting a phishing scam and was removed from Google. I resolved the issue and Google put the site back online; however, now that I’ve requested it be indexed, it keeps coming back with “Failed: Blocked due to other 4xx issue”.
I was told by my host that my robots.txt file was only allowing my AJAX file to be crawled. I thought I had since resolved this by replacing the rules on the Rank Math settings page, but indexing requests are still bouncing back with the same failed error.
Does it take a few hours for robots.txt changes to go through? Does anyone have any advice to help resolve the issue?
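For reference, here is roughly what I believe the robots.txt should look like for a typical WordPress site (the exact paths are my assumption from the default WordPress setup, not something confirmed for this site):

```
# Minimal sketch of a permissive WordPress robots.txt (assumed defaults)
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap_index.xml
```

The `example.com` domain and the sitemap filename are placeholders; Rank Math normally generates the sitemap URL itself. The live file can be checked directly at `/robots.txt` on the domain to confirm whether the changes have actually taken effect.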
Many thanks in advance.
The ticket ‘Robot.txt file not updating, Google not able to index site.’ is closed to new replies.