Why is the sitemap URL not included in the Robots.txt file?
What is the purpose of line 7 in the robots.txt file? It appears to be blocking all the pages our plugin creates, so I’m curious if there is some purpose to this that I’m not seeing. As it stands, the robots.txt file appears to be blocking Google from crawling the pages we created.
I would recommend removing that disallow directive.
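To illustrate, here is a rough sketch of what a corrected robots.txt could look like. This is only an example: I can only see that line 7 is a broad disallow, and I’m assuming the sitemap lives at the default /sitemap.xml location.

    User-agent: *
    Disallow:

    Sitemap: https://fastcasualstorage.com/sitemap.xml

An empty Disallow value allows crawling of everything, and the Sitemap line addresses the earlier question by pointing crawlers at the sitemap directly from robots.txt.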
You can see why this is problematic by going to Google and entering “site:fastcasualstorage.com”. There are only 6 pages indexed on Google for the domain, which matches up with the robots file blocking the additional pages.
This is clearly problematic.
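If you want to double-check which URLs the current file blocks before and after the change, a short script using Python’s standard urllib.robotparser will do it. The paths below are placeholders; substitute the URLs your plugin actually generates.

    from urllib.robotparser import RobotFileParser

    # Fetch the live robots.txt for the site.
    rp = RobotFileParser()
    rp.set_url("https://fastcasualstorage.com/robots.txt")
    rp.read()

    # Placeholder paths -- replace with the pages the plugin creates.
    for path in ["/", "/example-plugin-page/"]:
        url = "https://fastcasualstorage.com" + path
        # can_fetch() reports whether the named user agent may crawl the URL.
        print(url, "->", rp.can_fetch("Googlebot", url))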