Hello, I created 6 new location pages on my website https://matthewsplumbingandgas.com.au. One of these is https://matthewsplumbingandgas.com.au/plumber-ulladulla, but when I do a URL inspection on this page in Google Search Console it is not being indexed and no referring sitemap is detected. I tried to Request Indexing, but it was rejected due to “blocked by robots.txt”. I reset the robots.txt in RankMath SEO, and here it is:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://matthewsplumbingandgas.com.au/sitemap_index.xml

This did not fix the problem. I then tried to resubmit the 3 sitemaps that I successfully submitted 8 days ago:
https://matthewsplumbingandgas.com.au/page-sitemap.xml
https://matthewsplumbingandgas.com.au/sitemap_index.xml
https://matthewsplumbingandgas.com.au/sitemap.xml

Google Search Console is now giving a status of “Couldn’t Fetch” due to a General HTTP error. When I do a URL inspection on https://matthewsplumbingandgas.com.au/sitemap_index.xml and https://matthewsplumbingandgas.com.au/page-sitemap.xml, they are not being indexed and no referring sitemap is detected. I tried to Request Indexing on both, but they were rejected due to “blocked by robots.txt”.
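For diagnosis, here is a minimal sketch (assuming Python 3 is available; the script is illustrative only) that reports what HTTP status and content type each of the three sitemap URLs actually returns, since Search Console reports a General HTTP error:

# Minimal sketch, assuming Python 3: print the HTTP status and content type
# returned by each sitemap URL listed above.
import urllib.error
import urllib.request

SITEMAPS = [
    "https://matthewsplumbingandgas.com.au/page-sitemap.xml",
    "https://matthewsplumbingandgas.com.au/sitemap_index.xml",
    "https://matthewsplumbingandgas.com.au/sitemap.xml",
]

for url in SITEMAPS:
    request = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            print(response.status, response.headers.get("Content-Type"), url)
    except urllib.error.HTTPError as err:
        print(err.code, "HTTP error", url)
    except urllib.error.URLError as err:
        print("Request failed:", err.reason, url)

If any of these return an error status or an HTML page instead of XML, that would explain the “Couldn’t Fetch” result.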
I have also validated my robots.txt file.
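Along the same lines, a minimal sketch (again assuming Python 3) that asks the standard-library robots.txt parser whether Googlebot is allowed to fetch the page and sitemap URLs that Search Console says are blocked:

# Minimal sketch, assuming Python 3: check whether the robots.txt the site is
# actually serving permits Googlebot to fetch the affected URLs.
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://matthewsplumbingandgas.com.au/robots.txt"
URLS_TO_CHECK = [
    "https://matthewsplumbingandgas.com.au/plumber-ulladulla",
    "https://matthewsplumbingandgas.com.au/page-sitemap.xml",
    "https://matthewsplumbingandgas.com.au/sitemap_index.xml",
    "https://matthewsplumbingandgas.com.au/sitemap.xml",
]

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the robots.txt currently being served

for url in URLS_TO_CHECK:
    verdict = "ALLOWED" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(verdict, url)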
I do not know how to resolve this and none of the location pages are being found in searches.
Can you please help?
Thanks, Jeff