-
I have a report that says 122 pages are blocked from appearing in search engines. The advice is to remove the existing Robots Meta Tag, X-Robots-Tag or robots.txt file. How do I do that?
-
Hello,
Thank you for contacting Rank Math and bringing your concern to our attention. I’m sorry for any inconvenience this issue may have caused you.
Could you please share the affected website URL so we can check?
Meanwhile, here’s a helpful link you can follow to ensure that your site is set to index:
https://rankmath.com/kb/google-index/
Looking forward to helping you.
Thank you.
Hello,
I’ve checked your homepage, and it is already set to index. Also, your sitemap and robots.txt are working fine. Could you please share where you are getting this error message? If you want to share screenshots, you can upload them using this tool and add the link here.
You can check your Google Search Console account’s coverage section to check for any issues reported on your site.
Looking forward to helping you.
Thank you.
Thank you, Reinelle. The report is a site audit from UberSuggest.
I’ve checked RankMath > Settings > Reading and the box to discourage search engines is unchecked.
Google Console also shows an error showing the submitted URL is marked noindex.
Screen snapshots are attached.
Kind regards,
Jocelyn
Hello,
It seems that you have missed adding the screenshot URLs here. Could you please share them here as well so we can check?
Looking forward to helping you.
Thank you.
There are 123 pages, so I can’t send screen snapshots of all of them; I’ve just sent one example.
Hello,
You can take a screenshot of the first few URLs, as we only need a couple of affected URLs to check the issue in detail.
You may send us 2 or 3 URLs, and that should be enough for us to investigate the issue further.
Looking forward to helping you on this one.
I did send you screenshots of the first few URLs that are affected. They’re in the album, from the tools link you shared earlier: https://jocelynw.imgbb.com/
Do you not have access to that?
Hello,
These are search pages and they should be set to noindex.
You can add a Disallow rule in the robots.txt file to stop Google from crawling these URLs:
User-agent: *
Disallow: /wp-admin/
Disallow: */?s=*
Allow: /wp-admin/admin-ajax.php
Sitemap: https://jocelynwatts.com/sitemap_index.xml

You can follow this guide to edit the robots.txt file with Rank Math:
https://rankmath.com/kb/how-to-edit-robots-txt-with-rank-math/
Hope this helps.
Thank you.
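If you’d like to confirm for yourself that a given page carries the noindex marking the error refers to, here is a minimal sketch using only Python’s standard library. It scans a page’s HTML for a robots meta tag containing "noindex"; the class and function names are illustrative, not part of Rank Math, and the sample HTML stands in for a real fetched page:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collect the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

def is_noindexed(html):
    """Return True if any robots meta directive includes 'noindex'."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in d.lower() for d in finder.directives)

# Sample markup resembling an internal search page marked noindex
sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(sample))  # True
```

Running this against the HTML of one of the flagged /?s= URLs (fetched however you prefer) should report True, while a normal indexable page should report False.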
Thanks. I’ve just completed those instructions, did another site audit, and got the same result – still 124 pages with that error (screen snapshot attached). Does the change take some time to kick in?
Hello,
You can clear your website and browser’s cache and test again to see if the error is removed.
In any case, since those URLs have additional parameters and should not be indexed anyway, you can safely ignore that error.
For example, you would not get the error for the URL /contact-jocelyn/ (which should be indexed); instead, you’re getting it for /contact-jocelyn/?s= (which should not be indexed on the SERPs).
Hope this helps.
Thank you.
Thank you.
Hello,
We are super happy that this resolved your issue. If you have any other questions in the future, know that we are here to help.
If you don’t mind me asking, could you please leave us a review (if you haven’t already) on https://wordpress.org/support/plugin/seo-by-rank-math/reviews/#new-post about your overall experience with Rank Math? We appreciate your time and patience.
If you do have another question in the future, please feel free to create a new forum topic, and it will be our pleasure to assist you again.
Thank you.