-
Hi
Any ETA on when this will be updated?
Seems it has been causing problems for approximately a month. Google can’t fetch sitemaps, so pages are being dropped from its index. It is not good.
Kind regards
Matt
-
Hello,
We apologize for the issue you are facing with the x-robots-tag: noindex header affecting your Google Search Console. This can indeed lead to problems with indexing and fetching sitemaps.

To address this issue, please ensure that your robots.txt file does not contain any rules that might be blocking Google from crawling your site. You can check and edit your robots.txt file through Rank Math by navigating to WordPress Dashboard → Rank Math SEO → General Settings → Edit robots.txt.
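For example, a rule like the following would block all crawlers from the entire site. This is shown only as an illustration of the kind of directive to look for; it is not taken from your actual file:
User-agent: *
Disallow: /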
If you are still experiencing issues, it may be beneficial to review the settings related to the noindex tags on your pages. You can find more information on how to manage these settings in our guide on how to noindex URLs.

If the problem persists, we recommend replying to this ticket with your site URL so that a human agent from our support team can assist you further. Thank you for raising this concern, and we appreciate your patience as we work to resolve it.
Sources:
Discovered – Currently Not Indexed in Google Search Console » Rank Math
How to Fix Indexing Issues in Google Search Console » Rank Math
No, it is definitely the plugin (Yoast does the same, btw); the PHP filter workaround fixes it.
Hello,
Please note that RankMath sets the “noindex” tag on your sitemaps by default. It’s done this way because a sitemap is only meant to guide search engines through your site, not to rank in search results.
The reason you’re seeing it set to “noindex” is to prevent it from appearing in search results, which is the desired behavior.
You should follow this guide to properly submit your sitemap:
https://rankmath.com/kb/submit-sitemap-to-google/
However, if you still wish to remove the noindex tag from your sitemap, you can use the following filter on your site:
add_filter( 'rank_math/sitemap/http_headers', function( $headers ) {
    if ( '/sitemap_index.xml' !== $_SERVER['REQUEST_URI'] ) {
        return $headers;
    }
    unset( $headers['X-Robots-Tag'] );
    return $headers;
} );
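Please note that this filter only removes the header when the request is for /sitemap_index.xml; the individual sitemap files (post-sitemap.xml, page-sitemap.xml, and so on) keep the noindex header unless the condition is adjusted.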
Here’s how you can add a filter/hook to your WordPress site:
https://rankmath.com/kb/wordpress-hooks-actions-filters/
Looking forward to helping you.
Yeah, I read that. Though I don’t think this is true anymore, as Google changed its policies about a month ago. Many of the sites I’m managing started having sitemap issues in Google (Can’t fetch) after being completely fine for years. All the affected sitemaps were set to noindex.
The workaround is fine, but it would be nice to see it implemented for good!
Hello,
Regarding the sitemap issue identified in Search Console, please try to follow the steps below and see if that works for you.
1. Flush the Sitemap cache by following this video screencast:
https://i.rankmath.com/pipRDp
2. Exclude the Sitemap files of the Rank Math plugin in your caching plugin. The cache could be via a plugin or from the server. For plugins or Cloudflare, please follow this article:
https://rankmath.com/kb/exclude-sitemaps-from-caching/
3. If the above steps don’t seem to work, kindly apply the following filter to your site.
add_filter( 'rank_math/sitemap/enable_caching', '__return_false');
Here’s how you can add a filter/hook to your WordPress site:
https://rankmath.com/kb/wordpress-hooks-actions-filters/
Once done, clear your website cache, remove all the submitted sitemaps, and resubmit only the index sitemap in the Search Console.
Let us know how it goes. Looking forward to helping you.
Thank you.
Thanks for this workaround and the tips! But the main problems I had were with the individual sitemaps (not the sitemap index). Since you set the sitemap index to ‘noindex’, I guess Google has no way of finding post-sitemap.xml, page-sitemap.xml, etc., since those are linked from the index, lol. On the other hand, Yoast sets a ‘noindex, follow’ tag and also has issues, so idk 🙂 I have added the filter that removes the ‘noindex’ from the sitemaps (required for Google to actually fetch them correctly) and submitted all the sitemaps, including custom_type-sitemap.xml, custom_taxonomy-sitemap.xml, etc., to Google.
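In case it helps anyone else, here is a sketch of a filter that strips the header from every sitemap URL rather than only the index. It uses the same rank_math/sitemap/http_headers hook shown above; the loose URL check is just an illustration and may need adjusting for your setup:
add_filter( 'rank_math/sitemap/http_headers', function( $headers ) {
    $uri = isset( $_SERVER['REQUEST_URI'] ) ? $_SERVER['REQUEST_URI'] : '';

    // Loose, illustrative check: any URL that contains "sitemap" and ends in ".xml".
    if ( false !== strpos( $uri, 'sitemap' ) && '.xml' === substr( $uri, -4 ) ) {
        unset( $headers['X-Robots-Tag'] );
    }

    return $headers;
} );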
Correction to my previous post. I have checked carefully, and the site that uses Yoast rather than RankMath has its sitemaps’ robots tag set to:
x-robots-tag: noindex, follow
This doesn’t cause issues in Google, while the bare ‘noindex’ seems to cause issues.
(Also, RankMath Pro doesn’t seem to have a ‘Links per Sitemap’ option.)
Hello,
Can you please share the recent documentation from Google where they stated that the noindex directive must be removed from XML sitemaps?

The purpose of that tag is to prevent sitemaps from getting listed in the SERPs, since you don’t want your sitemaps to be served to searchers. The directive x-robots-tag: noindex, follow doesn’t and shouldn’t hinder Google from accessing the sitemap.

Can you please try the following filter and see if this could work:
add_filter( 'rank_math/sitemap/http_headers', function( $headers ) {
    if ( '/page-sitemap.xml' !== $_SERVER['REQUEST_URI'] ) {
        return $headers;
    }
    unset( $headers['X-Robots-Tag'] );
    return $headers;
} );
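This is the same pattern as the earlier filter, but it targets /page-sitemap.xml specifically; the path in the condition would need to be adjusted (or the snippet duplicated) for any other sitemap file you want to serve without the header.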
As for the Links per Sitemap option, please make sure that the “Advanced Mode” of our plugin is enabled if you don’t see the desired option: https://rankmath.com/kb/advanced-mode/
Looking forward to helping you.
Hello,
Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.
If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.
Thank you.
The ticket ‘x-robots-tag: noindex – breaks Google search console’ is closed to new replies.