Hello,
We're sorry for the trouble with the ‘noindex’ directive detected in the ‘X-Robots-Tag’ HTTP header. This directive means the server is telling search engines not to index the page, which may be intentional or an oversight, depending on your website's SEO strategy.
To resolve this, you may want to check your server configuration or any plugin that might be setting this directive. If you need specific guidance on adjusting the X-Robots-Tag settings, please reply to this ticket, and a human agent from the Rank Math Support team will assist you further.
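If it helps while you investigate, here is a minimal Python sketch for listing the X-Robots-Tag directives a URL currently returns. The URL in the usage comment is only a placeholder; substitute one of your own pages:

```python
# Minimal sketch: report the X-Robots-Tag directives (e.g. "noindex",
# "nofollow") that a URL's response headers contain.
from urllib.request import Request, urlopen


def robots_directives(headers):
    """Split any X-Robots-Tag header value into individual directives."""
    value = next(
        (v for k, v in headers.items() if k.lower() == "x-robots-tag"), ""
    )
    return [d.strip().lower() for d in value.split(",") if d.strip()]


def check_url(url):
    """Fetch only the headers (HEAD request) and extract the directives."""
    with urlopen(Request(url, method="HEAD")) as resp:
        return robots_directives(dict(resp.headers))


# Usage (network-dependent, placeholder URL):
# print(check_url("https://example.com/some-page/"))
```

If the list contains "noindex" and you did not intend that, the directive is usually added by the web server configuration, a security/SEO plugin, or a staging-site setting.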
Thank you for raising this concern!
Hello,
Thank you for contacting us. We apologize for any inconvenience this issue may have caused.
Can you please confirm whether you're inspecting your sitemap URL in Google Search Console and seeing this error? If so, it is normal and nothing to worry about: by default, sitemaps are served with a noindex directive because they are meant to be crawled, not displayed on search results pages.
You should only submit your sitemap in the Sitemaps section of your GSC account; the URL Inspection tool isn't meant for sitemap URLs.
https://rankmath.com/kb/submit-sitemap-to-google/
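If you'd like to confirm this behavior yourself, a quick header check should show noindex on the sitemap (expected) but not on your regular pages. A small sketch, with placeholder URLs to replace with your own site's addresses:

```python
# Sketch: check whether a URL's response carries a noindex directive in
# the X-Robots-Tag header. A sitemap URL is expected to have it; a
# normal page should not.
from urllib.request import Request, urlopen


def has_noindex(headers):
    """True if any X-Robots-Tag header value contains 'noindex'."""
    values = [v for k, v in headers.items() if k.lower() == "x-robots-tag"]
    return any("noindex" in v.lower() for v in values)


def fetch_headers(url):
    """HEAD request so only the headers are transferred."""
    with urlopen(Request(url, method="HEAD")) as resp:
        return dict(resp.headers)


# Usage (network-dependent, placeholder URLs):
# has_noindex(fetch_headers("https://example.com/sitemap_index.xml"))  # expected: True
# has_noindex(fetch_headers("https://example.com/"))                   # expected: False
```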
Hope that helps and please don’t hesitate to let us know if you have any other questions.
Thank you.
Hello,
Since we haven't heard back from you in 15 days, we assume you found a solution, and we are closing this support ticket.
If you still need assistance, please feel free to open a new support ticket, and we will be more than happy to help.
Thank you.