Sitemap: ‘noindex’ detected in ‘X-Robots-Tag’ http header

#904834
  • Resolved · Allplaybook
    Rank Math free

    In Google Search Console, I noticed that my sitemap had an error saying “‘noindex’ detected in ‘X-Robots-Tag’ http header”.

    Afterwards, I checked my sitemap at https://httpstatus.io/ and found that x-robots-tag was set to noindex.
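
    For reference, the same check can be run from a terminal. This is just an illustration; the sitemap URL below is a placeholder for my real one:

        # Fetch only the response headers of the sitemap and look for X-Robots-Tag
        curl -sI https://example.com/sitemap_index.xml | grep -i "x-robots-tag"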

    I want to solve this problem. What should I do?

  • Hello, we’re sorry to hear that you’re facing this issue with your sitemap. The error indicating that a ‘noindex’ directive was detected in the ‘X-Robots-Tag’ HTTP header means that search engines are being instructed not to index the sitemap file itself, which is what Google Search Console is flagging.

    Here are the steps you can take to resolve this issue:

    1. Check Your Server Configuration: The ‘X-Robots-Tag’ header is often set at the server level. Check your server configuration files (such as .htaccess on Apache servers) or your web server settings for a directive that sets ‘X-Robots-Tag’ to ‘noindex’. If you find one, remove or modify it to allow indexing (there is a short sketch of what this looks like after this list).

    2. Review Security Plugins: If you have any security plugins installed on your WordPress site, they might be adding this header. Check the settings of these plugins and ensure that they are not blocking indexing for your sitemap.

    3. Exclude Sitemap from Caching: If you are using caching plugins, ensure that your sitemap is excluded from caching. Cached sitemaps may not reflect recent changes. You can refer to our tutorial on how to exclude your sitemap from caching.

    4. Contact Your Hosting Provider: If you are unable to find the source of the ‘noindex’ directive, it may be worth reaching out to your hosting provider for assistance. They can help you identify if there are any server-level settings causing this issue.

    5. Resubmit Your Sitemap: Once you have made the necessary changes, make sure to resubmit your sitemap in Google Search Console. You can follow the steps outlined in our tutorial on how to submit your sitemap to Google.
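
    Purely as an illustration for step 1, here is roughly what such a directive looks like in an Apache .htaccess file and how it could be unset. This is a sketch, not a confirmed fix for your site: the <IfModule> wrapper and the ‘always’ condition are assumptions about a typical mod_headers setup, so adjust them to your own server.

        # A rule like this, somewhere in the server config, would send the header with every response:
        # Header set X-Robots-Tag "noindex"

        # Deleting that rule, or explicitly unsetting the header, stops it from being sent:
        <IfModule mod_headers.c>
            Header always unset X-Robots-Tag
        </IfModule>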

    If you continue to experience issues after following these steps, please reply to this ticket, and a human agent from the Rank Math Support team will assist you further.

    I found the ‘X-Robots-Tag’ header set to ‘noindex’. How do I remove or modify it to allow indexing?

    Hello,

    The sitemap URL of your website is expected to return a ‘noindex’ directive in that header, because you wouldn’t want the sitemap file itself to be indexed, but rather only the pages that are listed inside it.

    That directive only prevents the sitemap from appearing in SERPs; it doesn’t prevent Google from crawling the pages listed in it and indexing them.

    If you test any sitemap, even from large websites, you’ll see that they all use this directive to make sure the sitemap file itself doesn’t get included in SERPs.
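
    As a generic illustration (the exact headers vary from site to site, and other headers are omitted here), the response for such a sitemap URL typically looks something like this, and that is the expected, healthy state:

        HTTP/1.1 200 OK
        Content-Type: application/xml; charset=UTF-8
        X-Robots-Tag: noindex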

    Even our own sitemap includes that header.

    To submit the sitemap to the correct place in GSC, you should follow the steps in this tutorial: https://rankmath.com/kb/submit-sitemap-to-google/

    Don’t hesitate to get in touch if you have any other questions.

    Hello,

    Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Sitemap: ‘noindex’ detected in ‘X-Robots-Tag’ http header’ is closed to new replies.