Sitemap HTTP 404 Errors – ‘noindex’ detected in ‘X-Robots-Tag’ http header

#209396
  • Resolved Aliya
    Rank Math free

    Though my sitemap is being read, I am receiving HTTP 404 errors on all the sitemap files. When I inspect the individual URLs, I notice ‘noindex’ is detected in ‘X-Robots-Tag’ http header. There is nothing in my robots.txt file that should be causing this.

Viewing 3 replies - 1 through 3 (of 3 total)
  • Prabhat
    Rank Math agency

    Hello,

    Thanks for contacting us and we regret the inconvenience caused.

    It is completely normal for sitemap files to be served with a noindex directive, since you do not want the sitemap files themselves to appear on SERPs.

    Sitemaps are intended to be crawled by search engines, but they are not meant to be indexed or displayed in search results, and that is why the sitemaps carry the noindex header.

    Also, serving the sitemaps with noindex does not, in any way, prevent search engines from crawling them.
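    You can verify this distinction yourself: fetch the sitemap (for example with `curl -I`) and look at the `X-Robots-Tag` response header. As a minimal sketch (the function name and header values below are illustrative, not part of Rank Math), the header's comma-separated directives can be parsed and checked like this:

```python
def robots_directives(header_value):
    """Split an X-Robots-Tag header value into a set of directives."""
    return {token.strip().lower()
            for token in header_value.split(",")
            if token.strip()}

# 'noindex' keeps the sitemap out of search results...
directives = robots_directives("noindex")
assert "noindex" in directives

# ...but a bare 'noindex' carries no 'nofollow', so it does not
# stop crawlers from reading the file and following its URLs.
assert "nofollow" not in directives
```

    In practice you would feed this the header value from a real response; the point is simply that noindex and "blocked from crawling" are separate things.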

    Regarding the 404 error, I checked your sitemaps and they are loading properly. Please follow the steps below and see if that fixes the issue:
    1. Flush the sitemap cache by following this video screencast:
    https://i.rankmath.com/pipRDp

    2. Exclude the Rank Math sitemap files from caching. The cache could come from a plugin or from the server. For caching plugins or Cloudflare, please follow this article:
    https://rankmath.com/kb/exclude-sitemaps-from-caching/

    3. Clear your website’s cache, remove all the sitemaps from your Google Search Console account, and then submit only the main sitemap (your-domain.com/sitemap_index.xml).

    Further, give Google some time to crawl your website again and see if the issue is fixed.
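    To keep the two symptoms apart while testing, it helps to check the HTTP status and the X-Robots-Tag header separately. A rough sketch (the function name and messages are my own, for illustration only):

```python
def diagnose_sitemap(status_code, x_robots_tag):
    """Classify a sitemap fetch result.

    status_code:  HTTP status returned for the sitemap URL
    x_robots_tag: value of the X-Robots-Tag response header, or None
    """
    if status_code == 404:
        # A 404 on the sitemap usually points at a stale cached
        # response or a missing rewrite rule, not at the header.
        return "problem: sitemap not found (check caching/rewrite rules)"
    if x_robots_tag and "noindex" in x_robots_tag.lower():
        # Expected: sitemaps should be crawlable but not indexed.
        return "ok: noindex header is normal for sitemaps"
    return "ok: sitemap reachable"

print(diagnose_sitemap(404, "noindex"))
print(diagnose_sitemap(200, "noindex"))
```

    You would feed it the status and header from a real request (e.g. via `curl -sI your-site/sitemap_index.xml`); only the 404 needs fixing, the noindex header is by design.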

    Hope that helps.

    Thank you.

    Aliya
    Rank Math free

    Hi Prabhat,

    I followed all of the steps above, but the pages still show 404 errors in Search Console. Even though the sitemaps were deleted and resubmitted, and excluded from caching, Search Console shows they were last read 5/21/21.

    Prabhat
    Rank Math agency

    Hello,

    Thanks for your reply.

    Currently, you will see the same results in your Google Search Console account; they will be updated when Google crawls your website again.

    Your sitemaps are accessible to Google bots. Please have a look at the screenshot in the sensitive data section of this ticket.

    Please wait until Google re-crawls your website and see if the issue gets fixed.

    In the meantime, please let us know if we can assist you with anything else.

    Thank you.

    Hello,

    Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Sitemap HTTP 404 Errors – ‘noindex’ detected in ‘X-Robots-Tag’ http header’ is closed to new replies.