The robot detected errors in one or more Sitemap files of the site

#790167
  • Resolved Giuliano Ciolino
    Rank Math free

For the past two days, Yandex has been sending emails warning that “The robot detected errors in one or more Sitemap files of the site”.

I’ve run their validator and everything checks out. Why am I getting this error? In the meantime, I have deleted the sitemap from my hosting and am using yours.

  • Hello,

I’m sorry to hear that Yandex has been sending emails advising that its robot detected errors in your site’s Sitemap files.

If you have already validated the Sitemap files and they appear to be correct, the issue might be related to caching. A caching layer can serve crawlers a stale or outdated copy of the Sitemap, so Yandex may see errors even though the live file validates.

To resolve this, exclude your Sitemap from caching. Here is a detailed tutorial that walks you through it: Exclude Sitemap from Caching.
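For illustration, a cache-exclusion rule of this kind might look like the following in an Apache .htaccess file. This is only a minimal sketch, assuming Apache 2.4+ with mod_headers enabled; please follow the tutorial above for the exact rules:

# Ask browsers and intermediate caches not to store sitemap XML responses
<IfModule mod_headers.c>
    <If "%{REQUEST_URI} =~ m#sitemap.*\.xml$#">
        Header set Cache-Control "no-cache, no-store, must-revalidate"
    </If>
</IfModule>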

    Additionally, if you have tried validating the Sitemap files and excluding them from caching but are still facing issues, it might be due to restrictions from your web host or a misconfiguration on the server. In such cases, I recommend reaching out to your web hosting provider for further assistance.

    I hope these steps help resolve the errors detected in your Sitemap files. If you have any more questions or need further assistance, please feel free to reach out to our support team directly here. We are here to help you.

    Thank you for bringing this concern to our attention.

    Sources:
    How to Fix the “Indexed, though blocked by robots.txt” Error » Rank Math
    How to Fix “Incorrect HTTP Header Content-Type” Error in Sitemaps » Rank Math
    Common robots.txt Issues & How to Fix Them » Rank Math
    How to Fix the ‘Blocked Due to Other 4xx Issue’ Error » Rank Math

    Hello,

    Thank you for contacting Rank Math support.

    Please share a screenshot of the error and URL of the affected sitemap so we can check and assist you further.

    Looking forward to helping you.

    Hello,

    I have updated the sensitive data as requested. Can you please check further?

    Thank you.

Unfortunately, the .htaccess file was deleted from the hosting and I changed caching plugins, so I had lost both modifications. Now I have both in place again.

I also have this rule, which has always been active, but it doesn’t seem to work, because without any other rules excluding the sitemap from the cache the problem persisted:

/**
 * Filter if XML sitemap transient cache is enabled.
 *
 * @param boolean $unsigned Enable cache or not, defaults to true
 */
add_filter( 'rank_math/sitemap/enable_caching', '__return_false' );

Is this PHP?

Would it be better if I excluded other things from the cache besides the sitemap? For a month I have had 0 impressions, but thanks to excluding the sitemap from the cache I had reached 400 keywords in the top 100. A modest result, but it is my record.

I am convinced that the work I have done is worth at least 50 clicks a day. Not a high expectation, but a highly probable one.

I have edited my previous replies. Could you read them, please? The problem is at https://giocone.com/sitemap_index.xml, but various online tools say it’s fine. I have now followed the guide to exclude the sitemap from the cache.

    Hello,

The data you shared in the sensitive data section is the .htaccess file, but that doesn’t help us identify the issue.

We need to know the exact error that Yandex is reporting so we can advise further. Given that external tools say the sitemap is fine, there may be an issue specific to Yandex, but without further data we cannot pinpoint it.

    Thank you.

There is no exact error from Yandex. I receive the emails, I go and validate the offending URL, and everything is fine. I have tried three times.

Let’s set the Yandex problem aside; at least it helped me realize that my sitemaps were once again not excluded from the cache.

Let’s go back to the cache problem instead: why do I have the feeling that everything is cached, even what shouldn’t be? As I’ve said more than once, Search Console struggles enormously to see the changes I make. It takes a year or more for Search Console to notice deleted URLs, just as if the problem came from the cache.

Unfortunately, the host has modified the .htaccess and robots.txt files several times, leaving me confused about what I did wrong. But I can assure you that I started gaining new keywords every day after adding the exclusions to .htaccess and to the cache plugin. When the host deleted the .htaccess file and added a new one, I lost all the keywords.

Now I no longer have Cloudflare, but Optimole serves my CSS and JS via its CDN. Unfortunately, you don’t have that plugin in your list. I was banned from WordPress.org because they don’t tolerate my memory problems: I kept asking the same questions and breaking the rules because of my poor memory. If only they had written rules for netiquette, it would be easier for me to follow them…

    Hello,

    Unfortunately, we cannot help with caching issues if you believe that to be the problem.

The only recommendation we make to people who feel their sitemap is not being updated consistently is to exclude the sitemap from caching by following this guide: https://rankmath.com/kb/exclude-sitemaps-from-caching/

    Looking at your server configuration file it seems that you have chosen to use the .htaccess option.

Any other areas of the website that could be affected by caching are not the responsibility of our plugin, so it’s not something we can assist you with.

As we mentioned in other tickets, we need to keep tickets focused on our plugin, as our team’s support doesn’t extend beyond that.

    Thank you.

    Did I enter the rule correctly to exclude the sitemap from the cache?

    Should I remove the rule I added as PHP?

/**
 * Filter if XML sitemap transient cache is enabled.
 *
 * @param boolean $unsigned Enable cache or not, defaults to true
 */
add_filter( 'rank_math/sitemap/enable_caching', '__return_false' );

    Hello,

Yes, that PHP code snippet is the correct way to disable caching for the sitemap.

You don’t need both this rule and the server configuration rule in your .htaccess file. One of them will suffice, and we recommend the transient-based one added via the PHP code snippet.

    Don’t hesitate to get in touch if you have any other questions.

Should I also remove the exclusions from the cache plugin? Or should I keep both the plugin exclusions and the PHP snippet for the sitemap? In the meantime, I am deleting the code added to .htaccess.

    Hello,

The exclusions in your caching plugin can be kept as a safety measure; only the rule in the server configuration file should be removed.
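For illustration, if your caching plugin happens to be WP Rocket (an assumption on our part, since the plugin in use wasn’t mentioned), such an exclusion can also be registered in PHP via WP Rocket’s rocket_cache_reject_uri filter. This is only a sketch; other caching plugins expose their own settings or hooks for this:

/**
 * Sketch: ask WP Rocket never to cache sitemap URLs.
 * The filter receives an array of URI regex patterns to skip.
 */
add_filter( 'rocket_cache_reject_uri', function( $uris ) {
    $uris[] = '/sitemap_index\.xml';   // the sitemap index
    $uris[] = '/(.*)sitemap(.*)\.xml'; // individual sitemap files
    return $uris;
} );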

    Don’t hesitate to get in touch if you have any other questions.

    Hello,

    Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘The robot detected errors in one or more Sitemap files of the site’ is closed to new replies.