Why was the sitemap last read 4 days ago? Is it normal for it to take such a long time?
Hello,
We apologize for the issue you are facing with your sitemap. It is not uncommon for Google to take some time to crawl and index sitemaps, but a delay of 4 days can be concerning depending on your site’s activity and updates.
Here are a few things to consider:
- Sitemap Submission: Ensure that your sitemap is correctly submitted to Google Search Console. If it’s not submitted, Google may not be aware of it.
- Site Activity: If there haven’t been significant changes or new content on your site, Google may not prioritize crawling your sitemap frequently.
- Crawl Budget: Google allocates a crawl budget for each site, which can affect how often your sitemap is read.
- Technical Issues: Check if there are any technical issues with your sitemap or website that might prevent Google from accessing it.
If you continue to experience issues, we recommend checking the sitemap URL directly in your browser to ensure it loads correctly. You can also refer to our article on why Google Search Console shows zero discovered URLs through sitemap for more insights.
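If you prefer to check this from the command line instead of the browser, here is a minimal sketch (Python standard library only; the sitemap URL below is a placeholder you would replace with your own) that fetches the sitemap and reports the HTTP status, content type, and size:

# Minimal sitemap accessibility check; SITEMAP_URL is a placeholder.
from urllib.request import Request, urlopen

SITEMAP_URL = "https://example.com/sitemap_index.xml"

req = Request(SITEMAP_URL, headers={"User-Agent": "sitemap-check/1.0"})
with urlopen(req, timeout=10) as resp:
    body = resp.read()
    print("HTTP status:", resp.status)  # anything other than 200 needs fixing
    print("Content-Type:", resp.headers.get("Content-Type"))  # should be an XML type
    print("Size (bytes):", len(body))

A 200 status with an XML content type is what Google needs in order to read the file; redirects, errors, or an HTML error page served at that URL would explain a stalled "last read" date.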
Thank you for raising this concern! If you have further questions or need assistance, please reply to this ticket, and a human agent will assist you.
Sources:
Why Google Search Console Shows Zero Discovered URLs Through Sitemap? » Rank Math
Hello,
Please note that Google assigns a crawl budget to your website depending on many factors (especially your posting frequency and domain authority), and that has a direct effect on how soon or how late your sitemap will be crawled.
However, if you want, you can remove the submitted sitemap and resubmit the
/sitemap_index.xml
in Search Console to see if that works for you.
Let us know how it goes. Looking forward to helping you.
Thank you.
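For what it's worth, the resubmission step can also be scripted through the Search Console API. The following is only a rough sketch, assuming the google-api-python-client and google-auth packages, a service account key file that has been granted access to the property, and placeholder URLs:

# Rough sketch: resubmit a sitemap via the Search Console API.
# Assumes a service account JSON key with access to the property;
# the key file name and both URLs below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters"],
)
service = build("searchconsole", "v1", credentials=creds)
service.sitemaps().submit(
    siteUrl="https://example.com/",
    feedpath="https://example.com/sitemap_index.xml",
).execute()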
I’m sorry to tell you that you haven’t solved these problems in 4 years, and I’m furious.
For example, I have posts that are not in the sitemap. I deactivated plugins and themes, but the result is the same. Please don’t give me another temporary solution; fix the bug… I’ve been waiting for 4 years! The site is beautiful, fast, and well made (considering I built it myself). I can’t do more than this to stop Google from removing the URLs from the SERPs, even though they are indexed.
Post sitemaps have duplicates. The number of posts in the sitemaps matches the number of published posts, but because the duplicates take up slots, there are posts missing from the sitemap, as you may have guessed.
I have been complaining about sitemap issues for 4 years. Please check further. There are probably errors I missed.
I followed your old advice to change the number of posts per sitemap and re-save the permalinks.
I fixed it, but the category pages will probably be reported by Ahrefs. I checked the sitemaps, and they don’t contain such pages.
I use a CDN that is not in the list. Which instructions on https://rankmath.com/kb/exclude-sitemaps-from-caching/ should I follow?
Hello,
You should check with your CDN’s support, as they will be in a better position to help you exclude the sitemap files from the cache.
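As a quick diagnostic in the meantime, you can inspect the response headers your sitemap URL returns to see whether a cache layer is answering for it. Here is a small sketch (header names vary per CDN; the ones below are just common examples, and the URL is a placeholder):

# Sketch: look for cache-related headers on the sitemap URL.
# Header names differ between CDNs; adjust the list for yours.
from urllib.request import Request, urlopen

SITEMAP_URL = "https://example.com/sitemap_index.xml"  # placeholder

req = Request(SITEMAP_URL, headers={"User-Agent": "cache-check/1.0"})
with urlopen(req, timeout=10) as resp:
    for name in ("Cache-Control", "Age", "X-Cache", "CF-Cache-Status"):
        print(name + ":", resp.headers.get(name))  # a cache HIT means a stored copy is being served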
To fix the sitemap duplication issue, please follow this step:
Flush the Sitemap cache by following this video screencast:
https://i.rankmath.com/pipRDp
If the issue persists, please share your site’s login details in the sensitive data section so we can check this further.
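If you would like to verify the duplicates yourself after flushing the cache, here is a minimal sketch (Python standard library; the index URL is a placeholder) that walks the sitemap index and reports any page URL listed more than once across the child sitemaps:

# Sketch: walk a sitemap index and report duplicate page URLs.
# INDEX_URL is a placeholder; point it at your own sitemap index.
from collections import Counter
from urllib.request import urlopen
from xml.etree import ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
INDEX_URL = "https://example.com/sitemap_index.xml"

def locs(url, tag):
    # Return the <loc> values under the given child tag of a sitemap file.
    tree = ET.parse(urlopen(url, timeout=10))
    return [el.text.strip() for el in tree.findall("sm:%s/sm:loc" % tag, NS)]

counts = Counter()
for child in locs(INDEX_URL, "sitemap"):  # child sitemaps listed in the index
    counts.update(locs(child, "url"))     # page URLs inside each child

for loc, n in counts.items():
    if n > 1:
        print("duplicate (%dx): %s" % (n, loc))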
Looking forward to helping you.
I submitted the sitemap to Search Console 2 days ago, and there has been no new read, despite my having published new posts.
Of course I’ve been following your advice for 4 years, but I don’t understand why the sitemap isn’t being read. I have to submit it to Search Console manually, and it doesn’t help with the rankings, since the URLs are always excluded despite being indexed.
Hello,
Googlebot is a very capable spider: once it has gained entry to your site, and assuming that the pages are interlinked in a sensible way, it is perfectly capable of indexing the entire site without a sitemap even existing.
According to Google:
“Google doesn’t check a sitemap every time a site is crawled; a sitemap is checked only the first time that we notice it, and thereafter only when you ping us to let us know that it’s changed. You should alert Google about a sitemap only when it’s new or updated; do not submit or ping unchanged sitemaps multiple times”.
Here’s a link for more information:
https://support.google.com/webmasters/thread/182027180/my-sitemap-is-not-being-read?hl=en
Looking forward to helping you.
This does not explain why the sitemap was not checked, nor why 20 posts were not indexed either…
Hello,
You are putting all the onus on the sitemap, which is important but neither crucial nor the only way Google discovers a website.
The Google spider is very capable at crawling websites; once it gains access to the website via the sitemap, it can discover other pages just by following the internal links on the website.
You can also check Google’s documentation on sitemaps, which clearly states that websites can be discovered easily if they are properly linked.
As for the posts not being indexed, Google does not guarantee that it will index every page; most of the time it only includes a selection of pages that it deems helpful.
An SEO plugin can help with the technical aspects of a website, and we do that very well. You can check that the sitemap we produce is accessible and passes validation just fine, otherwise, it would never be read.
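If you want to confirm that on your end, a minimal well-formedness check could look like the sketch below (not a full XSD validation; the URL is a placeholder):

# Sketch: confirm the sitemap parses as XML and uses the sitemap namespace.
from urllib.request import urlopen
from xml.etree import ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap_index.xml"  # placeholder

root = ET.parse(urlopen(SITEMAP_URL, timeout=10)).getroot()
print("root element:", root.tag)
print("sitemap namespace present:",
      root.tag.startswith("{http://www.sitemaps.org/schemas/sitemap/0.9}"))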
We have already explained this before, but we will state it again: if you want to drive traffic to your website and grow it, you need to think outside the box or invest millions in paid ads. The niche you are currently in is very competitive and has major players with millions to spend on advertising.
Don’t hesitate to get in touch if you have any other questions.
Hello,
Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.
If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.
Thank you.
The ticket ‘Last sitemap read’ is closed to new replies.