-
I think my site exceeds the crawler’s budget, which would explain why the scans are slow.
Do you have suggestions on how to improve the situation?
Also, how do I find out how many total requests are needed? Are they the requests I see in the Firefox network tool?
-
Hello,
I’m sorry to hear that you’re experiencing issues with your site’s crawl budget. To improve the situation, you can start by optimizing your internal linking structure to enhance the crawlability of your website. This will help ensure that valuable pages are crawled and indexed efficiently.
Additionally, keep an eye on the rel attributes of your internal links and avoid marking them nofollow, as this can signal to Google that certain pages are not important.
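For illustration, this is the pattern to avoid on internal links (the URL is a placeholder):

```html
<!-- Avoid: nofollow on an internal link discourages Google from following it -->
<a href="/pricing/" rel="nofollow">Pricing</a>

<!-- Prefer: a plain internal link that Google can crawl and weigh normally -->
<a href="/pricing/">Pricing</a>
```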
Regarding your question about total requests: the crawl budget is determined by Google based on various metrics (such as your server’s capacity and how much of your content is worth recrawling), and it is not directly related to the requests you see in the Firefox tool, which only shows what your browser loads for a single page.
To see how many requests Googlebot actually makes, use Google Search Console’s Crawl Stats report, which summarizes how your server responded to Google’s crawl requests over the last 90 days, including any server errors Google encountered on your site.
If you see a lot of failed requests there, it’s advisable to contact your host for support. Occasional short-lived server errors are normal, but an abnormal number of them in the Crawl Stats report should be addressed.
I hope this helps! If you have any further questions or need assistance, please feel free to reach out to our dedicated support team here. We’re available 24/7, 365 days a year.
Sources:
Discovered – Currently Not Indexed in Google Search Console » Rank Math
How to Fix Page Fetch Error Failed: Hostload Exceeded » Rank Math
Rank Math Sitemap Issues and Their Fixes » Rank Math
-
Hello,
Thank you for contacting us and sorry for any inconvenience that this issue might have caused.
Improving your website’s crawl budget involves optimizing some aspects of your site to ensure search engines can efficiently crawl and index your content. Below are some strategies that can help:
1. Prune Low-Quality Pages:
If your site has a lot of low-quality or thin-content pages, consider either improving them or removing them entirely. This can help ensure that Googlebot isn’t wasting time crawling these pages when it could be focusing on your higher-quality content.
2. Manage Duplicate Content:
Having lots of duplicate content on your site can waste your crawl budget, as Googlebot may end up crawling multiple versions of the same page. To manage this, ensure that every page on your site declares a single canonical URL.
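For example, if the same article is reachable at several addresses, each variant can point to one preferred version in its <head> (the URL below is a placeholder):

```html
<!-- Every duplicate variant points to the single preferred URL -->
<link rel="canonical" href="https://example.com/blog/crawl-budget-guide/" />
```

Rank Math also lets you set a canonical URL per post in the Advanced tab of its meta box, so you normally don’t need to edit templates by hand.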
3. Optimize Your Site’s Load Time:
A faster website lets crawlers access more pages in less time, so ensure your website is well optimized for speed. You can use Google’s PageSpeed Insights tool to identify areas for improvement.
4. Leverage the Robots.txt File:
You can use the robots.txt file to steer search engine bots away from pages that aren’t worth crawling. However, use this cautiously: robots.txt controls crawling rather than indexing, and an incorrect rule could stop essential pages from being crawled at all.
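As a sketch, a robots.txt along these lines keeps bots out of low-value areas; the paths are hypothetical, so adapt them to your own site:

```text
User-agent: *
Disallow: /cart/
Disallow: /search/
Disallow: /tag/

Sitemap: https://example.com/sitemap_index.xml
```

The Sitemap line points crawlers at your sitemap; Rank Math serves its sitemap index at /sitemap_index.xml by default.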
5. Improve Internal Linking:
Ensure every page on your site can be reached by internal links. Orphan pages (pages no internal link points to) are bad for your crawl budget, because Googlebot can only discover them through your sitemap or external links.
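If you want a rough way to spot orphan candidates, the sketch below compares the URLs listed in your sitemap against the URLs reachable by following internal links from the homepage. It assumes a small site with a plain (non-index) sitemap; the site URL is a placeholder, and it ignores trailing-slash and parameter variations, so treat the output as a starting point rather than a verdict:

```python
# Rough orphan-page finder: sitemap URLs minus link-reachable URLs.
import re
import urllib.request
from urllib.parse import urljoin, urlparse

SITE = "https://example.com"          # placeholder: your site root
SITEMAP_URL = SITE + "/sitemap.xml"   # placeholder: a plain, non-index sitemap

def fetch(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

# 1. Every URL the sitemap claims exists.
sitemap_urls = set(re.findall(r"<loc>(.*?)</loc>", fetch(SITEMAP_URL)))

# 2. Every URL reachable by following internal links from the homepage.
seen, queue = set(), [SITE + "/"]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        html = fetch(url)
    except Exception:
        continue  # unreachable or non-text page; skip it
    for href in re.findall(r'href="([^"#]+)"', html):
        full = urljoin(url, href).split("?")[0]
        if urlparse(full).netloc != urlparse(SITE).netloc:
            continue  # external link
        if full.endswith((".css", ".js", ".png", ".jpg", ".svg", ".ico")):
            continue  # asset, not a page
        queue.append(full)

# 3. Sitemap URLs never reached by internal links are orphan candidates.
for orphan in sorted(sitemap_urls - seen):
    print("possible orphan:", orphan)
```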
6. Correctly Use Redirects:
If a page has moved permanently, use a 301 (permanent) redirect rather than a 302 (temporary) one. Additionally, avoid redirect chains, as each extra hop consumes more of your crawl budget.
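For example, on an Apache server a clean permanent redirect in .htaccess looks like this (the paths are placeholders). If /old-page/ previously redirected to an interim URL that redirected again, collapse the chain so the old URL points straight at the final destination:

```apache
# One permanent (301) hop, no chain
Redirect 301 /old-page/ https://example.com/new-page/
```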
7. Keep Your XML Sitemap Updated:
An updated sitemap helps search engines understand the structure of your site and can guide them to your most important pages. Ensure that your sitemap is updated whenever you add new content.
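A minimal sitemap entry looks like the snippet below (the URL and date are placeholders). Rank Math keeps its sitemap up to date automatically when you publish, but if you maintain one by hand, the <lastmod> date is what signals that a page has changed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/new-post/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```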
8. Use Google Search Console:
In Google Search Console, you can see which pages Googlebot cannot crawl and why, and the Crawl Stats report shows how Googlebot interacts with your site.
Large site owner’s guide to managing your crawl budget:
https://developers.google.com/search/docs/crawling-indexing/large-site-managing-crawl-budget
Remember, improving your crawl budget is about making it as easy and efficient as possible for search engine bots to crawl and index your site’s content. This often involves a combination of technical SEO, content optimization, and site structure improvements.
2. As for the requests you see in Firefox: none of the ‘requests’ in the console screenshot you shared is related to Rank Math.
We hope that helps, and please don’t hesitate to get in touch if you have any other questions.
Hello,
Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.
If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.
Thank you.
The ticket ‘budget crawl’ is closed to new replies.