-
Hi, see attached. Why do we have 4 robots.txt files in Search Console? Please stop this immediately or advise how we can have only 1:
https://www.boxingnewsandviews.com
See attached, urgent. We have many sites; some use Rank Math, some do not. We can't give access to any sites, etc. We use Rank Math self-service, but we need to fix this robots.txt issue. We should only have one robots.txt file in Google Search Console.
Can't attach here, but we sent a separate email with the attachments. This is a mess; there should not be 4 robots.txt files. This is messing up one of our sites and needs to be fixed. We have reached out to the server company also.
-
Hello,
We apologize for the issue you are facing with multiple robots.txt files appearing in Google Search Console. It can be confusing to see multiple entries, and we understand how this can impact your site’s performance.
In general, a website should have only one active robots.txt file. If you are seeing multiple entries, it could be due to different configurations or caching issues. Here are a few steps you can take to address this:
-
Check Your Site’s Configuration: Ensure that your site is not configured to serve multiple robots.txt files. This can happen if there are conflicting settings in your WordPress setup or if multiple plugins are trying to manage the robots.txt file.
-
Use Rank Math: Since you mentioned using Rank Math, you can manage your robots.txt file directly through the Rank Math settings in your WordPress dashboard. Make sure that only one version is being used and that it is correctly configured.
-
Clear Cache: If you have caching plugins or server-side caching, clear the cache to ensure that the latest version of your robots.txt file is being served.
-
Test Your Robots.txt File: You can use the Robots Testing Tool to check what your current robots.txt file looks like and ensure it is functioning as expected.
-
Contact Your Hosting Provider: Since you mentioned reaching out to your server company, they may also be able to assist in identifying why multiple files are being served.
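The testing step above can be illustrated locally (this is an editorial sketch, not part of the original reply): Python's standard-library robots.txt parser can confirm what a given set of rules actually allows. Note that Python's parser applies rules first-match rather than Google's longest-match, which is why the more specific Allow line is listed first here.

```python
# Sketch: sanity-check robots.txt rules with the standard-library parser.
# The rules below mirror a typical WordPress robots.txt.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Crawlers may fetch the AJAX endpoint but not the rest of the admin area.
print(parser.can_fetch("*", "/wp-admin/admin-ajax.php"))  # True
print(parser.can_fetch("*", "/wp-admin/options.php"))     # False
print(parser.can_fetch("*", "/some-post/"))               # True
```

Running this against the file your server actually serves is a quick way to confirm the live rules behave as intended, independent of what any dashboard reports.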
If you continue to experience issues, please reply to this ticket, and a human agent from the Rank Math Support team will assist you further. Thank you for bringing this concern to our attention, and we hope to help you resolve it soon!
Sources:
Common robots.txt Issues & How to Fix Them » Rank Math
Hello,
Thank you for contacting Rank Math support.
We can see only one robots.txt file on your website, i.e.: https://www.boxingnewsandviews.com/robots.txt. Can you clarify which other robots.txt files you're referring to?
Please share the attachments here on this ticket. You can share images with us by uploading them to an image hosting site like Imgur, and then sharing the generated URL with us here.
We are looking forward to hearing back from you.
Hi, we emailed you at s******@r*********** as we can't attach files safely here and are not allowed to provide documents on a hosting site like you mentioned, so we emailed s******@r*********** with the attachments. Our hosting company got back to us saying there is only 1 file but 4 different versions of it, which is the usual. However, we also uploaded the file to the root directory of our server overnight to make sure it is read.
Hello,
You can upload screenshots using this tool and add the link here.
However, this happens when your Search Console property is verified as a Domain property. In that case, Google checks the HTTPS, HTTP, WWW, and non-WWW versions of your site, so four robots.txt entries appear.
This case should be fine as per Google’s support forum here:
https://support.google.com/webmasters/thread/248370211/multiple-robots-txt-files-found-in-google-search-console-s-robots-txt-tester?hl=en
Looking forward to helping you.
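To make the four entries concrete (an editorial sketch using the domain from this thread), these are the scheme/host variants that a Domain property covers:

```python
# A Domain property in Search Console covers every scheme/host variant,
# which is why one physical robots.txt can surface as four entries.
DOMAIN = "boxingnewsandviews.com"

variants = [
    f"{scheme}://{host}/robots.txt"
    for scheme in ("https", "http")
    for host in (f"www.{DOMAIN}", DOMAIN)
]

for url in variants:
    print(url)
```

Fetching each variant (e.g. with curl) and confirming the content is identical is a quick way to verify that only one real file sits behind the four entries.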
This issue was fixed. The correct robots.txt is uploaded to the root folder of the server and is no longer writable, as it is in the correct place. Please close the ticket. Regards.
Hello,
We understand that this issue has been resolved, but please note that when you upload your robots.txt file to your website root, you will no longer be able to edit it using Rank Math. If you want to edit the file via Rank Math, then you'd need to delete the robots.txt file in your server root.
We hope this helps clarify the issue. Please let us know if you have any other questions or concerns.
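The physical-versus-virtual distinction in the reply above comes down to a simple existence check (an editorial illustration; the precedence rule assumed here is that WordPress serves a plugin-generated robots.txt only when no file exists on disk):

```python
import tempfile
from pathlib import Path

def active_robots_source(site_root: Path) -> str:
    # A physical robots.txt in the site root takes precedence; only when
    # it is absent can a plugin serve its virtual file.
    return "physical" if (site_root / "robots.txt").exists() else "virtual"

# Demonstrate both states in a throwaway directory standing in for the root.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    print(active_robots_source(root))  # virtual
    (root / "robots.txt").write_text("User-agent: *\nDisallow: /wp-admin/\n")
    print(active_robots_source(root))  # physical
```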
We never have a reason to edit robots.txt; it always keeps the same standard content, as follows.
We don't see why we would ever edit this. Everything else is custom code in .htaccess and other places, etc., but robots.txt never changes for us.
Having it in the root folder, where it is now, seems to make more sense. Here is the content of the file on boxingnewsandviews.com:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.boxingnewsandviews.com/sitemap_index.xml
*We have never had to edit the robots.txt file before, and we're not sure why we would in the future. We have never edited the standard file.
Hello,
That is fine. You can continue using the robots.txt file in your server root.
Let us know if you need help with anything else.
Thanks. We have a few sites; some use Rank Math, some don't. But as regards this sports one, boxingnewsandviews.com, any quick tips you can offer are appreciated. We turned off indexing for search results due to thin content, but we did turn on indexing for pagination and date archives. The reason: we have over 11,000 posts, and this site was relaunched in the last few months with a huge amount of schema.org Organization and author schema implemented, as well as a lot of new 301 redirects from old pieces of content lost in a migration to similar pieces of content on the site. So there's a lot going on. We indexed date archives and pagination pages to let Google crawl everything and give it a full picture of the site. Some very big sites our authors have also written for are included in their schema on boxingnewsandviews.com, along with a lot of Organization schema on social media and other platforms, plus a lot of video content on YouTube and Humix, all new stuff, so this all takes time. We also acquired a boxing domain recently that previously carried a lot of rank in Google and links, and last week after the acquisition we 301-redirected doransboxingblog.com to boxingnewsandviews.com. A million other good things too, but as you know, this all takes a lot of time. If Rank Math goes well on this site, we will look at putting it on other sites and at the PRO platform in 2025. Regards.
Hello,
It sounds like you're doing a lot to optimize the site. We can see that doransboxingblog.com redirects to boxingnewsandviews.com, which is a good move.
Since you’ve already implemented schema and redirects, one quick tip is to monitor the indexing of your archives and pagination pages to ensure they’re not diluting your site’s authority. You might also consider using the Rank Math SEO Analysis tool to identify any overlooked optimization opportunities.
Please feel free to reach out to us again in case you need any other assistance.
We are here to help.
Thank you.
Thanks, yes, the analyzer came back with a 90 out of 100 score. Yes, pagination and date archives are indexed to allow Google to understand the overall structure, and yes, doransboxingblog.com is owned by us again and redirects to boxingnewsandviews.com, thankfully. Thanks for the tips.
Do you recommend disabling indexing for date archives and pagination on a site as large as boxingnewsandviews.com, with over 11,000 posts? We think indexing them is okay, but your thoughts are appreciated.
Hello,
Regarding your question about indexing date archives and pagination: for a site as large as yours, with over 11,000 posts, it's generally a good idea to carefully manage what gets indexed. Indexing date archives and pagination can help Google understand your site structure, but it might also lead to thin content pages being indexed, which can dilute your site's overall authority.
One approach could be to monitor the performance of these indexed pages in Google Search Console. If you notice that they aren’t driving significant traffic or engagement, you might want to consider noindexing them to focus the crawl budget on your more valuable content.
However, if these pages are helping Google get a clearer picture of your site structure, and you see a positive impact on indexing and ranking, then keeping them indexed could be beneficial. The decision ultimately depends on your specific site goals and performance.
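One concrete way to run the check suggested above (an editorial sketch; the HTML string stands in for a fetched date-archive page, and the meta tag shown is the standard form SEO plugins such as Rank Math emit) is to look for a `noindex` directive in the robots meta tag:

```python
# Sketch: detect whether a page is noindexed by reading its robots meta tag.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_content = attrs.get("content", "")

# Stand-in for the HTML of a date-archive page (fetching is omitted here).
archive_html = '<html><head><meta name="robots" content="index, follow"></head></html>'

p = RobotsMetaParser()
p.feed(archive_html)
is_noindexed = p.robots_content is not None and "noindex" in p.robots_content
print(is_noindexed)  # False: this archive page is still indexable
```

Running a check like this across a sample of archive and pagination URLs makes it easy to confirm that the indexing settings chosen in the plugin are actually reaching the rendered pages.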
We hope that helps.
Thanks.
Thanks, you know, we see completely where you are coming from. Usually, noindexing pagination and date archives is what most SEOs recommend. However, upon testing, we are seeing positive results from actually indexing them with Google. The reason is that this website was only restarted and relaunched at the end of April this year after some time off, so indexing them is genuinely helping Google understand the site architecture. We were also able to make sure not a single orphaned post exists any longer on the site, which seems to be having a positive impact, reducing bounce rate and improving user experience at the same time. Best regards.
-
The ticket ‘why do we have 4 robots.txt files in search console have reached out to server’ is closed to new replies.