Hello,
We apologize for the issue you are facing. To prevent crawlers from crawling your staging site, you can add the following rule to your robots.txt file:
User-agent: *
Disallow: /
This rule will prevent all well-behaved search engine bots from crawling any pages on your staging site. Please note that while this is not recommended for live websites, it is suitable for a staging environment where you want to avoid indexing duplicate content.
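As a quick sanity check, you can confirm what this rule does locally using Python's built-in urllib.robotparser module, which applies the same matching logic crawlers use (the staging URL below is a hypothetical placeholder, not your actual site):

```python
from urllib.robotparser import RobotFileParser

# The exact rule suggested above, as it would appear in robots.txt
rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /" under "User-agent: *", no crawler may fetch any URL.
# "staging.example.com" is a placeholder for your staging domain.
print(parser.can_fetch("Googlebot", "https://staging.example.com/any-page"))  # False
print(parser.can_fetch("*", "https://staging.example.com/"))                  # False
```

If either call printed True, the rule was not parsed as intended; with the rule above, both return False for every path on the site.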
If you need guidance on how to edit your robots.txt file using Rank Math, you can refer to this guide: How to Edit robots.txt with Rank Math.
Thank you for raising this concern! If you have any further questions or need assistance, feel free to reply to this ticket, and a human agent will assist you further.
Sources:
What Are the Best Robots.txt File Rules for Multisite? » Rank Math
Common robots.txt Issues & How to Fix Them » Rank Math
This is an automated reply from Rank Math Support Bot. If you need further assistance, please reply to this topic and a human support agent will reply to you shortly.
Hello,
Since we have not heard back from you for 15 days, we are assuming that you found a solution, and we are closing this support ticket.
If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.
Thank you.