Hello,
We apologize for the issue you are facing. To restrict access to your sitemap so that only search engine bots can reach it, you can use the robots.txt file to disallow user agents from crawling the sitemap URL. Keep in mind, however, that robots.txt is advisory: compliant crawlers obey it, but it does not actually block human visitors or scrapers that ignore it. Disallowing all user agents would also prevent search engines from crawling and indexing your sitemap, which they need in order to understand the structure of your website, so this approach can hurt your SEO.
Here is an example of how you might configure your robots.txt file:
User-agent: *
Disallow: /sitemap_index.xml
This rule asks all compliant crawlers, including search engines, to stay away from the sitemap. It is not a recommended practice, as it can hinder your site’s visibility in search engines, and it will not stop scrapers that simply ignore robots.txt.
If you want to allow only specific bots while blocking others, you would need to specify those bots in your robots.txt file, but this can get complicated and may not be foolproof.
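For illustration, a robots.txt sketch of that approach might look like the following. The sitemap path /sitemap_index.xml and the two crawlers named here are just example choices, and bots that spoof their user agent string or ignore robots.txt will not be stopped:
# Hypothetical example: invite only Googlebot and Bingbot to crawl the sitemap.
User-agent: Googlebot
Allow: /sitemap_index.xml

User-agent: Bingbot
Allow: /sitemap_index.xml

# All other crawlers are asked not to crawl it.
User-agent: *
Disallow: /sitemap_index.xml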
For more detailed guidance on configuring your robots.txt, you can refer to our article on robots.txt rules for multisite.
Thank you for raising this concern! If you have further questions or need assistance, feel free to reply to this ticket.
Sources:
What Are the Best Robots.txt File Rules for Multisite? » Rank Math
Hello,
Thank you for contacting support.
That is not recommended, nor is it possible using our plugin. The sitemap, being a crawling and discovery mechanism, should be freely accessible.
Don’t hesitate to get in touch if you have any other questions.
This is how scrapers can abuse it! It does not seem right that the sitemap is available to everyone. Can you suggest the best way to handle this?
Hello,
Using services like Cloudflare, you can implement rate limiting and block known bad bots or suspicious IP addresses. Cloudflare also offers a bot management feature that can help differentiate between good bots (like search engines) and malicious scrapers.
Cloudflare’s Web Application Firewall (WAF) can add an extra layer of security by blocking malicious traffic before it reaches your site. You can configure rules specifically to protect your sitemap.
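As a rough sketch only (the field names come from Cloudflare’s rules expression language, and the path is just an example), a custom WAF rule protecting the sitemap could match on an expression like:
(http.request.uri.path eq "/sitemap_index.xml") and not cf.client.bot
with the action set to Block or Managed Challenge. Here, cf.client.bot is true for requests Cloudflare identifies as known good bots, such as major search engine crawlers, so verified search engines pass through while other clients are challenged or blocked. A rate limiting rule scoped to the same path can additionally throttle aggressive scrapers that slip through.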
While these methods can help mitigate scraping, keep in mind that completely hiding your sitemap from everyone except search engines is not practical or recommended, as it can affect your SEO.
Hope that helps, and please do not hesitate to let us know if you need our assistance with anything else.
Hello,
Since we have not heard back from you in 15 days, we assume that you found a solution, and we are closing this support ticket.
If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.
Thank you.