Hello,
Thank you for your query and we are so sorry about the trouble this must have caused.
Rank Math doesn’t create a physical robots.txt file; instead, the robots.txt is generated on the fly when you visit it, like so: https://mydomain.com/robots.txt
The robots.txt can be edited via our General Settings. Please refer to this guide: https://rankmath.com/kb/how-to-edit-robots-txt-with-rank-math/
Looking forward to helping you.
Is it a problem not to have a robots.txt file on the server?
Hello,
A robots.txt file is not a requirement, but it does help search engines know which files/content you want to be crawled and which you don’t.
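For example, a minimal robots.txt could look like this (the /private/ path below is just a placeholder for illustration):

# Rules for all crawlers
User-agent: *
# Ask crawlers not to crawl this directory
Disallow: /private/
# Everything else may be crawled
Allow: /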
Hope that helps. Please let us know if you have questions.
My hosting provider created a robots.txt file. Is the solution on the server better, or the one from Rank Math?
The hosting provider’s robots.txt is the more permissive one, but the sitemap URL is missing.
Hello,
Thank you for providing additional information.
Rank Math creates the robots.txt dynamically, and you can edit it from WP Dashboard > Rank Math > General Settings – Edit robots.txt. You can also have a physical file on the server; both work exactly the same way. The dynamic robots.txt is created on the fly by WordPress / Rank Math, so you don’t have to edit anything on the server. Keep in mind that if a physical robots.txt file is present, the server will serve that file instead of the dynamic one, so it’s purely your choice which way you want to go.
I would prefer the dynamic method, as it’s less error-prone.
Hope that helps. Please let us know if you have questions.
Thank you
Actually, Yandex has been flagging errors in the robots.txt for a while. That’s why I asked the hosting provider for clarification.
Hello,
Please share the error message and the robots.txt link so we can check it as well.
Meanwhile, here are the robots.txt rules we recommend (they work whether added through Rank Math or in a physical file on your server):
https://rankmath.com/kb/how-to-edit-robots-txt-with-rank-math/#default-rules
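For reference, a common set of such default rules for a WordPress site looks roughly like the following (please check the linked page for the exact recommendation; the sitemap URL is only an example):

# Rules for all crawlers
User-agent: *
# Keep crawlers out of the WordPress admin area
Disallow: /wp-admin/
# But allow the AJAX endpoint, which some front-end features rely on
Allow: /wp-admin/admin-ajax.php

# Point crawlers to the XML sitemap (replace with your own domain)
Sitemap: https://mydomain.com/sitemap_index.xml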
Looking forward to helping you.
Thank you.
This is the error on Yandex. https://postimg.cc/5YPskNYK
I suspect that this bug is also causing problems for other search engines.
Hello,
The robots.txt file itself looks perfectly fine. However, as shown in the image, there is a second User-agent: * directive at line 7 of your file, and the report flags it again further below. Could you please delete that duplicate line and check if that resolves the issue?
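For illustration only, a duplicated group like the one sketched below is what typically triggers that kind of warning (the rules here simply mirror your file; the exact content may differ):

# Problematic: a second “User-agent: *” group appears further down the file
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

User-agent: *
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json

# Fixed: keep a single “User-agent: *” group and list all rules under it
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json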
Please check the sensitive data for the reference image.
Hope that helps, and please do not hesitate to let us know if you need our assistance with anything else.
Thank you.
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json
Sitemap: https://giocone.com/sitemap_index.xml
Hello,
Yes, you should use that format and remove the duplicate lines.
Hope that helps.
Thank you.
What do you mean by “Yes, you should use that format and remove the duplicate lines”?
I’m using the robots.txt I wrote.
Hello,
You have shared a different robots.txt configuration here, and it is also a valid one.
If you wish to remove the errors reported by Yandex, please apply the configuration you have shared.
Hope that helps, and please do not hesitate to let us know if you need our assistance with anything else.