robots.txt not valid

#194524
  • Anas
    Rank Math business

    Hello,

    Thank you for contacting Rank Math, and sorry for any inconvenience caused.

    I checked your robots.txt file and found an incorrect entry: Domain, which is not a valid robots.txt directive.

    Rank Math generates a virtual robots.txt file if you do not have an actual robots.txt file in the website root directory.

    To manage the robots.txt file through Rank Math, please log in to your web server via FTP and delete the robots.txt file from the root folder.

    It is the same folder where you have your wp-content, wp-admin, and other WordPress folders.

    Once you delete the robots.txt file, Rank Math’s virtual robots.txt file will take over and you will be able to edit the robots.txt file.

    Paste the text below into the robots.txt file:

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php
    
    Sitemap: https://thirdeyetraveller.com/sitemap_index.xml
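
    If you want to double-check the file after saving it, the sketch below (my own quick check, not something Rank Math requires) uses Python's built-in urllib.robotparser to confirm that the live file parses and that the rules behave as expected:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt file.
    rp = RobotFileParser("https://thirdeyetraveller.com/robots.txt")
    rp.read()

    # A generic crawler should be allowed on normal pages but blocked from /wp-admin/.
    print(rp.can_fetch("*", "https://thirdeyetraveller.com/"))          # expected: True
    print(rp.can_fetch("*", "https://thirdeyetraveller.com/wp-admin/")) # expected: False

    If read() fails or the results look wrong, whatever is serving the file (a cache or otherwise) is not returning the version shown above.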

    Also, your sitemap is redirecting to the homepage. Please follow the steps below (there is a quick check after them that you can use to verify the result):

    1. Flush the Sitemap cache by following this video screencast:
    https://i.rankmath.com/pipRDp
    2. Exclude the Rank Math plugin's sitemap files from your caching plugin. The cache could come from a plugin or from the server; for caching plugins or Cloudflare, please follow this article:
    https://rankmath.com/kb/exclude-sitemaps-from-caching/
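
    To verify the result independently of your browser cache, a minimal check like the one below (plain Python standard library; any HTTP client would do) shows what the server actually returns for the sitemap URL. A 200 status with an XML content type means the sitemap is being served; a 301/302 with a Location header pointing at the homepage means it is still redirecting:

    import http.client

    conn = http.client.HTTPSConnection("thirdeyetraveller.com")
    conn.request("HEAD", "/sitemap_index.xml", headers={"User-Agent": "sitemap-check"})
    resp = conn.getresponse()

    # http.client does not follow redirects, so a redirect is visible directly.
    print(resp.status)                     # 200 = sitemap served, 301/302 = still redirecting
    print(resp.getheader("Content-Type"))  # expect an XML type when it works
    print(resp.getheader("Location"))      # populated only on a redirect
    conn.close()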

    That should fix the issue. Please let me know if that does not. We are here to assist.

    Hi Anas,

    I’ve had Rank Math for a while, so I don’t have a robots.txt file in my host’s file manager. I’ve just double-checked my root files over FTP through SiteGround. Rank Math should be generating my sitemap.

    I’ve followed your screencast video and re-entered the text you provided into my Rank Math settings.

    I’ve flushed the cache and checked my WP Rocket plugin – I can see I already had some cache-exclusion rules in the advanced settings, so the sitemap is not cached:

    /(.*)sitemap(.*).xml
    /(.*)sitemap.xsl
    /sitemap_index.xml
    /post-sitemap1.xml
    /post-sitemap2.xml
    /post-sitemap3.xml
    /page-sitemap.xml

    Even after doing this, I’ve checked in an incognito window and my sitemap is still redirecting to my homepage.

    I’ve just run another Core Web Vitals (Lighthouse) test and it’s still showing these errors:

    – robots.txt file is not valid
    – Links are not crawlable

    Please advise?

    Thank you
    Sophie

    Anas
    Rank Math business

    Hello,

    We might need to take a closer look at the settings. Please edit the first post on this ticket and include your WordPress & FTP logins in the designated Sensitive Data section.

    It is completely secure and only our support staff has access to that section. If you want, you can use the below plugin to generate a temporary login URL to your website and share that with us instead:

    https://wordpress.org/plugins/temporary-login-without-password/

    You can use the above plugin in conjunction with the WP Security Audit Log to monitor what changes our staff might make on your website (if any):

    https://wordpress.org/plugins/wp-security-audit-log/

    We really look forward to helping you.

    Hello,

    I have updated the sensitive data as requested. Can you please check further?

    Thank you.

    Hello,

    Thank you for that information, and apologies for the delay.

    I’ve checked your site, and it looks like the issue is related to caching on your site.

    I have deleted the cache and re-saved your virtual robots.txt file and the changes are saved.
    (screenshot: robots.txt)

    Please also clear your server-related cache if the issue still persists on your end.

    I hope that helps. Thank you, and please don’t hesitate to contact us anytime if you need further assistance with anything else.

    Hi Reinelle,

    Thanks for this and for trying that.

    As I explained above, I have already added some exclusions in my WP Rocket cache for the robots.txt file.

    I’ve just run another test and it’s still saying the robots.txt is not valid and links are not crawlable.

    Thanks, Sophie

    Anas
    Rank Math business

    Hello,

    We will need your FTP logins to check this further.

    Please add the FTP login details in the sensitive data section.

    Looking forward to helping you.

    Hello,

    I have updated the sensitive data as requested. Can you please check further?

    Thank you.

    Hello,

    Thank you for the follow-up.

    We’re getting this message when trying to log in to your SiteGround account:
    (screenshot: SiteGround login)

    In case we cannot proceed with logging into your account, you can follow the steps in this tutorial on how to create an FTP account in SiteGround:
    https://www.siteground.com/tutorials/ftp/accounts/

    I hope that helps.

    Thank you, and looking forward to helping you.

    Hello,

    I have updated the sensitive data as requested. Can you please check further?

    Thank you.

    Hello,

    Thank you for keeping in touch with us.

    I tried accessing your FTP and I wasn’t able to because it returns an error. Please refer to the screenshot I shared in the Sensitive Data section. Could you please check again so we can further investigate the issue?

    We are looking forward to helping you.

    Hello,

    I have updated the sensitive data as requested. Can you please check further?

    Thank you.

    Hello,

    Thank you for keeping in touch with us.

    I checked your robots.txt file while the admin is logged in, and the correct file loads. However, if I load the robots.txt file in a new browser, the old robots.txt file shows up: https://i.rankmath.com/sApv46

    This suggests that there is other software on your server that is still aggressively caching your robots.txt file. Could you please contact your host about this and see if they are able to help? You may also ask to clear the Cloudflare cache if you use Cloudflare on your server.
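
    As a rough way to narrow down which layer is involved, the sketch below requests robots.txt directly and prints a few headers that caching layers commonly add (cf-cache-status for Cloudflare, x-cache or Age for many server caches; these names are common conventions and may not all be present on your host), along with the start of the body actually being served:

    import http.client

    conn = http.client.HTTPSConnection("thirdeyetraveller.com")
    conn.request("GET", "/robots.txt", headers={"User-Agent": "robots-check"})
    resp = conn.getresponse()

    # Headers that caching layers commonly add; None just means the header is absent.
    for name in ("cf-cache-status", "x-cache", "x-proxy-cache", "age", "cache-control"):
        print(name, "=", resp.getheader(name))

    # The first part of the robots.txt content actually being served to visitors.
    print(resp.read().decode("utf-8", errors="replace")[:200])
    conn.close()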

    Let us know how it goes. We are looking forward to helping you with this one.

    Hi there,

    I’ve contacted SiteGround, and it appears that it was my Dynamic Cache that was causing this issue.

    This is a cache for non-page elements, like server files, handled by NGINX.

    So, in future, if other SiteGround users get the “robots.txt file is not valid” issue…

    They need to make sure that they flush the Dynamic Cache in SiteGround > Site Tools > Speed > Caching > Dynamic Cache > Flush (the paintbrush icon) 🙂

    Or, they can disable Dynamic Caching via the SG Optimizer plugin.

    I didn’t know it was even a thing until today 🙂

    Thank you for all your help!!!!

    Google Lighthouse now says the Robots.txt file is valid again!

    Sophie

    Thank you again

