Find problems in robots.txt

#609367
  • Resolved Giuliano Ciolino
    Rank Math free

    I’ve been struggling to get traffic from Google for two years. I recently edited the robots.txt file; previously I had not added the wildcard * to the end of the URL, and I was getting a 404 error for the URL containing the sitemap. Google, Bing, and Yandex all said the robots.txt was written well and that no crawlers were blocked. But as you may have understood, there was a serious mistake. Once I added the wildcard * to the end of the URL, the sitemap URL started working.

    So do I have problems in robots.txt, as I have said repeatedly? A single missing symbol is enough to create problems. Could you check whether I wrote everything correctly? You told me yes, but evidently that’s not the case.
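For context on the wildcard question, assuming the rule in question was a Disallow line for the login page: per Google's robots.txt documentation, every Allow/Disallow path value is already treated as a prefix match, so a trailing * should not change which URLs a rule covers. A minimal illustration:

```
# Per Google's robots.txt spec, a path value is a prefix match,
# so these two rules block the same set of URLs:
Disallow: /wp-login.php
Disallow: /wp-login.php*
```

If adding the * coincided with the sitemap URL starting to work, something else (for example the file being regenerated) may have changed at the same time.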

Viewing 3 replies - 1 through 3 (of 3 total)
  • Alhan
    Rank Math free

    Hello,

    Thank you for your query, and we are so sorry about the trouble this must have caused.

    I understand your concerns about the robots.txt file on your site, and I’m here to help.

    Your shared text shows some discrepancies between the initial and current robots.txt files on your site.

    The version you posted in the sensitive data section contains rules to disallow crawling of your /wp-admin/ directory and /wp-login.php?* URLs.

    However, the current version live on your site does not appear to include these rules.
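For reference, a robots.txt containing the two rules described above would look something like this sketch (the exact file is in the sensitive data section and may differ):

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php?*
```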

    Regarding the sitemap URL, adding a wildcard at the end of the URL should not influence the ability of search engines to crawl the sitemap.

    The 404 error suggests that the sitemap was not found, but it’s more likely an issue with the location or generation of the sitemap than with the robots.txt file.

    You mentioned that you added a new rule to disallow the crawling of a specific .json file in the wp-content/uploads/ directory.

    It’s okay to have specific rules like this as long as they are intentional.

    Now, I’ll comment on your current robots.txt (check sensitive data section).

    This set of rules tells all crawlers that they are allowed to access /wp-admin/admin-ajax.php, and they are not allowed to crawl the .json file in wp-content/uploads/. The Sitemap directive is also pointing crawlers to the correct sitemap.
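Based on that description, the current file would look roughly like the sketch below. The domain, the .json filename, and the sitemap URL are placeholders here, since the real values are in the sensitive data section:

```
User-agent: *
Allow: /wp-admin/admin-ajax.php
# Placeholder name; the real .json path is in the sensitive data section.
Disallow: /wp-content/uploads/example.json

# Placeholder URL for the sitemap directive.
Sitemap: https://example.com/sitemap_index.xml
```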

    Overall, your current robots.txt rules are acceptable if they align with your goals.

    Remember, it’s not a strict requirement to block crawling of the /wp-admin/ directory or /wp-login.php, but it is common practice to do so for WordPress sites.

    If you are still facing indexing issues, factors beyond the robots.txt file may influence this. For example, on-page SEO factors, meta tags, backlink profiles, quality of content, server issues, etc., could affect search engine rankings and visibility.

    Use Google Search Console to further diagnose any crawling or indexing issues that might be present on your site.

    This tool can provide more in-depth information on how Google interacts with your site.
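As a quick local sanity check alongside Search Console, Python's standard-library `urllib.robotparser` can evaluate a set of robots.txt rules against specific URLs. This sketch uses placeholder rules and a placeholder domain, not the actual file from this thread:

```python
from urllib import robotparser

# Placeholder rules modeled on a typical WordPress robots.txt;
# substitute the real file fetched from https://yoursite.com/robots.txt.
RULES = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# Note: Python's parser applies the first matching rule, so the Allow
# line must precede the broader Disallow for admin-ajax.php to pass.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))     # False
print(rp.can_fetch("Googlebot", "https://example.com/some-page/"))               # True
```

This only checks rule matching; it does not tell you how Google actually crawled the site, which is what Search Console reports.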

    We hope that helps, and please don’t hesitate to get in touch if you have any other questions.

    Thank you.

  • Giuliano Ciolino
    Rank Math free

    I put the previous robots.txt back, since writing it is a job for experts. I’ve noticed how much influence and instability came from not putting the asterisk at the end of the login-page URL.

    As for my SEO shortcomings, I remember reading several times that people who work on SEO spend most of their time on the titles that appear in the SERP. From today onward, I will spend more time on this aspect. I made this decision partly because I had 1,000 impressions in 7 months but only a 2.3% click-through rate.

    Until now I hadn’t given enough importance to SERP titles, since I thought a title like “play super mario bros game” was enough. In the past I had also added “online” and “free”, but to no avail; I most likely removed those terms after a short time.

    Alhan
    Rank Math free

    Hello,

    It looks like you marked this ticket as resolved.

    Could you please confirm whether we can close this ticket, or do you still need assistance?

    We look forward to helping you.

    Thank you.


The ticket ‘Find problems in robots.txt’ is closed to new replies.