Indexed though blocked by robots.txt feed pages

#392826
  • Resolved
    Miettim
    Rank Math Free

    Hello! About a year ago I noticed that Google had indexed my RSS feed pages. I was worried they might cannibalize the “normal pages”, so I added “Disallow: */feed/” to robots.txt.
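    For reference, the rule sits in my robots.txt roughly like this (I’m assuming here it goes under the catch-all user-agent block):

        User-agent: *
        Disallow: */feed/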

    The feed pages disappeared from Google, but now, after a long time, Google has suddenly given me a warning about this.

    So my question is: Can this warning impact my rankings?
    Do RSS feeds even cannibalize other pages if they get indexed, or is that normal, or even a good thing?

    If it’s not good and this warning can impact my search rankings, how could I solve it?

    Here is an image of the error: https://ibb.co/2ZNSMTv

    Thank you 🙂

  • Hello,

    Thank you for contacting Rank Math support, and sorry for any inconvenience this may have caused.

    RSS feed pages will not improve your rankings, as their content is not unique. You can continue to hide them from the crawlers.

    To address the issue in your Google Search Console, please refer to our guide here: https://rankmath.com/kb/indexed-though-blocked-by-robots-txt-error/#pages-not-to-index

    Hope that helps and please do not hesitate to let us know if you need our assistance with anything else.

    Hey, thanks for the answer. I can’t find anything in the tutorials about how to add a “No Index Robots Meta” tag to all RSS feeds. All I can find is adding “Disallow: */feed/” to robots.txt, which I have already done, but that is exactly what gives me the error.

    Or did I miss something? Can I even add a noindex robots meta tag to feed pages?

    Nigel
    Rank Math Business

    Hello,

    Robots meta tags can’t be added to RSS feed pages.

    You can look into using the “X-Robots-Tag” in the .htaccess/nginx config file of your website. The X-Robots-Tag lets you add a noindex directive to the HTTP response header of any page or file served from your website, depending on the conditions set in the .htaccess file.
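    As a rough sketch (assuming your feed URLs end in /feed/, and that mod_setenvif and mod_headers are enabled on Apache), the rule could look something like this; the nginx part is the equivalent idea for that server, not something to copy as-is:

        # Apache (.htaccess) - flag requests whose path ends in /feed or /feed/
        SetEnvIf Request_URI "/feed/?$" IS_FEED
        # Send a noindex header only for those flagged requests
        Header set X-Robots-Tag "noindex, follow" env=IS_FEED

        # nginx - inside the relevant server { } block; note that this location
        # must still handle the request, so keep your usual try_files /
        # fastcgi_pass lines inside it as well
        location ~* /feed/?$ {
            add_header X-Robots-Tag "noindex, follow" always;
        }

    Keep in mind that for Google to see this header, the “Disallow: */feed/” rule would need to be removed from robots.txt, since Google cannot read a noindex directive on URLs it is not allowed to crawl.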

    However, it may also be worth considering whether you want the feed excluded from the index at all. For example, if you publish podcasts, the feed is what gets you listed on streaming platforms, and in that case it is safer to leave your feed indexable.

    Hope that helps. If you have questions, do not hesitate to ask.

    Hello,

    Since we have not heard back from you for 15 days, we are assuming that you found the solution, so we are closing this support ticket.

    If you still need assistance, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Indexed though blocked by robots.txt feed pages’ is closed to new replies.