Hello,
I apologize for the issue you are facing with the ‘Duplicate without user-selected canonical’ error in Google Search Console for your feed pages. While the Rank Math blog articles provide guidance on fixing this error for regular pages, the same principles can be applied to feed pages.
To address this error, you can set a canonical URL with Rank Math. The canonical URL tells search engines which version of a page is the preferred one to index, which resolves the duplicate-content signal.
Here’s how you can set a canonical URL for your feed pages using Rank Math:
- Access the Rank Math meta box for the feed page.
- Look for the Canonical URL field within the meta box.
- Enter the preferred canonical URL for your feed page.
- Save your changes.
Once set, the canonical URL guides search engines on which version of the page to treat as the original source.
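For feed URLs that don't expose the meta box (WordPress-generated feeds usually don't), the canonical can also be set programmatically. Here is a minimal sketch using Rank Math's rank_math/frontend/canonical filter; the feed check and the choice of target URL are assumptions you would adapt to your setup:

add_filter( 'rank_math/frontend/canonical', function ( $canonical ) {
    // On feed requests, point the canonical at the parent content page.
    // is_feed(), get_queried_object_id(), and get_permalink() are core
    // WordPress functions; the target URL chosen here is an assumption.
    if ( is_feed() ) {
        $permalink = get_permalink( get_queried_object_id() );
        return $permalink ? $permalink : $canonical;
    }
    return $canonical;
} );

You could place this snippet in a small must-use plugin or your theme's functions.php file.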
If you encounter any difficulties or need further assistance while setting the canonical URL for your feed pages, please feel free to reply here, and our support team will assist you promptly.
Thank you for reaching out with your concern.
Sources:
How to Fix "Duplicate Without User-Selected Canonical" Status in Google Search Console » Rank Math
How to Fix "Duplicate, Google Chose Different Canonical than User" » Rank Math
Hello,
I have updated the sensitive data as requested. Can you please check further?
Thank you.
Hello,
Thanks for contacting us, and sorry for any inconvenience this may have caused.
Feed URLs are created for RSS feed crawlers and readers, not for humans; they are just bare XML versions of your actual content pages.
They are not meant to be indexed, and you likely wouldn't want them indexed anyway: Google doesn't like feed URLs on its SERPs and would most likely not show them to users.
In this case, please add this rule to your robots.txt to disallow them from being crawled:

User-agent: *
Disallow: */feed/
Here's how you can edit the robots.txt file using Rank Math:
https://rankmath.com/kb/add-sitemaps-to-robots-txt/#num-2-2-navigate-to-edit-robots-txt
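If you want to sanity-check that the rule actually matches your feed URLs before the next crawl, here is a small sketch that mimics Google-style wildcard matching for robots.txt Disallow patterns; the sample URLs are hypothetical placeholders:

// Convert a Google-style robots.txt pattern (with * wildcards) into a
// regex and test a few sample URLs against the Disallow rule above.
$pattern = '*/feed/';
$regex   = '#' . str_replace( '\*', '.*', preg_quote( $pattern, '#' ) ) . '#';

foreach ( array(
    'https://example.com/my-post/feed/', // should be blocked
    'https://example.com/my-post/',      // should be allowed
) as $url ) {
    $blocked = (bool) preg_match( $regex, $url );
    echo $url . ' => ' . ( $blocked ? 'blocked' : 'allowed' ) . PHP_EOL;
}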
This should save your crawl budget and after the next few crawls, the warnings should be removed from GSC.
Hope that helps, and please do not hesitate to let us know if you need our assistance with anything else.
Thanks.
Thanks for your answer. God bless you and the whole Rank Math team.
You have a big like from Iran!
Hello,
Glad that helped.
Please feel free to reach out to us again in case you need any other assistance.
We are here to help.
Thank you.