Robots files query

#784600
  • Could you please guide me on how to block all URLs except some specific ones?

    For example:

    I want to block all posts on my website, not pages.

    https://abc.com/10-off-orders-over-79/
    https://abc.com/oyo-rooms-hdfc-offer/

    But I want Google to crawl only these specific URLs:

    https://abc.com/store/

    Could you please guide me on how to write a robots.txt file for this?

    Is this robots.txt correct?

    User-agent: *
    Disallow: */

    Allow: */store/
    Allow: /contact-us
    Allow: /privacy-policy
    Allow: /terms-of-use
    Allow: /about-us
    Allow: /sitemap

    Thanks
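
For reference, a minimal robots.txt that blocks everything except a handful of paths might be sketched like this (this assumes Google-style wildcard support, where `/$` matches exactly the homepage; the paths are taken from the question and are only illustrative):

```
User-agent: *
Disallow: /

Allow: /$
Allow: /store/
Allow: /contact-us
Allow: /privacy-policy
Allow: /terms-of-use
Allow: /about-us
Allow: /sitemap
```

Google picks the longest matching rule for a given path, so the longer `Allow` rules override `Disallow: /` for those specific paths while everything else stays blocked.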

Viewing 3 replies - 16 through 18 (of 18 total)
  • Hello,

    Can you please share the canonical URL of your homepage, along with the rules in your robots.txt file, so we can review why your homepage cannot be accessed by Google's bots?

    Looking forward to helping you.

    User-agent: *

    Disallow: /

    Allow: $
    Allow: /dealstore
    Allow: /contact-us
    Allow: /privacy-policy
    Allow: /terms-of-use
    Allow: /about-us
    Allow: /sitemap

    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://mycoupongod.com/sitemap_index.xml
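
As a side note, the rule evaluation itself can be sketched in a few lines of Python. This is an illustrative evaluator of Google-style longest-match semantics, not Google's actual parser (real parsers may normalize rules differently), but it shows why `Allow: $` would not match the homepage path `/`, while `Allow: /$` would:

```python
import re

def rule_matches(pattern, path):
    """Match a robots.txt pattern against a URL path, Google-style:
    anchored at the start, '*' matches any run of characters, and a
    trailing '$' anchors the end of the path."""
    regex = "^"
    for i, ch in enumerate(pattern):
        if ch == "*":
            regex += ".*"
        elif ch == "$" and i == len(pattern) - 1:
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

def is_allowed(rules, path):
    """The matching rule with the longest pattern wins; on a tie,
    Allow beats Disallow. If nothing matches, the path is allowed."""
    best_len, best_allow = -1, True
    for allow, pattern in rules:
        if rule_matches(pattern, path):
            n = len(pattern)
            if n > best_len or (n == best_len and allow):
                best_len, best_allow = n, allow
    return best_allow

# Rules as written above: Disallow: /, Allow: $, Allow: /dealstore
rules = [(False, "/"), (True, "$"), (True, "/dealstore")]
print(is_allowed(rules, "/"))        # False: '$' never matches '/', so 'Disallow: /' wins

# With 'Allow: /$' instead, the homepage path matches exactly
rules_fixed = [(False, "/"), (True, "/$"), (True, "/dealstore")]
print(is_allowed(rules_fixed, "/"))  # True: '/$' matches exactly '/'
```

Since every URL path begins with `/`, a bare `$` pattern can only match an empty path, which never occurs; that is why `/$` is the usual way to allow only the homepage.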

    ————

    The canonical URL is also without a slash. https://mycoupongod.com

    Hello,

    The canonical URL is without a slash, but the GSC screenshot showing the homepage as blocked tests the homepage with a trailing slash.

    You need to test the actual canonical URL in GSC, not the non-canonical variation.

    Don’t hesitate to get in touch if you have any other questions.

