Prevent Google from indexing my sites with filter parameters

  • Resolved Kevin

    Hello guys,

how can I prevent Google from indexing my sites with filter parameters?


I don’t want Google to index anything that comes after the “?”.

Viewing 7 replies - 1 through 7 (of 7 total)
  • Hello,

    Thank you for contacting Rank Math today.

    Please check your permalink settings in WordPress Dashboard > Settings > Permalinks.

Another workaround would be to enable breadcrumbs, in which case the SERPs will show breadcrumbs instead of the URL. However, this will take some time until Google fully recrawls the page and updates its index.

    Looking forward to helping you. Thank you.



I am also interested in how to noindex filtered pages, but I did not understand your reply at all.

Is there a setting in Rank Math which we could tick, just as there is for Search Results and Paginated pages?

    Thanks in advance for your reply!



I think Michael wanted to completely deactivate the filter with the ?-parameter, correct? But I don’t think that would make the situation better; it would still generate thousands of pages with all the possible filter combinations. I searched the web and it seems a lot of people have this problem, but there is no satisfying solution. I found two options:

One is to disallow “/?*” in robots.txt, which means crawlers shouldn’t fetch those links; the problem is that they then also won’t follow internal links that use those parameters.

The second option is adding those parameters manually in every search engine’s webmaster tools, like Google’s URL Parameters tool and Bing’s URL parameter tool. The problem here is that both seem to be deactivated at the moment, and Google says the tool should be replaced by something else in the future, but didn’t say when that is coming.

So I don’t know what to do now; so far the robots.txt solution seems to me the only temporary fix.


    Hi Kevin,

Thanks for your reply. I am also coming to the conclusion that robots.txt is the best solution for the time being. However, as far as I know the directive should be:

    Disallow: /*?

Please correct me if I am wrong, and explain the difference between
Disallow: /*? and Disallow: /?*

Thanks in advance!



Disallow: /*? will disallow any URL that starts with /, has anything after that, and then a ?. Since robots.txt rules are prefix matches, everything after the ? is covered as well.

Disallow: /?* will disallow any URL that starts with /? and has any content after that, which only covers query strings directly on the root.

So for your issue you would want Disallow: /*?*: it disallows any URL that starts with /, has a ? somewhere, and some parameters after it. In practice it behaves the same as Disallow: /*?, since the trailing * is implicit.

Looking forward to helping you.
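To make the difference between these patterns concrete, here is a small Python sketch. It is not how any crawler is actually implemented; it just reimplements the wildcard semantics Google documents for robots.txt (* matches any run of characters, rules match URL prefixes, $ anchors the end), with made-up example URLs. Note that Python’s built-in urllib.robotparser follows the original robots.txt draft and does not understand these wildcards, which is why the matching is written out by hand:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    """Turn a robots.txt Disallow path into a regex using Google-style
    wildcard semantics: '*' matches any run of characters, '$' anchors
    the end of the URL, and everything else is a literal prefix match."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.compile("^" + pattern)

def is_blocked(rule: str, path: str) -> bool:
    """True if `path` (path plus query string) matches the Disallow rule."""
    return rule_to_regex(rule).match(path) is not None

# Hypothetical URLs: a filtered shop page and a homepage search.
deep = "/shop/shirts?color=red"
home = "/?s=shirts"

print(is_blocked("/*?", deep))   # True:  '?' may appear anywhere after '/'
print(is_blocked("/?*", deep))   # False: only paths beginning with '/?'
print(is_blocked("/?*", home))   # True
print(is_blocked("/*?*", deep))  # True:  same effect as '/*?'
```

The False on the second line is the key point: /?* only catches query strings on the root URL, while /*? (or /*?*) catches them anywhere on the site.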

    Thanks Alberto!

I’ll add Disallow: /*?* then. Just for my understanding: since all my parameter links start with /?..., shouldn’t Disallow: /?* be OK?

    Best regards


No, because / on its own means the root directory, so it would only disallow parameters on the home page.

    You need the * between / and ? to get any URL in the site.

Looking forward to helping you.
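Put together, the rule from this thread would sit in the site’s robots.txt (served at the domain root) roughly as follows; this is a minimal sketch with an example domain, not output generated by Rank Math:

```
# Served at https://example.com/robots.txt (example domain)
User-agent: *
# Block crawling of any URL that contains a query string
Disallow: /*?*
```

One caveat worth knowing: Disallow only stops compliant crawlers from fetching these URLs. Pages that are already indexed, or that are linked from elsewhere, can still appear in search results without a snippet.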


    Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Prevent Google from indexing my sites with filter parameters’ is closed to new replies.