Hello,
Thank you for contacting Rank Math for help with redirecting spam URLs on your website.
The question mark cannot be included in a blanket redirect rule because it is also used by internal WordPress redirects. Instead, you will need to create 410 redirects for each variable/word that appears in the query parameters. Regex and ‘contains’ redirects should work for this, and you can also use ‘ends with’ in the cases where the URLs end with “.html”.
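For example, a ‘contains’ rule for one of the spam terms might look like this in the Rank Math Redirections panel (the term ‘spamword’ is only a placeholder; use the actual words from your spam URLs):
Source URL: spamword (Match: Contains)
Redirection Type: 410 Content Deleted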
Hope that helps. Please let us know if you have questions.
Thanks, but ‘ends with’ for .html doesn’t work. The problem is the multiple subfolders in the URL: the slash is replaced by %2F. So, for example, this URL https://deinternetbelegger.nl/?epigynum/bequest1155727.html is redirected to https://deinternetbelegger.nl/?epigynum%2Fbequest1155727.html
As for WordPress internal URLs with question marks, is there a way to force the redirect and exclude the WordPress URLs? Perhaps outside Rank Math?
Kind regards
Hello,
Please note that the slash (/) character is reserved for delimiting substrings whose relationship is hierarchical. It is used to distinguish hierarchy levels, such as categories or file directories.
That’s why WordPress encodes it as %2F in the redirect.

Regarding your internal search result pages, we checked and they are currently set to index. In this case, you can follow the steps in this link to set them to noindex:
https://rankmath.com/kb/internal-site-search-spam/#num-2-fix-internal-site-search-spam-using-rank-math
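Once that setting is applied, the search result pages (e.g. /?s=example) should output a robots meta tag along these lines in their <head> (the exact attributes may differ on your install):
<meta name="robots" content="noindex, follow">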
Hope that helps.
Thank you.
Hi,
“Please note that the slash (/) character is reserved for delimiting substrings whose relationship is hierarchical. It is used to distinguish hierarchy levels, such as categories or file directories.”
> So this means I can’t redirect pages with a hierarchy?
“Regarding your internal search result pages, we checked and they are currently set to index. In this case, you can follow the steps in this link to set them to noindex:”
> I didn’t mention anything about internal search.
> Are you saying URLs that start with /? are automatically recognized by WordPress as search result pages?
I’m dealing with thousands of indexed spam URLs. Is it really true that I cannot do anything about them?
Hello,
So this means I can’t redirect pages with a hierarchy?
The forward slash in the query string is being percent-encoded by WordPress, which is why it appears as %2F in the redirected URL. However, if you want to capture those URLs in the redirect, you should use a regex rule instead of “ends with”.
Source: (.*)\.html$
Destination: https://deinternetbelegger.nl/
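Assuming the rule is matched against the full requested URL (query string included), that pattern would catch any request ending in .html, for example https://deinternetbelegger.nl/?epigynum%2Fbequest1155727.html, and send it to the destination above (or return a 410 if you choose that redirection type instead).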
Apologies for bringing up internal search; the URLs you’re sharing have query parameters only (/?), so they load your homepage and the parameters themselves are ignored.
I’m dealing with thousands of indexed spam URLs. Is it really true that I cannot do anything about them?
You can redirect them, or disallow those query parameters in your robots.txt to block Google’s bots from crawling those URLs (see the example after the link below).
Here’s a link for more information:
https://rankmath.com/kb/how-to-edit-robots-txt-with-rank-math/
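As a minimal robots.txt sketch (assuming you do not rely on any legitimate query-parameter URLs that need to stay crawlable), the following blocks crawling of every URL that contains a question mark:
User-agent: *
Disallow: /*?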
Looking forward to helping you.
Thank you.
Okay thanks.
Help me out here, because your regex doesn’t work:
https://deinternetbelegger.nl/?ounds-285224-sJJyrkNc/958259
https://deinternetbelegger.nl/?Miki/mediosilicic1634491.html
https://deinternetbelegger.nl/?TDjb/chummily596/2dFI931s
https://deinternetbelegger.nl/?basilical=7&rbKnwQ=1139457
https://deinternetbelegger.nl/?mallum=252&Wuw0k4r6=1972849
Above are 5 distinct URLs that I want to 410-redirect with regex.
How do I take this on?
– Starting with /? and ending without .html (slash in between, but also a – in between)
– Starting with /? and ending with .html (slash in between) >> your regex doesn’t work on this
– Starting with /? and ending without .html (slash in between and 3 hierarchy levels)
– Starting with /? and multiple = and & in between (last 2 URLs)
I’m reading up on this, but I need your help because I’m already at 2.5 million URLs in Search Console.
Hello,
I managed to create one redirect that addresses the URLs that contain ‘.html’:
Source: (.*)/?(.*)\.html$
Type: 410
Screenshot: (attached)
URLs that end with ‘.html’ do not occur in the normal functioning of WordPress, so they are safe to redirect. The remaining redirects are difficult to create because they involve patterns that could conflict with URLs your site needs in order to function normally.
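To illustrate (assuming the rule is matched against the full requested URL, query string included): of the five URLs you listed, only https://deinternetbelegger.nl/?Miki/mediosilicic1634491.html ends in .html and would return a 410, while, for example, https://deinternetbelegger.nl/?basilical=7&rbKnwQ=1139457 would not match and would keep loading normally.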
You can refer to the following article that explains the use of URL parameters (URLs that contain a ‘?’): https://www.semrush.com/blog/url-parameters/
Since those URLs no longer exist on your site, the best you can do is wait a few months; they will eventually stop being crawled once Google has not seen any reference to them for a while.
Hope that helps. Please let us know if you have questions.
Hello,
Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.
If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.
Thank you.