Hello,
Thank you for contacting Rank Math, and sorry for any inconvenience this issue may have caused.
Could you please share the URL of your site so that we can investigate this further?
You can also follow this knowledge base article – https://rankmath.com/kb/google-index/ to check if everything is set correctly on the site.
Also, have you submitted your sitemap in your Google Search Console? If not, please follow this knowledge base article to submit your site – https://rankmath.com/kb/submit-sitemap-to-google/
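If your sitemap was generated by Rank Math, the URL you submit will usually look like the one below (this assumes the default sitemap filename; replace the domain with your own):
https://yourdomain.com/sitemap_index.xml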
If you have already submitted your sitemap, please check the Coverage report in your Google Search Console account to see if any errors are shown that might be causing the issue. This will help us identify why Google is refusing to index your site.

Please note that it is completely up to Google whether or not they index your website. So, if you haven’t willingly or accidentally set your page to noindex and Google still isn’t indexing it, that means either Google doesn’t want to index the page or hasn’t crawled it yet because it hasn’t allocated enough crawl budget to your website.
Please check this website for more details – https://www.searchenginejournal.com/definitive-list-reasons-google-isnt-indexing-site/118245/
Hope this helps.
Looking forward to helping you. Thank you.
Hello,
I have updated the sensitive data as requested. Can you please check further?
Thank you.
The URL of the website is vosmulticleaning.nl
Here are two screenshots of the sitemaps I submitted and a screenshot of the coverage report, which shows that there is no data to analyse.
https://ibb.co/ZVk4ZfK
https://ibb.co/82HmKtf
Hello,
I checked the website’s source and it is properly set to index. The sitemap also seems fine and has listed the posts on the website.
However, while checking the robots.txt of the website, I found that the Allow and Sitemap statements are in the same line. Here’s a screenshot: https://i.rankmath.com/EiqITZ
I tried to manually update the rules in robots.txt (from Rank Math > General Settings > Robots.txt), but the file still shows the statements on the same line when the user is logged out. This usually happens when the page is being heavily cached.
Please ensure that the robots.txt is not being cached. You can also get in touch with your web host regarding this.
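For reference, a correctly formatted robots.txt puts each directive on its own line. A minimal example (the exact rules on your site may differ; the sitemap filename below is the Rank Math default) would look like this:
User-agent: *
Allow: /
Sitemap: https://vosmulticleaning.nl/sitemap_index.xml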
That said, since the sitemaps are being read properly, please give Google some more time after fixing the above issue and see if the website gets indexed.
Hope this helps.
Thank you.
I think I changed it. I flushed the dynamic cache: https://ibb.co/YhkDjZw
Is this what needed to be fixed?
Hello,
The screenshot you shared with us seems correct. For the non-indexing issue, could you please share some screenshots from the Coverage section of your Search Console? Also, please take a look at the Security & Manual Actions section and see if there is any message there. If yes, kindly share a screenshot with us.
Looking forward to your update.
Thank you
https://ibb.co/pj0958r
Still no coverage, really. I also discovered that I had two properties listed in my Google Search Console, one via domain verification and one via HTTPS verification. The domain property shows some click data, the HTTPS one didn’t, so I deleted the HTTPS one to avoid duplicates?
I also checked the Security & Manual Actions section, but it showed no problems.
Could this wrong line in the robots.txt be the reason why the site didn’t get listed?
Hello,
I checked your page in Bing search and it is indexed there.

If a URL is indexed on Google, there is technically no reason for it not to be indexed on Bing as well, and vice versa, unless Google has a specific reason not to index the page, such as a DMCA complaint, copyrighted content, nulled content, copied content, etc.
With that said, we would like to address the minor issue with your robots.txt first. Could you please share your FTP login here so we can further correct the robots.txt configuration?
Looking forward to helping you on this one.
Hello,
I have updated the sensitive data as requested. Can you please check further?
Thank you.
Hello,
The robots.txt is now configured correctly, and the sitemap URL is placed below the Allow rule.
You would need to wait for Google to recrawl your website to see if this fixes your issue, although it will still depend on Google whether they wish to rank/index your page.
Looking forward to helping you on this one.
But could I ask Google to index some individual pages? Or do I need to wait for Google to crawl the website on its own?
Hello,
You can use the URL Inspection Tool in your Google Search Console account to request manual indexing for the URLs you want. However, how long Google takes to crawl a URL after you submit it is still up to Google.
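For example, you would open the URL Inspection Tool in Search Console, paste the full page URL (e.g. https://vosmulticleaning.nl/ or any other page on your site) into the inspection bar at the top, wait for the inspection to finish, and then click Request Indexing.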
Hope this helps. Let us know if you need any other assistance.
Thanks.
Hello,
Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.
If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.
Thank you.