Hi there,
I noticed that RankMath identified 40 pages that have been discovered but not indexed.
I believe this was caused by a Disallow rule in the robots.txt file, so I opened the file in the site root and deleted that rule.
However, the rule still appears to show up, even after it has been deleted.
It shows up like this:

User-Agent: *
Disallow:

Does it take time for the change in the root folder to be applied?
My goal is to get all of our pages properly indexed, but I am not sure what the next step should be.
Any assistance you can provide would be great!
Thanks so much!
– Aaron