I checked for updates in Google Search Console after the changes I've made: the new robots.txt file plus the new sitemap from turning on the settings in your plugin. It seems Google is taking its time, because nothing much has changed on my end. Is that because they are re-checking and re-validating before they can start ticking things off? I've now unblocked/allowed them.
I had a similar situation recently while managing SEO for estateagentsilford. Even though I updated index settings in the CMS, Google wasn’t indexing certain pages due to robots.txt still blocking them. You can double-check using the URL Inspection Tool in Google Search Console — it shows if indexing is blocked by robots.txt.
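If you'd rather check outside the GSC interface as well, Python's built-in urllib.robotparser gives a rough version of the same test. It doesn't mirror Googlebot's parser exactly, and example.com plus the paths below are just placeholders for your own site:

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain, use yours
parser.read()  # fetch and parse the live robots.txt

for url in ("https://example.com/category/news/", "https://example.com/tag/seo/"):
    ok = parser.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if ok else "blocked by robots.txt")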
To fix it, just edit the robots.txt file and remove or adjust lines like Disallow: /category/ or Disallow: /tag/ if those are present. Then revalidate the pages in Search Console.
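For example, a WordPress robots.txt that blocks those archives often looks something like the first block below; trimming it back to the usual WordPress defaults unblocks the category and tag pages. The exact lines here are only an illustration (keep whatever other rules your setup needs, and point the Sitemap line at your real sitemap URL):

# Before (archives blocked):
User-agent: *
Disallow: /wp-admin/
Disallow: /category/
Disallow: /tag/

# After (archives crawlable again; example.com is a placeholder):
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap_index.xml

Once the Disallow lines for /category/ and /tag/ are gone, those URLs are crawlable by default, and you can request indexing or resubmit the sitemap in Search Console.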
I've checked the last-crawled URLs on my end under the "crawled but not indexed" and "excluded by noindex tag" issues, and those crawls all happened before I updated robots.txt (allowing for the few hours' difference in my time zone), and before the sitemap settings were turned on and the rules changed from Disallow to Allow.
With that said, would I have to manually re-validate the pages, like clicking Request Indexing, from my end?
Since I haven't heard anything back, I'm actually good on my end for the time being with this particular issue.
I've seen this issue happen when robots.txt still had "Disallow" rules for categories or tags even after enabling them in Rank Math. What usually works is updating the robots.txt to allow those sections, then resubmitting the sitemap in Google Search Console; indexing should pick up after that. For anyone working with Android apps or editing-related tools, https://pixellabzone.com is one of the sites I often check, and similar indexing fixes applied there too.
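If you want to sanity-check the fix before resubmitting, a small script can pull a sitemap file and flag anything robots.txt still blocks. This is only a rough sketch: example.com and the post-sitemap.xml filename are placeholders (Rank Math usually splits its sitemap index into files like post-sitemap.xml, but check your own names), and urllib.robotparser is not a perfect mirror of Googlebot's rule handling:

import urllib.request
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # placeholder, use your own domain

# Load the live robots.txt once.
robots = RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()

# Pull one child sitemap and read its <loc> entries (sitemaps.org namespace).
with urllib.request.urlopen(SITE + "/post-sitemap.xml") as resp:
    tree = ET.parse(resp)

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
for loc in tree.findall(".//sm:loc", ns):
    url = loc.text.strip()
    verdict = "ok" if robots.can_fetch("Googlebot", url) else "still blocked"
    print(verdict, url)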