-
Thanks for the support. I do not see “✅ Done-for-you Rank Math PRO setup – Premium Care has you covered” in the account at this time, since I am unable to check the video sitemap option in Rank Math. I have also listed a few challenges below and would appreciate your help.
[screenshot attached]
2. I am using Elementor on WordPress and both seem to be syncing the SEO, although I am unable to save or update in the WordPress interface. The Rank Math SEO scores seem to fluctuate and do not record the site name in the keywords.
[screenshot attached] Here is another one to see the difference. This is from before the above: all keywords were green and I was on page 1 for the same, and a day later it is not reading the same.
[two screenshots attached]
Meanwhile, the WordPress dashboard says 96 and shows the focus keyword, but it is not recording it. Both screenshots are below.
[two screenshots attached]
Finally, there is a noindex and nofollow tag in the header that is blocking crawlers from reading the page and also marking me down on SEO. [screenshot attached]
I am still trying to figure out if Rank Math is the solution. Everyone ranking on page 1 is using Yoast, and I ranked on page 1 in 3rd place for “performx performance marketing”, but now I do not. Below is the snippet where the Site Name hasn’t updated, yet it came through and got rid of the dash.
[screenshot attached]
How can there be such high fluctuations? Please let me know what I am not doing. It also reports that some Social Cards are missing. I am happy to buy the paid edition of your product provided I know it works as well.
-
Hi Jeremy,
Thank you for the prompt reply,
From inside Elementor it shows 96, while Google PageSpeed has it at 92. When deactivating Elementor it is 100, and that is the only time PageSpeed Insights and the SEO score coincide.
How is Rank Math reading WordPress headers via the shortcodes, and similarly, what is missing? Is Elementor blocking you from reading within that container (a feature they sell as Elementor Pro), or is Rank Math unable to read it? Please answer this clearly, either Elementor is blocking it or Rank Math cannot read it, so I can understand.
You would just have to direct Rank Math’s H1 check to read the shortcode for Elementor’s custom HTML block.
Here are some of the other issues.
→ The page is not recording nofollow or follow. I used to add them together for links to avoid adding rel="nofollow".
This error keeps showing up: “We found 1 outbound links in your content and all of them are nofollow.”
It is also the same in Moz.
The sitemap is not recording hreflang, only en-us; it is not showing up in other places. Should I add it separately in the header?
site:performx.me does not show me indexed in the UK, UAE, etc. I just checked to see; this was working before because I was drawing some organic traffic from these locations.
Error message: “Multiple errors: not valid XML documents and/or no hreflang tags”
Hello,
Please note that Rank Math’s Content Analysis and Google PageSpeed Insights use very different evaluation methods as PageSpeed focuses on performance metrics, while Rank Math analyzes only the on-page content such as headings, keywords, links, and metadata.
Rank Math reads the content that’s available through WordPress’ content filters. When Elementor shortcodes or custom HTML widgets are used, the content inside them is not always parsed by WordPress until the Elementor editor renders it. That’s why you see different SEO scores inside Elementor versus the default WordPress editor.
In short, Elementor controls the rendering of those custom containers while Rank Math relies on WordPress’ content output, so it cannot “bypass” Elementor’s rendering layer. That’s why it’s always best to check and rely on the SEO score shown inside the Elementor editor.
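To illustrate why unrendered builder content is invisible to a text analyzer, here is a minimal Python sketch (the shortcode and post body are made up for illustration, not taken from the site in question): an analyzer that reads raw post content sees no text inside a shortcode that hasn’t been rendered yet.

```python
import re

def visible_text(raw_content: str) -> str:
    """Crude model of an analyzer reading raw post content:
    anything wrapped in an unrendered [shortcode] yields no readable text."""
    return re.sub(r"\[[^\]]*\]", "", raw_content)

# Hypothetical post body containing an unrendered Elementor-style shortcode:
raw = 'Intro paragraph. [elementor-template id="123"] Closing paragraph.'
print(visible_text(raw))
```

The builder’s own editor sees the rendered output, while an analyzer working on the stored content only sees what survives this kind of stripping, which is one plausible source of differing scores.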
As for the outbound links issue, please share the affected post/page URL(s) so we can check this directly. Sometimes, visual builders add extra wrappers or filters that modify the rel attributes before the content is analyzed, which can cause this type of detection issue.
As for the hreflang, Rank Math doesn’t add hreflang tags automatically, as this is handled by multilingual/translation plugins (like WPML, Polylang, or TranslatePress). Our sitemap only lists the primary language URLs and the alternate language links are expected to be output at the HTML level by your translation plugin.
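For reference, hreflang alternates live in the rendered HTML head as `<link rel="alternate" hreflang="...">` tags. Here is a small standard-library Python sketch (the example URLs are hypothetical) that extracts them, which you could use to verify what a translation plugin actually outputs:

```python
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    """Collects <link rel="alternate" hreflang="..."> tags from page HTML."""
    def __init__(self):
        super().__init__()
        self.alternates = {}

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "alternate" and "hreflang" in a:
            self.alternates[a["hreflang"]] = a.get("href")

# Hypothetical <head> markup of the kind a translation plugin would emit:
head = """
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
"""
collector = HreflangCollector()
collector.feed(head)
print(collector.alternates)
```

If only `en-us` shows up when you run this against your own page source, the alternates are simply not being emitted by any plugin.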
To confirm that your sitemap is valid, you can use Google Search Console or this validator tool: https://www.xml-sitemaps.com/validate-xml-sitemap.html
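If you prefer to check locally, a sitemap is just XML in the sitemaps.org namespace. This short standard-library sketch (sample URLs are hypothetical) raises a parse error on invalid XML and otherwise lists the URLs:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Parse a sitemap and return its listed URLs.
    Raises ET.ParseError if the document is not valid XML."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/sample-post/</loc></url>
</urlset>"""
print(sitemap_urls(sample))
```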
Last but not least, the trailing slash behavior is controlled by WordPress, not Rank Math. Please go to WordPress Dashboard → Settings → Permalinks, and make sure your permalink structure includes the trailing slash as shown here: https://i.rankmath.com/ZQJx8L
Once adjusted, both Rank Math and your sitemap should read the URLs consistently as https://performx.me/.
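For completeness, here is a rough Python approximation of the trailing-slash normalization applied to page URLs (this is an illustrative sketch, not WordPress code; file-like paths such as robots.txt are left untouched):

```python
from urllib.parse import urlsplit, urlunsplit

def with_trailing_slash(url: str) -> str:
    """Append a trailing slash to page-style paths; leave file-like
    paths (last segment contains a dot, e.g. robots.txt) alone."""
    parts = urlsplit(url)
    path = parts.path or "/"
    if not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]:
        path += "/"
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(with_trailing_slash("https://performx.me/sample-post"))
print(with_trailing_slash("https://performx.me/robots.txt"))
```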
Don’t hesitate to get in touch with us if you have any other questions.
Hi Jeremy,
I left it under observation and here are issues that are persistent.
1. Robots.txt is reading from different locations. I have added 3 snippets Ahref Site crawl, Google Search Console and Page html.
2. GSC is reading from 4 different variations; the last read was Nov 2.
3. Ahrefs misses it on some days, meaning it cannot find the robots.txt among the crawled URLs.
The last thing we discussed was changing the permalink to Custom; however, I see that the Post name structure
https://performx.me/sample-post/
also adds the trailing slash. I am on Rank Math 1.0.254.
Crawled but not indexed – 4 pages. These pages have not been indexed since July.
All snippets are in Final Fix (Nov 27)
I’ve actually dealt with almost the exact same situation, so I know how frustrating it gets.
In my case the fluctuations, sitemap issues, and blocked crawling were all connected to a few small settings that were conflicting with each other. Here’s what fixed it for me:
Multiple robots.txt
I discovered that my server + a security plugin were both generating robots.txt. I deleted the plugin-generated one and kept only the default WordPress/RankMath version. After that, Google started reading my sitemap again.
Elementor & RankMath not syncing
This happened because Elementor was caching older versions of the page. I cleared:
Elementor cache
RankMath cache
Browser cache
Host/server cache
After doing all four, the SEO scores updated properly and stopped fluctuating.
Noindex/Nofollow tag in header
Mine came from a theme setting. Even after disabling it, the tag stayed due to a caching layer. Clearing server cache + disabling the theme’s “maintenance mode” finally removed it.
Focus Keyword not recording
It turned out RankMath’s “Instant Indexing” plugin was interfering. Disabling it fixed the tracking and the green scores came back.
Once everything was cleaned and unified, the rankings stabilized again.
So yes — RankMath worked for me, but only after removing the conflicts.
Hi Sergio,
You are new to this thread, and I have been troubleshooting with Jeremy. I am saying this because he understands this from the start; caching appeared to be the original issue, but we are past that, I suppose.
I don’t see why the server would generate a robots.txt. I have added a snippet of all the active plugins; tell me which security plugin among them it could be.
The steps you followed, I already do for every change I make, even when working in incognito (which shouldn’t save browser cache).
The challenges arise both with and without Elementor.
It is indexing, but with an incorrect URL, so it isn’t the theme’s maintenance mode. I did not find a .maintenance file.
→ I have turned off Elementor and added snippets of the score in the folder Nov 28 (before and after).
→ It also contains site audits from yesterday and today so you can see the difference and get a better idea.
Now, the robots.txt file is placed in the root directory and instructs crawlers about indexing restrictions; the file is a set of rules that tells search bots what they can and cannot crawl, including the location of the sitemap itself.
Either Rank Math is reading my root wrong, or something else is going on. I checked through FTP for a maintenance file, but none exists. However, I noticed a robots.txt file twice in Rank Math and once more in LiteSpeed. Snippet added in the folder Nov 28.
I have also turned off Instant Indexing.
This has been ongoing since July; Jeremy’s last suggestion was to change the permalink to Custom. You may want to refer back or double-check.
Here is more that may help. Added in the sensitive area are two sites, A and B, and their relevant builds.
A – SEO 100
B – SEO 100, with a drop in performance; it is not using LiteSpeed. Snippet added in Folder Nov 28 (New Find).
I also investigated the robots.txt in A and B; it is consistent across both and does not seem to be the cause of conflict. One has LiteSpeed enabled and the other does not, as noted above, but in both cases litespeed/robots.txt is consistent.
Snippet added in Folder Nov 28 (New Find)
Hope that helps.
Hello,
Those robots.txt files you’re seeing in the plugin directories are simply core files used to generate and manage the robots.txt. They do not affect your site’s crawlability in any way as search engines only read the robots.txt file located in your domain’s root (e.g., example.com/robots.txt), so everything is functioning as expected.
The different robots.txt variations you see in the Google report should redirect to the final version, which is https://website.us/robots.txt.
Please note that the robots.txt file is only read when a bot crawls, so it is normal to see gaps in the days the robots.txt is detected.
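As a side note, you can sanity-check what a compliant bot would conclude from a robots.txt using Python’s standard-library parser. The rules below are a typical WordPress-style example, not the live file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content (illustrative, not the site's actual file):
robots_txt = """\
User-agent: *
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())
print(rp.can_fetch("*", "https://example.com/sample-post/"))  # regular page is crawlable
print(rp.can_fetch("*", "https://example.com/wp-admin/"))     # admin path is blocked
print(rp.site_maps())                                         # sitemap declared in the file
```

Only the file served at the domain root matters to bots; copies sitting inside plugin directories are never consulted.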
As for the permalink structure, the one you show in the screenshot is correct and a trailing slash is normal to see. Can you please explain your issue here in detail?
Please also note that the Rank Math plugin does not handle the trailing slash or permalink structure for you.
Lastly, as for the “Crawled – Currently not indexed” error, this error means that Google has crawled your page but has not indexed it yet. As we already know, Google does not index all the URLs we submit, and finding a certain number of URLs under this status is completely normal. You may refer to this guide: https://rankmath.com/kb/crawled-currently-not-indexed/
Hope that helps.
Hey Jeremy,
I think you are angry or upset; I haven’t heard back as fast as you generally reply, lol. I apologize if I came across as rude; this is not in an effort to pacify you, but it isn’t cool.
Also, I want to thank you for all the help. I’ll send you cake for Xmas, bro.
Just so you know, it worked well until the day before yesterday, and then yesterday the robots.txt was read from:
http://www.performx.me/robots.txt – 12/6/25, 1:28 PM – Fetched 115 bytes
Is there a way these robots.txt files can be deleted? I am keeping it under observation for a few crawls. I have updated more in Final Fix Dec 8.
Hello,
Absolutely no worries at all and we sincerely apologize for the delay. Your previous replies were pending moderation and did not immediately enter our support queue.
We’ve now approved them so that everything is back on track.
Back to the issue, Googlebot automatically checks for robots.txt across all protocol and subdomain variations of a site for example:
– http://example.com/robots.txt
– http://www.example.com/robots.txt
– https://example.com/robots.txt
– https://www.example.com/robots.txt
Because of this default behavior, Google may show those variations in its reports even if your site fully enforces HTTPS and a preferred domain version.
The important part is that all those variations redirect to your primary robots.txt (which, in your case, is https://performx.me/robots.txt).
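To make the variation list concrete, here is a tiny Python sketch that enumerates the four robots.txt URLs Googlebot may probe for a given domain:

```python
def robots_variations(domain: str) -> list[str]:
    """The four protocol/host variants Googlebot may probe for robots.txt."""
    host = domain.removeprefix("www.")  # normalize to the bare domain first
    return [f"{scheme}://{sub}{host}/robots.txt"
            for scheme in ("http", "https")
            for sub in ("", "www.")]

print(robots_variations("performx.me"))
```

Each of these should answer with a redirect to the one canonical robots.txt; that redirection is configured at the server/hosting level, not in the SEO plugin.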
To clarify, Rank Math generates the virtual robots.txt, but it does not control HTTPS / non-HTTPS or www / non-www redirection as those are handled by your server/hosting configuration.
So, the behavior you’re currently seeing in Google Search Console is expected and not something that should cause concern.
If you notice any change in indexing performance or crawl anomalies that don’t improve over time, please feel free to share the affected URLs and screenshots. We’ll be happy to take a closer look, but it’s important to note that in such cases, the robots.txt is typically not the root cause.
We appreciate your patience and the kind words.
Hello,
I have updated the sensitive data as requested. Can you please check further?
Thank you.
Thanks Jeremy, Appreciate it mate.
404 – Sitelinks
Some other issues, like broken links, etc.
Snippets are in Final Fix DEC 8.
Hello,
Thanks for the new screenshots you shared.
We will address each concern:
1. 3xx redirect removal.PNG – this is an HTTP to HTTPS redirection, which is perfectly normal. Can you please confirm which part you wish to remove here? Also, if the crawler keeps discovering the HTTP version, it is possible that this URL is currently referenced in your website content.
If you’re referring to the trailing slash, this is also normal to see.
2. The issues with the health scores point to 404 and broken pages. You will need to check the affected URLs and obtain their referencing pages so you can know where the crawlers have found them. If you believe these URLs have ranking juice, you can simply set a redirection for them as well.
3. Latest Crawl Logs.PNG – Seeing HTTP URLs redirecting to HTTPS (301) is completely normal and expected. This indicates that Ahrefs discovered some http:// URLs that correctly redirect to the secure https:// version. To find where these are coming from, you can check the referring pages in Ahrefs, as those HTTP URLs are likely still internally linked somewhere in your content, theme, or older templates.
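If you do find stray internal http:// links in your content, one way to sketch the cleanup is a simple rewrite like the one below (illustrative only; the page markup is hypothetical, and on a live WordPress site a database search-and-replace tool is the usual way to do this):

```python
import re

def upgrade_internal_links(html: str, host: str) -> str:
    """Rewrite internal http:// references to https:// so crawlers
    stop rediscovering the redirecting HTTP URLs."""
    pattern = rf"http://((?:www\.)?{re.escape(host)})"
    return re.sub(pattern, lambda m: "https://" + m.group(1), html)

# Hypothetical content fragment with a leftover HTTP internal link:
sample = '<a href="http://performx.me/about/">About</a>'
print(upgrade_internal_links(sample, "performx.me"))
```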
Regarding the 404 reported for the sitemap, since the sitemap is accessible when checked manually, this is most likely a temporary caching or crawl-related issue. We recommend clearing any site or server cache and checking the Sitemaps section in Google Search Console. If GSC shows the sitemap as submitted successfully with no errors or failures, then everything is working correctly. In that case, it’s also possible that Ahrefs’ crawlers are being blocked from accessing the sitemap due to server rules, security settings, or firewall restrictions.
4. Indexable page not in sitemap – The homepage is currently listed in your sitemap, as you can see from the screenshot below. This issue is reported only because the sitemap showed as 404 in your log. However, the sitemap is working and the path to the homepage already appears in it, so rest assured there is no indexing issue. We can confirm that on the SERP as well (see screenshot below).
5. As for the last screenshot, it seems that you’re trying to internally link the actual XML sitemap in the homepage content. We suggest not doing that as XML sitemaps are meant for search engines, not for your users.
If you wish to provide a list of URLs of your content for your users, you can use the HTML Sitemap feature instead: https://rankmath.com/kb/html-sitemap/
Don’t hesitate to get in touch with us if you have any other questions.