Crawler issues

#177643
  • Resolved Stuart
    Rank Math free

    Hi,

    I've been reading about crawler/canonical tag issues in your support forums, but I'm still having problems.

    Google seems happy.

    Bing says – Index Issues – Large number of pages pointing to the same canonical URL. Canonical tags can be used to help de-dupe URLs for pages with the same or very similar content. Bing has detected that your site is using what appears to be the same canonical URL for a large number of pages that are not the same. This may be indicative of a problem with your HTML template, content publishing system, or site code. You should review the source of your pages and check if the URLs used inside the <link rel="canonical" href="<url>" /> are not accidentally pointing to the same location for all (or too many) pages.
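
    In practice, Bing's warning means many different filtered URLs all declare the same canonical. A quick way to see this in bulk is to extract each page's canonical and group by it; the sketch below uses only the Python standard library, and the URLs and HTML in the usage example are made up:

```python
from html.parser import HTMLParser

class CanonicalExtractor(HTMLParser):
    """Pull the href out of <link rel="canonical" href="..."> in one page."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # HTMLParser also routes self-closing <link ... /> tags here.
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

def extract_canonical(html):
    """Return the canonical URL declared by an HTML document, or None."""
    parser = CanonicalExtractor()
    parser.feed(html)
    return parser.canonical

def group_by_canonical(pages):
    """Map canonical URL -> list of page URLs that claim it.

    `pages` is a dict of {page_url: page_html}.
    """
    groups = {}
    for url, html in pages.items():
        groups.setdefault(extract_canonical(html), []).append(url)
    return groups
```

    Any canonical URL that collects a long list of distinct pages in `group_by_canonical` is exactly the pattern Bing is flagging.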

    Facebook – OMG – CRASHES MY SERVER

    I have an Excel spreadsheet from ContentKing, which scanned my website, and I can see what's going on; I just can't fix it.
    I emailed this to s******@r*********** (two months ago, though) with this ticket number in the title – look at line 248 onwards.

    The issue is with my job categories. I've recently added a manual redirect, but the hit count is in excess of 27,000.

    I'm stuck, my developer is stuck, and so is a digital marketer. Please help.

    I've spoken to the theme developer, and they say they are ready to help if needed. I've also spoken to the plugin people at WP Job Manager, and they suggested installing another plugin to trap crawlers (a "black hole"), but that won't work, as Facebook is whitelisted.

    All job categories are marked as noindex, nofollow.

    I am having to block Facebook regularly, as the server would otherwise crash constantly from error-log messages filling up the memory.

    I have today emailed support@ with a sample of error-log messages. The email title is “Facebook sample traffic”.

    In the last support ticket, Reinelle said: “I would just like to summarize things. Bing is complaining about too many pages with the same canonical (the pages with filters), so in order to fix that, you want those pages to be non-indexable (noindex) so Bing doesn’t see them. Could you please confirm if this is correct?

    Answer – YES

    If so, I’ve checked your CSV file, and as you mentioned, the URLs with filters are set to noindex. However, upon checking the URLs you have followed up on (from Bing’s rescan), it seems they are not included in your CSV, and their meta tags are still set to index (per the pages’ source code). We wanted to check their robots meta individually, but the credentials you provided do not seem to be working. Could you please check them as well?”
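
    That per-URL robots-meta check can also be done directly from each page's source, without dashboard credentials. A sketch, using only the Python standard library and made-up sample HTML:

```python
import re

def robots_meta(html):
    """Return the content of <meta name="robots" ...>, or None if absent.

    Assumes the usual attribute order (name before content), which is how
    SEO plugins normally emit the tag.
    """
    match = re.search(
        r'<meta[^>]*name=["\']robots["\'][^>]*content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None

def is_indexable(html):
    """A page is indexable unless its robots meta contains 'noindex'."""
    content = robots_meta(html)
    return content is None or "noindex" not in content.lower()
```

    Running `is_indexable` over the filtered URLs from the CSV would show at a glance which ones still carry an index directive.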

    Credentials supplied.

    In a previous support ticket, Michael suggested: “To fix the issue with index/noindex, you can consult with the job plugin you are using to see if there is a way to get all pages with filters, then use the following hook to set all filtered pages to noindex:”

    Not possible
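
    (The hook itself was not quoted in the ticket. For reference, a sketch using Rank Math's documented `rank_math/frontend/robots` filter might look like the following; the "any query string means a filtered view" condition is an assumption, and the real test would depend on how WP Job Manager builds its filter URLs.)

```php
// Sketch only, not a confirmed fix: force noindex on filtered views.
// Uses Rank Math's documented `rank_math/frontend/robots` filter; the
// query-string condition is an assumption about the job plugin's URLs.
add_filter( 'rank_math/frontend/robots', function ( $robots ) {
	if ( ! empty( $_GET ) ) {
		$robots['index']  = 'noindex';
		$robots['follow'] = 'nofollow';
	}
	return $robots;
} );
```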

    What am I doing wrong?

    Ideas welcomed. Thank you.

Viewing 1 replies (of 1 total)
  • Hello,

    Thank you for contacting Rank Math and bringing your concern to our attention. I’m sorry for the delay and for any inconvenience this issue may have caused you.

    I’ve checked your robots.txt, and it seems that Bing couldn’t crawl the filtered URLs you shared from your previous ticket. You can also use this tool to check:
    https://technicalseo.com/tools/robots-txt/
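
    Besides the online tool, the same check can be scripted. A minimal sketch with Python's standard-library robotparser; the robots.txt rule and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- substitute your site's actual file.
robots_txt = """\
User-agent: *
Disallow: /jobs/?filter=
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Bingbot falls back to the `*` group here, so the filtered URL is blocked
# while the plain category page stays crawlable.
print(parser.can_fetch("bingbot", "https://example.com/jobs/?filter=remote"))  # False
print(parser.can_fetch("bingbot", "https://example.com/jobs/"))                # True
```

    If a filtered URL comes back blocked here, Bing cannot recrawl it to see the updated noindex tag.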

    Could you please confirm if you have already rescanned your website on Bing?

    If the issue still persists, please also add your FTP logins in the sensitive data section so we can check further.

    I hope that helps. Thank you, and looking forward to your update.

    Hello,

    Since we did not hear back from you for 15 days, we assume that you found a solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Crawler issues’ is closed to new replies.