Pages “404 not found” GSC

#608173
  • Resolved Fdl
    Rank Math free

    On GSC, I have many pages marked “404 not found”, and all of them are from the default theme:

    “https://en-velo-simone.fr/comparison-table-demo-post/feed/”

    “https://en-velo-simone.fr/inner-ideas-for-your-verandas/feed/”

    “https://en-velo-simone.fr/content-egg-post-with-multiple-offers/feed/”

    “https://en-velo-simone.fr/category/laptops/”

    “https://en-velo-simone.fr/holiday-home-design/”

    “https://en-velo-simone.fr/inner-ideas-for-your-verandas/”

    “https://en-velo-simone.fr/custom-gutenberg-loop-blocks/”

    “https://en-velo-simone.fr/contemporary-home-project/”

    “https://en-velo-simone.fr/your-new-office-is-here/”

    “https://en-velo-simone.fr/category/non-classe/”

    “https://en-velo-simone.fr/non-classe/”

    “https://en-velo-simone.fr/mon-compte/”

    “https://en-velo-simone.fr/commander/”

    “https://en-velo-simone.fr/panier/”

    “https://en-velo-simone.fr/?s={search_term_string}”

    “https://en-velo-simone.fr/author/fdl-pro-hotmail-com/”

    “https://en-velo-simone.fr/mon-compte/lost-password/”

    I can’t delete them in the WordPress admin because these pages no longer exist.

    My questions:

    Is there a way to delete them with Rank Math (robots.txt…?)

    Should I do nothing and let Google deindex them?

    Or should I ask Google to remove them via the Temporary Removals tab in GSC?

    Thanks for your help

  • Hello,

    Thank you for contacting Rank Math and bringing your concern to our attention. I’m sorry for any inconvenience this issue may have caused you.

    Since they are already deleted from your site, leaving them in a 404 status should be fine. Google will understand that and eventually deindex/remove them from the search results (if they were indexed before).
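
    As a quick check (an optional sketch, using one URL from your list purely as an example), you can confirm from the command line that such a URL really returns a 404 status code:

        # print only the HTTP status code for the requested URL
        curl -s -o /dev/null -w "%{http_code}\n" "https://en-velo-simone.fr/holiday-home-design/"
        # a properly deleted page should print 404 (or 410)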

    In this case, to figure out how Google is discovering those URLs, use the URL Inspection tool in your GSC account to inspect each URL and check the referring page, i.e. where they are coming from.

    This will also let you remove the links pointing to those URLs so they stop being crawled.

    Hope that helps.

    Thank you.

    Fdl
    Rank Math free

    Hello,
    Thanks a lot for your answer.
    1) I have checked the URL Inspection tool, but I can’t see any data about the referring page or where they are coming from.
    “This will also let you remove the links pointing to those URLs so they stop being crawled.”
    I don’t see where I could remove these URLs from there…

    2) What do you think about removing these URLs with the “Clear Cached URL” option in GSC?

    Thanks a lot

    Fdl
    Rank Math free

    3) And what do you think about adding a no-index for these URLs in robots.txt?

    Fdl
    Rank Math free

    Or adding a no-index for these URLs in .htaccess?

    Alhan
    Rank Math free

    Hello,

    Thank you for your follow-up questions. Let’s clarify each of your points:

    1. If you are not seeing any data about the referring page in the URL Inspection tool in Google Search Console, it’s possible that these URLs were previously crawled and indexed but no longer have any active links pointing to them.

    Unfortunately, the URL Inspection tool does not offer any functionality to remove the URLs directly.

    2. As for using the “Clear Cached URL” option in Google Search Console, it’s important to note that this only removes the cached version of the page and not the URL from the index.

    It could be helpful if you’ve changed a page and want Google to recrawl and show the newer version more quickly. However, in your case, where the pages are already deleted, this option wouldn’t be necessary or beneficial.

    3. Regarding your question about adding these URLs to the robots.txt file or .htaccess for no-index, neither method is recommended.

    Here’s why:

    A. Robots.txt:

    Adding URLs to the robots.txt file doesn’t remove them from the Google index; it merely instructs search engines not to crawl those URLs. In fact, blocking crawling can even delay deindexing, because Google can no longer recrawl those pages to see that they return a 404.
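
    For illustration only, a Disallow rule for a couple of these paths would look like the sketch below (the paths are taken from your list purely as examples); it blocks crawling but does nothing to the index:

        # robots.txt: blocks crawling of these paths only, does NOT deindex them
        User-agent: *
        Disallow: /holiday-home-design/
        Disallow: /category/laptops/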

    B. .htaccess:

    While it’s technically possible to use the X-Robots-Tag in .htaccess to apply the noindex directive, it’s not a good solution.
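
    For completeness, such a rule would look roughly like the sketch below. It assumes an Apache server with mod_headers enabled, and the path is one of yours used purely as an example; we don’t suggest maintaining rules like this for pages that are already gone:

        # .htaccess sketch: send a noindex header for one deleted path
        <IfModule mod_headers.c>
            <If "%{REQUEST_URI} =~ m#^/inner-ideas-for-your-verandas/#">
                Header set X-Robots-Tag "noindex"
            </If>
        </IfModule>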

    Since these pages are already deleted and returning 404s, they should eventually drop out of Google’s index naturally.

    However, if you’re in a hurry, you can use the Removals tool in Google Search Console to request temporary removal of these URLs.

    Remember that this only hides the URLs temporarily (roughly six months) and should not be used as a long-term strategy for controlling your indexed pages.

    We hope that helps, and please don’t hesitate to get in touch if you have any other questions.

    Thank you.

    Hello,

    Since we did not hear back from you for 15 days, we are assuming that you found the solution. We are closing this support ticket.

    If you still need assistance or any other help, please feel free to open a new support ticket, and we will be more than happy to assist.

    Thank you.


The ticket ‘Pages “404 not found” GSC’ is closed to new replies.