Hello,
Thank you for contacting us and bringing your concern to our attention.
1. When I search site:mywebsite.com on Google, I only see 7 links in the SERP (less than 30% of the indexed links). Since this is a new website, I’m wondering why not all indexed links are appearing. Could you please clarify this?
Please note that it is completely normal for newly published posts/pages to take time before they get crawled or indexed by Google. It depends on many factors; your posting frequency and your domain authority are just two of the many signals Google considers when indexing new URLs.
Google assigns a crawl budget to your website depending on these factors (especially these two) and that has a direct effect on how soon or how late your content can get indexed.
Here’s a link for more information:
https://rankmath.com/kb/google-index/
Also, the site: operator doesn't always give accurate results; it often returns only a sample of URLs. In that case, you can use the URL Inspection tool to check whether a URL is indexed or not.
Here’s what Google says about the site operator:
If a URL is indexed in Google, it can show up in search results for site: queries that are related to the URL, however it’s not guaranteed. If a URL doesn’t show in a site: query, use the URL Inspection tool to make sure the URL can be indexed and to submit the URL to indexing.
https://developers.google.com/search/docs/monitor-debug/search-operators/all-search-site
That’s why the best way to check any URL is to use the URL Inspection tool in your GSC account.
2.1. Are sitemaps and robots.txt files typically created manually, or are they generated automatically? I noticed that WordPress creates them automatically. Currently, I have separate sitemaps for categories, posts, and normal links. Is this setup correct?
Rank Math generates the robots.txt and sitemap files automatically for your website and they are generated on the fly.
Also, having separate sub-sitemaps for categories, posts, and pages is normal and doesn’t affect your SEO.
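For illustration, a sitemap index generated by an SEO plugin typically looks something like this (the domain and exact sub-sitemap filenames below are placeholders; your actual URLs may differ depending on your configuration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each entry points to a sub-sitemap for one content type -->
  <sitemap>
    <loc>https://mywebsite.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://mywebsite.com/page-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://mywebsite.com/category-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

Google reads the index file first and then crawls each sub-sitemap it lists, so splitting your URLs by content type works exactly the same as one big sitemap.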
2.2. When I checked the robots.txt file in GSC, I found multiple versions created on different dates. Is this normal? I assume Google will only consider the most recent version.
This usually happens when you verify your website using a Domain property, which is why different versions (HTTPS, HTTP, www, and non-www) of your robots.txt can be reported.
If all the versions redirect to the correct version, then you don’t have to worry about it.
You can also check this thread on this topic: https://support.google.com/webmasters/thread/248370211/multiple-robots-txt-files-found-in-google-search-console-s-robots-txt-tester?hl=en
2.3. In GSC, there’s an option to request Google to crawl the website (in robots.txt section). Is it a good idea to use this feature, especially after making changes or adding new content? Or should we rely on Google to crawl automatically when it’s time?
The recrawl option in the robots.txt section is only used to recrawl your robots.txt file. You should use this option if you make any changes to your robots.txt.
You can also check the listed versions to see when Google last crawled your robots.txt. If, after making changes to the file, you see that the last crawl was a long time ago, you can manually request a recrawl.
Here’s a screenshot for your reference:

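For reference, an automatically generated WordPress robots.txt usually looks something like the sketch below (the exact rules depend on your configuration, and the domain is a placeholder; note that the Sitemap line is also one way Google can discover your sitemap):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://mywebsite.com/sitemap_index.xml
```

Whichever version of this file Google reports in the tester, this is the content that all the host/protocol variants should ultimately resolve to.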
3. In the “Removals” section of GSC, I noticed we can request pages to be de-indexed. Is this a reliable method to remove pages from the index, or are there any potential issues with this approach?
The best way to de-index a page from the SERPs is to set it to noindex. However, you can also use the Removals tool in the following circumstances:
– You have a URL on a Search Console property that you own, and you need to take it off Google Search quickly. You must take additional steps to remove the URL permanently. The URL to remove can be for a web page or an image.
– You’ve updated the page to remove sensitive content, and want Google to reflect the change in Search results.
You can also check the Google guidelines on the Removal Tool: https://support.google.com/webmasters/answer/9689846?hl=en
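For context, noindex is applied either with a robots meta tag in the page’s head or with an HTTP response header. A minimal example of the meta tag form (most SEO plugins, including Rank Math, add this for you when you toggle the setting):

```html
<!-- Placed inside the page's <head>; tells search engines not to index this page -->
<meta name="robots" content="noindex">
```

The equivalent HTTP header form is `X-Robots-Tag: noindex`, which is useful for non-HTML files such as PDFs.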
4. My Google Search Console doesn’t seem to detect any sitemap. Should I manually submit it, or will Google automatically detect it over time? If I don’t submit it, will Google still be able to find it?
Google doesn’t reliably detect your sitemap on its own. However, if you connect your Google account with Rank Math, it will automatically submit your sitemap to GSC.
Alternatively, you can manually submit your sitemap by following the steps mentioned here: https://rankmath.com/kb/submit-sitemap-to-google/
Hope that helps and please don’t hesitate to let us know if you have any other questions.
Thank you.