Hello,
Thank you for contacting us.
1. No. Adding a Disallow rule for a post in robots.txt tells bots not to crawl the page, whereas excluding the post from the sitemaps only removes it from the sitemaps, i.e., Google can still crawl and index the post via other sources such as internal links.
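For example, a robots.txt rule along these lines (the /sample-post/ path is only an illustration) blocks compliant bots from crawling that URL, yet the URL can still end up indexed if Google finds links to it elsewhere:

User-agent: *
Disallow: /sample-post/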
If you do not want a post to be indexed, you can simply set it to No Index. Please refer to this guide for more details:
https://rankmath.com/kb/how-to-noindex-urls/
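For reference, setting a post to No Index outputs a robots meta tag in the page's <head> along these lines, which tells search engines not to include the page in their results:

<meta name="robots" content="noindex">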
2. Those IDs belong to posts/pages/CPTs that are excluded from the sitemaps. If posts with those IDs do not exist on your website, or if you do not wish to exclude them from the sitemaps, you can remove those IDs.
You can directly access posts with those IDs using this URL format:
your-domain.com/wp-admin/post.php?post=45&action=edit
Please make sure to replace “your-domain.com” with your actual domain and “45” with the post ID.
Hope that helps.
Thank you.
Hello,
Since we have not heard back from you in 15 days, we are assuming that you found a solution, and we are closing this support ticket.
If you still need assistance, please feel free to open a new support ticket, and we will be more than happy to help.
Thank you.