Google Search Console does not find all sitemaps in the sitemap index. How to fix?

I have created a valid sitemap index (/sitemap-dec2019/sitemap_index.xml) for a site.
Each child sitemap is a plain-text list of 49,999 URLs, and each file is under 10 MB.
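For reference, a sitemap index of that shape would look roughly like the following (the host name is a placeholder; the child file names follow the paths listed further down):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-dec2019/sitemap1.txt</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-dec2019/sitemap2.txt</loc>
  </sitemap>
  <!-- ... one <sitemap> entry per child, up to sitemap106.txt ... -->
</sitemapindex>
```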

I have uploaded the sitemap index and all the child sitemaps to the server, and I double-checked that the index and every child are accessible there. They are.

However, when I submit the sitemap index to Google via Google Search Console, I get a success message saying "Sitemap index processed successfully" and no error is reported anywhere, yet it lists only 9 of the 106 child sitemaps (and their names all start with 'sitemap1'):

Sitemap                         Status  Discovered URLs
/sitemap-dec2019/sitemap1.txt   Success 49,999
/sitemap-dec2019/sitemap10.txt  Success 49,999
/sitemap-dec2019/sitemap100.txt Success 49,999
/sitemap-dec2019/sitemap101.txt Success 49,999
/sitemap-dec2019/sitemap102.txt Success 49,999
/sitemap-dec2019/sitemap103.txt Success 49,999
/sitemap-dec2019/sitemap104.txt Success 49,999
/sitemap-dec2019/sitemap105.txt Success 49,999
/sitemap-dec2019/sitemap106.txt Success 49,999
1-9 of 9

There is nothing in the sitemap index that distinguishes these 9 child sitemaps from the others, so there is no obvious reason for Google to select only them.

The total number of discovered URLs is shown as 449,991, which is exactly 49,999 × 9.

When I delete the sitemap index and resubmit it, the same 9 child sitemaps are listed and none of the others appear anywhere.

Since there are 106 child sitemaps, over 90% of them are being completely ignored.

How can I fix this so that all 100% of the Sitemap children (and their respective URLs) are discovered?

Google Sitemaps FAQ


I have put together this FAQ on sitemaps to help cut down on the number of questions about the same subject. I hope it answers the questions you have about Google Sitemaps. You will find that this FAQ answers most sitemap questions; if it doesn't, you can always post your question in the Google Sitemaps sub-forum on DigitalPoint.

Q: What is a site map?

A: A sitemap is an XML file that lists the URLs available on a website, along with additional information about each URL. You should always create a sitemap for your website; it helps search engines crawl your site faster and surface more of your content. The current standard sitemap format is XML, and all major search engines support it.
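A minimal sitemap in the standard format looks like this (example.com is a placeholder domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-12-01</lastmod>
  </url>
</urlset>
```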

--------------------------------------------------

Q: How do I create an XML sitemap?

A: This is a task you will not want to do by hand in Notepad; it can be done, but it would take a long time. There are many free websites and programs that can create an XML sitemap for you. I do not recommend purchasing software to create XML sitemaps, since many free programs do the exact same thing. Below are some links to free XML sitemap generators.
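If you would rather script it yourself, here is a minimal sketch using only Python's standard library. The URLs are placeholders; substitute your site's real pages (and add <lastmod> etc. as needed).

```python
# Minimal sketch: build a basic XML sitemap with Python's standard library.
# The URLs below are placeholders for illustration only.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # <urlset> is the root element required by the sitemap protocol.
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        # Each page gets one <url> entry containing its <loc>.
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap)
```

Write the resulting string to sitemap.xml in your web root and it is ready to submit.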

--------------------------------------------------

Q: How do I send my XML sitemap to Google?

A: It's a very simple thing to do; I also recommend placing a direct link to your sitemap on your home page. You can use Google Webmaster Central to submit your sitemap to Google. There are also many other great features and tools in Google Webmaster Central; you will find it very useful. Below is a link to Google Webmaster Central.

--------------------------------------------------

Q: What is the robots.txt file?

A: The robots.txt file is part of the Robots Exclusion Protocol. You can use it to allow or disallow search engines from crawling and indexing certain areas of your website, which is ideal for reducing the bandwidth used by robots. You can also point to your XML sitemap file in it; this is useful because it helps search engines find your sitemap.

--------------------------------------------------

Q: How do I create a robots.txt file?

A: It's very simple to do. You can open Notepad or any text editor you like; the robots.txt file is plain text only. Below is a basic sample that points to your sitemap and allows crawling and indexing of all your pages. Just save this file and upload it to your top-level web directory.

User-agent: *
Sitemap: http://www.yoursite.com/sitemap.xml

--------------------------------------------------

Q: I submitted my sitemap, why haven't I been crawled?

A: Simply submitting a sitemap does not mean you will be crawled more quickly. New sites can take a while to be fully indexed in Google, or any other search engine for that matter. The best way to speed up indexing is to get quality backlinks to your site. Remember, the sitemap is just a map of your site for the search engines; it does not increase your crawl rate by itself, and if it does, it is not by much.

--------------------------------------------------

Site Map Resources

Below is a list of resources that can also help answer your questions. Remember, reading is knowledge, and the more knowledge you have, the better your site will do.

Does Google Search Console only display sitemaps that were submitted through the console?

I use the Rails sitemap_generator gem to ping Google with an updated sitemap, but the submission date does not change in Google Search Console.

The sitemap_generator gem uses the ping method (http://www.google.com/ping?sitemap=URL/of/file) described here:

seo – Problems with bigger sitemaps with 50K URLs

My sitemap contains 50,000 URLs (7.8 MB); each entry has a <loc> ("maquiagem" pages) and a <lastmod> such as 2019-10-03T17:12:01-03:00.
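Based on those fragments, each entry presumably looks something like this (the domain and path here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/maquiagem/some-page</loc>
    <lastmod>2019-10-03T17:12:01-03:00</lastmod>
  </url>
  <!-- ... up to 50,000 <url> entries ... -->
</urlset>
```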

The problems are:

• The Search Console says "Sitemap could not be read".

• Loading the sitemap in the browser takes 1 hour, and Chrome stops responding.


• In Firefox, the sitemap downloaded in 1,483 ms and fully rendered after 5 minutes.

Things I did without success:

• Disabled GZip compression;

• Deleted my .htaccess file;

• Created a test sitemap with 1,000 URLs and the same syntax, and submitted it to Search Console. It worked, but the 50,000-URL sitemap still reports that it cannot be retrieved;


• I tried to inspect the URL directly, but it returned an error asking to try again later, while the 1,000-URL test sitemap worked;

• I tried validating the sitemap on five different sites (Yandex, etc.) and all passed with no errors or warnings.

Any ideas?

Why are sitemaps important for a business site on Google?

Some site owners do not give the necessary attention to creating a sitemap for their site, and days later the site has lost a lot of organic traffic. Why?

Sitemaps and DoS | Promotion Forum

The implementation of sitemaps makes me a little nervous without a very strict limitation.
It seems like this could be one of the most important things in terms of DoS on the face of the planet (well, not literally lol).

Ridiculously large files that can be requested by n number of bots from m IP addresses at any time. Even a badly configured bot could probably wreak havoc, huh.

Sitemaps – Sitemaps are generated automatically with /pub/media in the URL, causing 404 errors

Our sitemaps are generated automatically every day and split into 22 files (-1-22.xml).
The problem is that the links are generated with /pub in the URL, which leads to a 404.

When we regenerate the XML via Marketing > SEO & Search > Site Map, the sitemaps are generated correctly without the /pub.

Has anyone been confronted with this problem? I am not sure what other information I can post.

How to create xml sitemaps?


seo – Master sitemap containing child sitemaps on different domains

I have a main sitemap, index.xml on

This XML contains child sitemap entries pointing to my subdomain's URLs, as below:
  2019-02-04T11:15:02.000Z
  2019-02-04T11:15:02.000Z

And the current sitemap contains:
  2019-02-04T11:15:02.000Z


  1. Is there a problem with doing the above?
  2. Should I add a property in Google Search Console to allow my property for the subdomain?
  3. Google notified me of the error "General HTTP error: 404 Not Found / HTTP Error: 404". Is that because #2 is not in place?

PS: I have verified that Googlebot has access to all the URLs.

Do I need to add separate sitemaps and properties in the Google Search Console for http:// and https://?

This question already has an answer here:

Do I need to add 4 separate properties for my site in the new Google Search Console?

Online video tutorials show people adding separate http:// and https:// properties, as well as www and non-www versions, but that was done with the old Search Console.

I have recently redone my website, which now has different page names and so on, and I want to submit a new sitemap, but I do not understand whether I should add 4 properties with sitemaps, or just 1 property and 1 sitemap.

Any help / advice would be wonderful.