Google Search Console is crawling URLs that do not exist in my sitemap, and reporting errors for those pages.

You have a misconception of what a sitemap is.

A sitemap is used to audit how the search engine's bot crawls your site. The sitemap and the crawl are two different, independent things. Google will continue to crawl your site regardless of any sitemap. The sitemap is used to check whether Google is seeing your site correctly. For example, if a page is in your sitemap and Google has not seen it, Google can add that page to its fetch queue.

The opposite is not true: if a page is not in the sitemap, Google will not remove it from its index. Why? Because Google found it while crawling the site.

What you seem to believe is that the sitemap is the sole authority Google uses to know which pages exist on a site. It is not; the crawl is. The sitemap only lets Google check whether it is seeing your site properly and, if not, which pages Google is missing and needs to add to its fetch queue.

Your expectation that Google will stop trying to access pages once they no longer appear in your sitemap is incorrect. Sitemaps are cached and only checked periodically. Why? Because it is an audit process.

You have a real problem to solve.

You return a 500 error for pages that are not found. That is bad. Your website should return a 404 Not Found. A 500 is a system error, and Google treats the condition as temporary. If your site returned a 404, Google would retry the page several times over a period of time until it decides the page no longer exists. If possible, return a 410 Gone for pages you have deleted. If that is too much work or not possible, a 404 will have the same effect over time.

You must fix your 500 errors.
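As a sketch of the fix, the request handler should distinguish three cases: the page exists, the page was deliberately deleted, or the URL is simply unknown. The lookup tables below are hypothetical stand-ins for whatever your application actually uses:

```python
# Sketch: choose the right HTTP status for a requested path.
# KNOWN_PAGES and DELETED_PAGES are illustrative stand-ins for
# your application's own content lookup.
KNOWN_PAGES = {"/": "home", "/about": "about us"}
DELETED_PAGES = {"/old-product"}  # pages removed on purpose

def status_for(path: str) -> int:
    if path in KNOWN_PAGES:
        return 200          # page exists: serve it
    if path in DELETED_PAGES:
        return 410          # Gone: the removal is permanent
    return 404              # Not Found: unknown URL -- never 500

print(status_for("/about"))        # 200
print(status_for("/old-product"))  # 410
print(status_for("/nope"))         # 404
```

The key point is that a missing page is a normal condition to report with 4xx, not an exception to let bubble up as a 500.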

seo – How many links per page can we have in an HTML sitemap?

An XML sitemap has a limit of 50,000 URLs per file. Is there a limit on the number of links in an HTML sitemap?

If I have more than 100,000 pages or posts, should I set up pagination for the HTML sitemap?

PS: An XML sitemap is different from an HTML sitemap.
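On the XML side, the standard way to handle more than 50,000 URLs is to split them into several sitemap files and list those in a sitemap index. A minimal Python sketch (the example.com domain and filenames are made up):

```python
# Sketch: split a large URL list into 50,000-URL sitemap files plus a
# sitemap index, per the sitemaps.org protocol. Domain is hypothetical.
from xml.sax.saxutils import escape

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
LIMIT = 50_000  # per-file URL limit from the sitemaps.org protocol

def build_sitemaps(urls):
    """Return a list of (filename, xml) sitemap files and an index document."""
    files = []
    for i in range(0, len(urls), LIMIT):
        chunk = urls[i:i + LIMIT]
        body = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in chunk)
        xml = (f'<?xml version="1.0" encoding="UTF-8"?>'
               f'<urlset xmlns="{NS}">{body}</urlset>')
        files.append((f"sitemap-{i // LIMIT + 1}.xml", xml))
    index_body = "".join(
        f"<sitemap><loc>https://example.com/{name}</loc></sitemap>"
        for name, _ in files)
    index = (f'<?xml version="1.0" encoding="UTF-8"?>'
             f'<sitemapindex xmlns="{NS}">{index_body}</sitemapindex>')
    return files, index

files, index = build_sitemaps(
    [f"https://example.com/post/{n}" for n in range(100_001)])
print(len(files))  # 3 files: 50,000 + 50,000 + 1 URLs
```

The HTML sitemap has no protocol-imposed limit, but the same chunking approach is a reasonable way to paginate it for human visitors.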

seo – Rename the images or create an image sitemap? Or both?

The site is a classified-ads site, with products that have a description, a title and photos. Except for the images, all data is stored in a MySQL database.

Although the classifieds have user-friendly titles and it is relatively easy to link them, all images are saved under randomly generated names (something like d6as897d6ad67a5s7da8d56sa7d.jpg).

The question: for SEO, PageRank and the SERPs, what is the best thing to do, and why?

  • Rename the images from random characters to the classified titles;
  • Create an image sitemap;
  • Both.

What are the advantages and disadvantages of each? Why not choose the other options? Are there any alternatives I have not mentioned here?
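To sketch what the renaming option looks like in practice, a classified's title can be turned into a descriptive filename (a "slug") and paired with an image-sitemap entry. The URLs and helper names below are illustrative only:

```python
# Sketch: derive a descriptive image filename from a classified's title
# and emit an image-sitemap entry for it. URLs are hypothetical.
import re
from xml.sax.saxutils import escape

def slugify(title: str) -> str:
    """Lowercase the title and collapse non-alphanumerics into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

def image_entry(page_url: str, image_url: str) -> str:
    # Uses the image:image extension from the sitemap image namespace.
    return (f"<url><loc>{escape(page_url)}</loc>"
            f"<image:image><image:loc>{escape(image_url)}</image:loc>"
            f"</image:image></url>")

title = "Red Mountain Bike 26 inch"
print(slugify(title))  # red-mountain-bike-26-inch
print(image_entry("https://example.com/ads/red-mountain-bike",
                  f"https://example.com/img/{slugify(title)}.jpg"))
```

The two options are not exclusive: descriptive filenames help search engines understand each image, while an image sitemap helps them discover images they might otherwise miss, so doing both is a common choice.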

seo – How can I get my website's sitemap indexed?

I want to optimize my site's links through Search Console / Webmaster Tools.

I want certain menus to appear when my site is searched on Google, for example menu A, menu B and menu C.

From the references I found, to set this up through the Search Console, a sitemap.xml file has to be parsed first.

So I went there, entered my domain and clicked the start button. Once the process was complete, I downloaded the sitemap XML file.

My question is: do I just upload the XML file to my hosting as-is, or do I have to modify it first to define the menus that should appear when keywords are entered on Google?

Update :

My XML looks like this:



There are 2,000 lines of code, but I am showing only a few here.

How do I find an RSS or sitemap link?

I have a Shopify-based site. How do I find the RSS and sitemap links? My site is

seo – Google cannot fetch a large sitemap containing 50,000 URLs, and browsers struggle to load it

My sitemap contains 50,000 URLs (7.8 MB), with entries using the following syntax: , maquiagem, 2019-10-03T17:12:01-03:00

The problems are:

• The Search Console reports "Sitemap could not be read".

• Loading the sitemap takes an hour and Chrome stops responding.


• In Firefox, the sitemap downloaded in 1483 ms but took 5 minutes to fully load.

Things I tried without success:

• Disabled GZip compression;

• Deleted my .htaccess file;

• Created a test sitemap with 1,000 URLs and the same syntax, and submitted it to the Search Console. It worked, but the 50,000-URL sitemap still reports "Couldn't fetch";


• I tried inspecting the URL directly, but it gave an error and asked me to try again later, while the 1,000-URL sitemap worked;

• I tried validating the sitemap on five different sites (Yandex, etc.) and all passed with no errors or warnings.

Any ideas?
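Before blaming Search Console, it can help to verify the file against the protocol's hard limits (50,000 URLs and 50 MB uncompressed per file). A minimal Python check, run here against a toy one-URL sitemap string, might look like:

```python
# Sketch: check a sitemap document against the sitemaps.org protocol
# limits (50,000 URLs, 50 MB uncompressed) before submitting it.
import xml.etree.ElementTree as ET

MAX_URLS = 50_000
MAX_BYTES = 50 * 1024 * 1024  # 50 MB uncompressed

def check_sitemap(xml_text: str):
    """Return (url_count, size_ok, count_ok) for a <urlset> document."""
    root = ET.fromstring(xml_text)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    count = len(root.findall("sm:url", ns))
    size = len(xml_text.encode("utf-8"))
    return count, size <= MAX_BYTES, count <= MAX_URLS

xml = ('<?xml version="1.0" encoding="UTF-8"?>'
       '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
       '<url><loc>https://example.com/</loc></url></urlset>')
print(check_sitemap(xml))  # (1, True, True)
```

If the file is within the limits and still cannot be fetched, splitting it into smaller files under a sitemap index is a common workaround, since it also avoids the browser and fetch timeouts described above.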

Do on-page SEO optimization, meta descriptions, speed optimization and an XML sitemap for $40

Do on-site SEO optimization, meta descriptions, speed optimization and an XML sitemap

On-page SEO is the process of optimizing each web page of your site so that it ranks better in search engine results pages (SERPs). This white-hat SEO service can increase your website's or blog's visibility in Google by 90% or more.

Our on-site optimization services include:

  • Free: website audit report
  • Long-tail, LSI & targeted keyword optimization
  • Competitor backlink analysis
  • Install and configure the WordPress Yoast SEO plugin
  • Compelling meta descriptions, titles and effective tags
  • Heading tag (H1, H2, H3) optimization
  • Image alt tags and speed optimization
  • XML sitemap / robots.txt creation and submission
  • Google Webmaster Tools verification
  • Check for broken links and redirect them to the homepage or parent page
  • Social media meta tags, hyperlink and anchor-text optimization
  • Index all pages on Google, Bing and Yahoo
  • Search-engine-friendly URLs and website structure

Why us?

  • 100% customer satisfaction and money back guarantee.
  • WordPress, Wix, Shopify and Amazon Specialists

If you have any questions, do not hesitate to contact us.


How to create an XML sitemap for a 32-page HTML website [on hold]

I have built an HTML website with 32 pages. It is a flight-booking site. Now I want to create an XML sitemap for it and implement it.

Sitemap pending in Bing for two weeks

I built a new website a month ago. Two weeks ago, I submitted the sitemap to the Bing search console.

At this point, it is still pending. What should I do to get Bing moving?

seo – Google could not retrieve the sitemap

Sitemaps are an integral part of our web work, and we need them to get more love from Google. But for about a month, I have not been able to submit sitemaps to Google Search Console.

I ran it through a check and it passed successfully. I also used the sitemap validator, which showed no errors. It seems that something is wrong with the Search Console itself.

Yesterday, I started another site, used the same WordPress plugin to generate a sitemap, and used the same Gmail login to connect to the Search Console. I tried to add a property.

If I submit a new sitemap, it cannot be fetched; if I resubmit an existing one, it succeeds, but the last-read date is 6 months old. So I assume it is not working.

What are the possible solutions? The live test says the sitemap cannot be fetched because it is noindex. I think there is nothing wrong with serving sitemaps with noindex, since sitemaps should not appear in search results. In fact, my second domain's sitemaps were also submitted with noindex.

I use WordPress on PHP 7.3 with HTTP/2.

My sitemap url is

My HTTP headers are:

HTTP/2 200 
server: nginx
date: Thu, 22 Aug 2019 12:23:21 GMT
content-type: text/xml; charset=UTF-8
vary: Accept-Encoding
x-robots-tag: noindex, follow
link: ; rel=""
last-modified: Wed, 21 Aug 2019 14:03:48 GMT
etag: W/"59184deaadc3de11f553f5a8fbaac7f0"
x-httpd: 1
x-robots-tag: index, follow
host-header: 192fc2e7e50945beb8231a492d6a8024
x-proxy-cache: MISS
content-encoding: gzip
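Note that the dump shows two conflicting `x-robots-tag` headers (`noindex, follow` and `index, follow`). Google generally combines robots directives from all sources and applies the most restrictive one, so the effective result here is noindex. A small sketch of how the directives combine (the function is illustrative, not a Google API):

```python
# Sketch: combine directives from multiple X-Robots-Tag headers.
# When directives conflict, the most restrictive (noindex) wins --
# which is the documented behavior for Google's crawlers.
def effective_indexing(x_robots_tags):
    """Collect directives from all X-Robots-Tag header values."""
    directives = set()
    for header in x_robots_tags:
        directives.update(d.strip().lower() for d in header.split(","))
    return "noindex" if "noindex" in directives else "index"

headers = ["noindex, follow", "index, follow"]  # from the dump above
print(effective_indexing(headers))  # noindex
```

So even though a second header says `index, follow`, the `noindex` directive still applies; removing the `x-robots-tag: noindex` header from the sitemap response (often set by an SEO plugin or the server config) is the usual fix when Search Console refuses to fetch it for that reason.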