seo – GSC: Unable to retrieve the sitemap

I'm trying to submit a very simple sitemap (for test purposes only) to Google Search Console but, unfortunately, I'm constantly getting the following error message:

╔══════════════╤═════════╤═════════════╤═══════════╤════════════════════╤═════════════════╗
║ Sitemap      │ Type    │ Submitted   │ Last read │ Status             │ Discovered URLs ║
╠══════════════╪═════════╪═════════════╪═══════════╪════════════════════╪═════════════════╣
║ /Sitemap.txt │ Unknown │ 17 Jul 2019 │ –         │ Unable to retrieve │ 0               ║
╚══════════════╧═════════╧═════════════╧═══════════╧════════════════════╧═════════════════╝

Clicking on it shows an additional error message: "(!) Sitemap could not be read".
However, if I click on "OPEN SITEMAP", it opens normally.

Question
Any idea what's going on?


Domain: world-hello.ddns.net
Sitemap file: sitemap.txt
Server: Apache (Debian)
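
For reference, what the server actually returns for the sitemap can be checked manually from a shell (a minimal check assuming curl is available, not a diagnosis):

# Show the HTTP status code and headers returned for the sitemap.
curl -I https://world-hello.ddns.net/sitemap.txt

# Verify that robots.txt does not block the sitemap URL.
curl https://world-hello.ddns.net/robots.txt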

How to warm the server cache without using a sitemap?

I need to write a script that warms the cache when my server reboots. I want to cache the first 10 pages of my website. I have these 10 URLs in a .txt file in the same directory, one URL per line.
Can anyone help me write a bash script to warm up the cache with these pages, please?
Any help is appreciated.
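
A minimal sketch of such a script, assuming the list is in a file named urls.txt (a placeholder name) and that simply requesting each page is enough to populate the cache:

#!/bin/bash
# warm-cache.sh – request every URL listed in urls.txt so the server caches those pages.
# Assumptions: one URL per line, curl installed.
while IFS= read -r url; do
  # Skip empty lines.
  [ -z "$url" ] && continue
  # -s silences progress output, -o /dev/null discards the body,
  # -w prints the HTTP status next to the URL for logging.
  curl -s -o /dev/null -w "%{http_code} %{url_effective}\n" "$url"
done < urls.txt

To run it at boot, a cron entry such as "@reboot /path/to/warm-cache.sh" would do.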

seo – Should I use an XML sitemap instead of a txt one for a site with deeply nested product pages?

The context

I have a B2B parts website with about:

  • 25 parent categories (organized hierarchically)
  • 150 leaf categories (models)
  • 250 products (unique articles, each with quantity = 1)

Target visitors are looking for a specific spare part.
Generally, they do not have to choose between several brands and products, as in the consumer segment.

The site is intended for specialists (niche market).

Despite several optimizations, the site still ranks poorly in the search results compared to its competitors.

I must admit that I am not a fan of social networks, so there are only a few links to the site, coming from specialized forums.

Publishing many products on the homepage might help the site rank, but it would also create duplicate content with the dedicated product pages.

In this thread, the general consensus is that there is no problem using a txt sitemap instead of an XML one. However, I am not sure this holds when the pages to be indexed are buried in the middle of the hierarchy and the search engines ignore the intermediate levels.
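
For comparison, here is what the XML equivalent of such a URL list would look like (a minimal sketch; the URLs and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/A_model</loc>
    <lastmod>2019-07-17</lastmod>
  </url>
  <url>
    <loc>https://example.com/A_product</loc>
  </url>
</urlset>

Optional tags such as <lastmod>, <changefreq> and <priority> are the metadata a plain .txt list cannot carry.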


How pages are currently indexed

Google was able to index the leaf category and product pages, which were provided via two text sitemaps (URL lists):

Sitemap with leaf categories:

https://example.com/A_model
https://example.com/Another_model
(...)

Sitemap with products:

https://example.com/A_product
https://example.com/Another_product

The products are mainly reached via a search field, where the visitor enters the model he wishes to buy spare part(s) for. The model name is used as a friendly URL, and the .htaccess file rewrites it directly to the leaf category page.

# Currently, no friendly URLs for intermediate categories (branches).

# Friendly URLs for leaf categories (models)
RewriteRule ^A_model$ /index.php?cmd=category&cat_id=123 [L]
RewriteRule ^Another_model$ /index.php?cmd=category&cat_id=124 [L]

Category pages contain links to unique spares.

Friendly URLs are used here as well, with the rewriting done in the .htaccess file.

# Friendly URLs for unique products
RewriteRule ^A_product$ /index.php?cmd=products&prod_id=456
RewriteRule ^Another_product$ /index.php?cmd=products&prod_id=789

For the convenience of the user, if only one spare part is available for a given model, the leaf category page automatically redirects to the single product page, so that the category address behaves like a tiny URL (or a gateway, if you prefer) to the product page.
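
If one wanted to express that behaviour in .htaccess instead of inside index.php, a hypothetical sketch (the URL names are placeholders) would be an external permanent redirect:

# A model with exactly one spare part: 301-redirect the category URL to the product URL.
RewriteRule ^A_single_part_model$ /Its_only_product [R=301,L]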


If the visitor wants to browse the categories, he can do so via an Ajaxified tree whose expanded nodes load the subcategories on the fly. (For this, the site uses dynatree.js with lazy loading.)

So, the robots are aware of the relevant landing pages for sales (leaf categories and product pages) but – because they do not have an XML sitemap – the site may appear to them as unstructured (no hierarchical structure they can recognize).


Why I have used .txt sitemaps rather than .xml so far:

  • Simplified maintenance: I just need to add a new link when a new
    product or category is released.
  • Targeted visitors are experts in their field,
    who know from the start which model/part they are looking for.
  • Intermediate categories (tree branches) are almost irrelevant – apart from
    showing the different families of products available – and therefore do not
    need to be indexed.

Questions:

  1. Do I have to create friendly URLs for the intermediate categories and add
    them to the sitemap in order to make the site more structured, given
    that these pages would create duplicate content with the leaf
    category and product pages?
  2. In this particular case, should I switch from .txt sitemaps to XML
    (even though maintenance would be much harder)?
  3. I plan to replace the Ajaxified tree with Ajaxified navigation based
    on tags (filters). Would that make SEO even worse?
  4. Since the homepage looks more or less like a search engine (i.e., with
    little content), would you recommend adding "blah blah blah" to it –
    even if it is useless for the visitor – to attract more traffic?

information architecture – Are the tabs and/or steps of a wizard displayed as separate areas in a sitemap diagram?

I am creating a sitemap for a business application.

One section of the application has a calendar editing feature. Once it is clicked, there are three different sections/types of calendars to configure.

  1. Start / end dates for the entire project
  2. Blocked dates (public holidays, etc.)
  3. Start / end dates for specific tasks within the project

We are currently using a step-by-step wizard to modify the calendar. The user must therefore define the dates in this order.

In my sitemap, should I map each step as a separate area, or does that belong in a separate user flow diagram?

[Image: sitemap diagram]

seo – I use curl to submit a sitemap but it is not reflected in the web portal

Two things confuse me about how I should submit my sitemap.

First.
My understanding is that a sitemap can be submitted with cURL: https://support.google.com/webmasters/answer/183668?hl=fr
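
For reference, the ping described on that page boils down to a single GET request; a minimal sketch (the sitemap URL is a placeholder):

# Ask Google to re-fetch the sitemap at the given (full, URL-encoded) address.
curl "https://www.google.com/ping?sitemap=https://example.com/sitemap.xml"

As far as I can tell, this ping only requests a re-crawl of the sitemap; it does not register a submission in the Search Console report.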

However, a submission made this way is never reflected in the web portal. The timestamp in the web portal only shows the times when I submitted the sitemap through the portal itself.

Why are they different? Does submission via curl work?

Second.
I found that Google only crawled my website right after the first sitemap submission. I update my site and sitemap regularly and hope that Google will crawl it again. However, it seems that Google never comes back.

What is going wrong in these two cases?

seo – Why did impressions drop after submitting the sitemap?

We launched a new website last month and are using Google Search Console to work out how to improve traffic.

We found that there was a sharp increase in impressions after the sitemap submission to Google, but then a sharp drop.

Can someone explain why this is? We have attached a photo to this post; the red circles indicate the dates on which we submitted the sitemap.

[Image: Search Console impressions graph, with the sitemap submission dates circled in red]

Google Search Console – Adding a domain / subdomain and a sitemap for each

The best way is the one you mentioned in point 2.

You will need to add two separate properties to your search console.

  1. http://example.com
  2. http://subdomain.example.com

The procedure described in point 1 is not possible at the moment, because the new Search Console will not allow you to add a sitemap for a subdomain within the http://example.com property.

In addition, the new Search Console automatically adds a Domain property that consolidates all properties in your domain, including all subdomains. Inside the Domain property, you can add a sitemap for your website, whether it is on a subdomain or on the root domain.

7 – How to update the sitemap in D7?

I want to exclude certain pages from the sitemap.

I go to these pages/nodes, edit them, then go to the XML sitemap tab and set it to "Excluded".

Then I save and clear the cache.

But when I look at my sitemap.xml file, nothing has changed: the URLs of those pages are still there.

How can I delete them?
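
If the sitemap comes from the xmlsitemap module, the files are only rewritten when the module regenerates them, not when the Drupal cache is cleared. A minimal sketch with Drush, assuming the D7 xmlsitemap module (command names may vary between module versions):

# Dump and re-process all sitemap data, then regenerate the files.
drush xmlsitemap-rebuild
# Clear all Drupal caches afterwards.
drush cc all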

plugins – WordPress sitemap includes links to missing pages

I do not know if there is some serious caching issue or a bug. I have installed several "SEO" plug-ins to try to fix this problem, I have scanned the file system and cleaned up the database, and I still do not know what generates this sitemap or why it includes links to things that are not there!

https://eastcoasttyreandauto.com.au/sitemap.xml

Example: https://eastcoasttyreandauto.com.au/articles/662043552335510/3138icps06200/ezdrdmh/114838124

Are there any tips for finding out what controls this sitemap?
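
One way to narrow this down, assuming WP-CLI is available on the host (a diagnostic sketch, not a fix):

# Find which rewrite rule answers for the sitemap URL.
wp rewrite list --format=csv | grep -i sitemap

# List active plugins; SEO/sitemap plugins are the usual suspects.
wp plugin list --status=active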

Can anyone help me do SEO on the page?

How do I add these activities to the website?

  • Add a sitemap,
  • Add Google Analytics,
  • Keyword research,
  • Add a meta title,
  • Add meta keywords,
  • Duplicate content check,
  • Add meta descriptions,
  • Canonicalization,
  • Add fresh content (because content is king in SEO),
  • Add a robots.txt file (see the example after this list),
  • Add SEO-friendly URLs (meaning readable URLs),
  • Clean up and optimize the HTML code,
  • Add heading tags such as H1 and subheadings such as H2,
  • Do not forget to make the website responsive,
  • Check the site's speed,
  • Image optimization,
  • Google penalty check.
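
For the robots.txt item, a minimal sketch (the sitemap URL is a placeholder to adapt to the real domain):

User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml

The empty Disallow line allows all crawlers in, and the Sitemap line tells them where to find the sitemap.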
