Google Search Console finds URLs that do not exist in my sitemap, and complains about errors on these pages.

You have a misconception of what a sitemap is.

The sitemap is used to audit the search engine bot's crawling of the site. The sitemap and the crawl of a site are two different and independent things. Google will continue to crawl your site regardless of any sitemap. The sitemap is used to check whether Google is able to crawl your site correctly. For example, if pages are in your sitemap and Google has not seen them, Google can add those pages to its fetch queue so they get included.

The opposite is not true. If a page is not in the sitemap, Google will not remove it from its index. Why? Because Google found it while crawling the site.

What you seem to believe is that the sitemap is the sole authority Google uses to know which pages exist on a given site. It is not; the crawl is. The sitemap only lets Google check whether it can crawl your site properly and, if not, which pages missing from Google's index need to be added to the fetch queue.
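For context, a sitemap is nothing more than a list of URL hints. A minimal sitemap.xml looks like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/some-page</loc>
        <lastmod>2019-07-01</lastmod>
      </url>
    </urlset>

Nothing in this format tells Google to forget URLs that are absent from it.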

Your expectation that Google will stop trying to access pages because they no longer appear in your sitemap is incorrect. Sitemaps are cached and only checked periodically. Why? Because it is an audit process.

You have a real problem to solve.

You return a 500 error for pages that are not found. That is bad. Your website should return a 404 Not Found error. A 500 error is a system error, and Google will treat the condition as temporary. If your site returned a 404 error, Google would retry the page several times over a period of time until it decides the page no longer exists. If possible, you want to issue a 410 Gone status for pages you have deleted. If that is too much work or not possible, a 404 will have the same effect over time.
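You can verify what your server actually returns for a removed page with a quick check like this (the URL is a placeholder):

    curl -I https://example.com/some-removed-page

The status line of the response should read HTTP/1.1 404 Not Found (or 410 Gone), not HTTP/1.1 500 Internal Server Error.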

You must correct your 500 error.
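How to fix it depends on your stack. As one minimal sketch, assuming Apache with mod_alias enabled (the paths are hypothetical):

    # .htaccess – return 410 Gone for a page that was deliberately deleted
    Redirect gone /some-deleted-page
    # serve a custom error page for anything that is simply not found
    ErrorDocument 404 /404.html

On nginx, the equivalent would be a location block with return 410;.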

upgrade – Failed update; complains about space on the root device

I use Mint on an Aspire ES 15 laptop.

I get the same error for several packages …

gzip: stdout: No space left on device
E: mkinitramfs failure find 141 cpio 141 gzip 1
update-initramfs: failed for /boot/initrd.img-4.15.0-52-generic with 1.
dpkg: error processing package linux-firmware (--configure):
 subprocess installed post-installation script returned error exit status 1
Setting up linux-image-4.15.0-54-generic (4.15.0-54.58~16.04.1) ...
Processing triggers for initramfs-tools (0.122ubuntu8.14) ...
update-initramfs: Generating /boot/initrd.img-4.15.0-52-generic
W: Possible missing firmware /lib/firmware/i915/kbl_guc_ver9_14.bin for module i915
W: Possible missing firmware /lib/firmware/i915/bxt_guc_ver8_7.bin for module i915
Warning: No support for locale: en_CA.utf8

update-initramfs: Generating /boot/initrd.img-4.15.0-54-generic
W: Possible missing firmware /lib/firmware/i915/kbl_guc_ver9_14.bin for module i915
W: Possible missing firmware /lib/firmware/i915/bxt_guc_ver8_7.bin for module i915
Warning: No support for locale: en_CA.utf8

What is the best way to proceed?
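The key line is gzip: stdout: No space left on device, which usually means the partition holding /boot is full, typically because old kernels have accumulated. A minimal diagnostic sketch, assuming a standard Ubuntu/Mint install:

    # How full is the partition holding /boot?
    df -h /boot
    # Which kernel packages are installed?
    dpkg -l 'linux-image-*' | grep '^ii'
    # Remove kernels and packages that are no longer needed
    sudo apt autoremove --purge
    # Then retry the failed configuration step
    sudo dpkg --configure -a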

ag.algebraic geometry – If it quacks like a conifold resolution and waddles like a conifold resolution, $\ldots$

Assume that $X$ is a projective threefold with at worst conifold singularities, and assume $\omega_X$ is trivial. Assume $Y$ is a projective variety with a birational morphism $f: Y \to X$ which is an isomorphism away from the conifold points and such that $f^{-1}(p) = \mathbb{P}^1$ for each conifold point $p \in X$. Can I conclude that $Y$ is smooth, that is to say, that $f: Y \to X$ is a conifold resolution?

It sounds too good to be true, but I was unable to find a counterexample, and it would be really helpful (at least for me) if it were true.

seo – Alexa rank complains about an unsafe generator meta tag

The tag itself is not insecure. It's just a tag in an HTML file.

On the other hand, it provides direct, immediate information about the software you are using, and can therefore be used by hackers to pick the exploit most likely to get into your system.

Without this information, they would be forced to run probes, which can take a long time. Also, with advanced web application firewalls, you can detect such probes and block the IP address (some people say that blocking IPs is not a good idea; my experience is that it works great).

Note that a tag that gives only a name is rather safe (that is to say, content="WordPress"). If the tag includes the name, version, build date, etc., then it becomes really easy for an attacker (i.e. content="WordPress 1.2.3 of May 13, 2019"). So the tag in your example is perfectly safe, especially if your generator is proprietary and you use it for only one or two websites (unlike WordPress, which is used for millions of websites).
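For reference, the difference between the two cases looks like this (the version and date are made up):

    <meta name="generator" content="WordPress">
    <meta name="generator" content="WordPress 1.2.3 built May 13, 2019">

The first tells an attacker almost nothing new; the second tells them exactly which known vulnerabilities are worth trying.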


What is a probe?

Whenever you access a website, you can detect the content management system used to generate it by viewing the HTML content.

Examples:

  • WordPress uses paths that include wp- as a prefix.
  • Drupal uses .../sites//files/...

Each CMS does the same on every website that uses it. Detecting the version, however, is more difficult. There are subtle differences that will tell you, but it makes the hacker's life more interesting (from my point of view, read: harder).
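To see how trivial such a probe is, here is a one-line sketch (the URL is a placeholder):

    curl -s https://example.com/ | grep -o 'wp-[a-z-]*' | sort -u

If the output lists paths such as wp-content or wp-includes, the site is almost certainly WordPress, even with the generator tag removed.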

So hiding the generator tag does not make any difference if you just have the name of the CMS in there.

8 – The site works fine but Composer complains about unmet requirements. Do I have a problem?

I'm trying to get my permissions right so that I do not have to run Composer as root. As part of this process, I deleted vendor, core, and composer.lock, and then ran composer install as my non-root user. The site seems to be working fine despite the message Composer prints:

        Problem 1
        - drupal/core 8.2.x-dev requires symfony/psr-http-message-bridge v0.2 -> satisfiable by symfony/psr-http-message-bridge[v0.2] but these conflict with your requirements or minimum-stability.
        - don't install drupal/core-render 8.2.0 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.0-beta1 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.0-beta2 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.0-beta3 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.0-rc1 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.0-rc2 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.1 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.2 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.3 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.4 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.5 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.6 | remove drupal/drupal dev-master
        - don't install drupal/core-render 8.2.7 | remove drupal/drupal dev-master

etc.

What does this mean? Is my site fine, or did I take a wrong turn somewhere?

I'm not sure why Composer's message refers to "drupal/core 8.2.x-dev". Here is the "require" section of my composer.json:

"need": {
"composer / installers": "^ 1.0.24",
"wikimedia / composer-merge-plugin": "^ 1.4",
"drupal / entity_clone": "^1.0@beta",
"drupal / features": "^ 3.8",
"drupal / metatag": "^ 1.8",
"drupal / webform": "^ 5.1",
"drupal / config_installer": "^ 1.8",
"drupal / entity_browser": "^ 2.1",
"drupal / media_entity_browser": "^1.0@beta",
"drupal / entity_embed": "^1.0@beta",
"drupal / file_browser": "^ 1.1",
"drupal / admin_menu_search": "^ 1.0",
"drupal / admin_toolbar": "^ 1.26",
"drupal / backup_migrate": "^ 4.0",
"drupal / block_field": "^1.0@alpha",
"drupal / contact_block": "^ 1.4",
"drupal / contribution": "^5.0@beta",
"drupal / ctools": "^ 3.2",
"drupal / entity_reference_revisions": "^ 1.6",
"drupal / facets": "^ 1.3",
"drupal / field_group": "^ 1.0",
"drupal / form_placeholder": "^ 1.0",
"drupal / formblock": "^1.0@beta",
"drupal / glazed_helper": "^ 1.3",
"drupal / google_analytics": "^ 3.0",
"drupal / honeypot": "^ 1.29",
"drupal / imce": "^ 1.7",
"drupal / insert_block": "1.x-dev",
"drupal / linkit": "^ 4.3",
"drupal / login_emailusername": "^ 1.1",
"drupal / material_admin": "^1.0@alpha",
"drupal / menu_link_attributes": "^ 1.0",
"drupal / minifyhtml": "^ 1.6",
"drupal / paragraphs": "^ 1.6",
"drupal / pathauto": "^ 1.3",
"drupal / redirect": "^ 1.3",
"drupal / require_login": "^ 2.0",
"drupal / search_api": "^ 1.11",
"drupal / simple_sitemap": "^ 3.0",
"drupal / simplenews": "^1.0@alpha",
"drupal / tfa": "^ 1.0.0@alpha",
"drupal / token": "^ 1.5",
"drupal / video_embed_field": "^ 2.0",
"drupal / view_unpublished": "^1.0@alpha",
"drupal / viewport": "^ 1.1",
"drupal / viewsreference": "^ 1.4",
"drupal / workflow_state_config": "^ 1.0.0@alpha",
"drupal / xmlsitemap": "^1.0@alpha",
"drupal / allowed_formats": "^ 1.1",
"drupal / asset_injector": "^ 2.4",
"drupal / taxonomy_access_fix": "^ 2.6",
"drupal / text_summary_options": "^ 1.0",
"drupal / migrate_source_csv": "^ 2.2",
"drupal / migrate_plus": "^ 4.1",
"drupal / migrate_tools": "^ 4.1",
"drupal / migrate_file": "^ 1.1",
"drupal / entityqueue": "^1.0@alpha",
"drupal / module_missing_message_fixer": "^1.0@beta",
"drupal / rules": "^ 3..0 @ alpha",
"ckeditor / autogrow": "^ 4.8",
"ckeditor / codemirror": "^ 1.17",
"ckeditor / fakeobjects": "^ 4.8",
"ckeditor / image": "^ 4.8",
"ckeditor / link": "^ 4.8",
"codemirror / codemirror": "^ 5.36",
"jquery / geocomplete": "^ 1.7",
"jquery / icheck": "^ 1.0",
"jquery / image-picker": "^ 0.3.0",
"jquery / inputmask": "^ 3.3",
"jquery / intl-tel-input": "^ 12.1",
"jquery / rateit": "^ 1.1",
"jquery / select2": "^ 4.0",
"jquery / timepicker": "^ 1.11",
"jquery / toggle": "^ 4.0",
"jquery / word-and-character-counter": "^ 2.5",
"progress-tracker / progress-tracker": "^ 1.4",
"signature_pad / signature_pad": "^ 2.3",
"drupal / image_widget_crop": "^ 2.2",
"drupal / crop": "^ 1.5",
"drupal / advagg": "^ 3.5",
"drupal / advagg_js_minify": "^ 3.5",
"drupal / advagg_css_minify": "^ 3.5",
"drupal / better_exposed_filters": "^ 3.0.0@alpha",
"drupal / block_visibility_groups_admin": "^ 1.3",
"drupal / chosen": "^ 2.6",
"drupal / selected_field": "^ 2.6",
"drupal / content_export_csv": "^ 3..0 @ beta",
"drupal / contentimport": "^ 4.1",
"drupal / event": "1.x-dev",
"drupal / entity_reference_views_select": "^ 1.3",
"drupal / imce_search_plugin": "^ 1.0",
"drupal / inline_entity_form": "^ 1.0.0RC",
"drupal / job": "^ 3..0 @ alpha",
"drupal / media_bulk_upload": "^1.0@alpha",
"drupal / memcache": "^ 2.0",
"drupal / purge_ui": "^ 3..0 @ beta",
"drupal / purge_processor_lateruntime": "^ 3..0 @ beta",
"drupal / purge_tokens": "^ 3.0.0beta",
"drupal / twig_tweak": "^ 2.1",
"drupal / views_bulk_operations": "^ 2.5",
"drupal / block_visibility_groups": "^ 1.3",
"drush / drush": "^ 9.5",
"drupal / we_megamenu": "^ 1.5",
"drupal / views_infinite_scroll": "^ 1.5",
"drupal / embed": "^ 1.0",
"drupal / dropzonejs": "^ 2.0.0@alpha",
"drupal / purge": "^ 3..0 @ beta",
"drupal / address": "^ 1.4",
"drupal / acquia_connector": "^ 1.16",
"drupal / purge_purger_http": "^ 1.0.0beta",
"drupal / smtp": "^1.0@beta",
"drupal / content_sync": "^ 2.1",
"drupal / console": "~ 1.0",
"drupal / devel": "^ 2.0",
"drupal / core": "8.6.13"
}
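To see which package is dragging in the old drupal/core 8.2.x constraint, Composer's own dependency-inspection commands can help. A minimal sketch, assuming Composer 1.x:

    # Which packages require drupal/core, and with what constraints?
    composer why drupal/core
    # What prevents installing the version you pinned?
    composer why-not drupal/core 8.6.13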

seo – Google Webmaster Tools complains about a blocked URL, but it is not blocked

We recently set up Google Webmaster Tools.

Now, for one of the pages, Google complains:
Indexed, though blocked by robots.txt

Our robots.txt file looks like this:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: http:///sitemap.xml.gz

So, if I understand how this works, the rest of the URLs (with the exception of /wp-admin/) should not be blocked.
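To make that concrete, under the rules above the matching works out like this (the paths are examples):

    /wp-admin/options.php      -> blocked (Disallow: /wp-admin/)
    /wp-admin/admin-ajax.php   -> allowed (the more specific Allow rule wins)
    /some-blog-post/           -> allowed (no rule matches)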

Why do I have this warning?