seo – How to generate sitemap for angular web application

I am developing a website using the MEAN stack. I want to create a sitemap for all URLs present on the site, as well as a video and image sitemap. The website is dynamic, and there will certainly be more links, images, and audio in the future. So can you help me create a dynamic sitemap that does not need to be updated for each and every new link, image, and video?
You can share any idea you have about sitemap generation, because I have zero knowledge of it. Any help will be appreciated.
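One common approach in a MEAN app is to generate the sitemap on request from the database, so it always reflects the current content and never needs manual updates. A minimal sketch of such a builder (the page records, field names, and base URL here are illustrative assumptions, not part of the question):

```javascript
// Build a sitemap <urlset> from an array of page records.
// In a MEAN app these records would come from a MongoDB query
// (e.g. a Mongoose Page.find()); here they are plain objects.
function buildSitemap(baseUrl, pages) {
  const urls = pages
    .map((p) => [
      '  <url>',
      `    <loc>${baseUrl}${p.path}</loc>`,
      `    <lastmod>${p.updatedAt.toISOString().slice(0, 10)}</lastmod>`,
      '  </url>',
    ].join('\n'))
    .join('\n');
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    urls,
    '</urlset>',
  ].join('\n');
}

// Serving it dynamically from Express would look roughly like this
// (Page is a hypothetical Mongoose model):
// app.get('/sitemap.xml', async (req, res) => {
//   const pages = await Page.find();
//   res.type('application/xml');
//   res.send(buildSitemap('https://example.com', pages));
// });

const xml = buildSitemap('https://example.com', [
  { path: '/', updatedAt: new Date('2020-01-15') },
  { path: '/about', updatedAt: new Date('2020-02-01') },
]);
console.log(xml);
```

Image and video sitemaps follow the same pattern, just with the `image:` and `video:` namespace extensions added to each `<url>` entry.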

drupal 8 – How do I debug/kint the Simple XML Sitemap link array?

I am trying to find out how I can debug/kint a variable/array from the Simple XML Sitemap module.

I worked through the documentation here: https://www.drupal.org/docs/8/modules/simple-xml-sitemap/api-and-extending-the-module#s-api-hooks to find the hook I need.

My goal is to unset any links that contain node/, to remove published but un-aliased nodes of the included content types.

The array key ['path'] looks to be the unaliased URL, and the code below removes all links except the home page. I am unsure how I can kint($link) in this function so I can see what other array keys are available and what else I might use for comparison.

function HOOK_simple_sitemap_links_alter(array &$links, $sitemap_variant) {

  foreach ($links as $key => $link) {
    // kint($link); // dump a single link here to inspect its available keys
    if (strpos($link['meta']['path'], 'node/') !== FALSE) {
      unset($links[$key]);
    }
  }
}

Is there a way to kint() these sitemap arrays? Or maybe some documentation that shows the structure of these arrays?

Separate Sitemap for Mobile Website

Hello Friends,

Do we need to create a separate sitemap for a mobile website? For example, if I have the mobile website on a subdomain, m.axydotcom?

Is a video sitemap enough for SEO or should video markup schema also be used?

I have already implemented a video sitemap, based on Google’s recommendation. Now I am reading about video markup. The only information I found on how the two approaches relate to each other is:

Use both on-page markup and video sitemaps. A common scenario is to use on-page markup for all of your videos and a video sitemap to tell Google about any new, time-sensitive, or hard to find videos.

AFAIK, the video sitemap is more widely supported among search engine providers; Bing, for example, seems to have only just started supporting JSON-LD.

Is there any reason that speaks for using video markups in addition to video sitemaps?
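For reference, the on-page markup Google’s guidance refers to is schema.org VideoObject structured data, typically embedded as JSON-LD. A minimal sketch (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Example product demo",
  "description": "A short demo video (placeholder values).",
  "thumbnailUrl": "https://example.com/thumbs/demo.jpg",
  "uploadDate": "2020-03-01",
  "duration": "PT1M30S",
  "contentUrl": "https://example.com/videos/demo.mp4"
}
```

The quoted guidance suggests the two mechanisms are complementary rather than redundant: the sitemap aids discovery, while the on-page markup makes the page itself eligible for video rich results.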

xml sitemap – Can’t view my page map correctly?

I have installed the Geolocation module to display the location maps that I have on my website.

I also checked the Google Maps API and added a Google Maps API key. After creating the location content type, I made sure that Manage form display is set to show the Geolocation Google Maps API – Geocoding and Map widget.

Then I added the content I wanted, but I got this error: “This page can’t load Google Maps correctly.”
Is there something wrong with the steps I followed?

sitemap – Problem with how Google indexes multi-language and multi-domain website

I have a problem with how Google indexes a multi-language, multi-domain website.

The website handles two regions/languages across two domains. The language is identified by a URL parameter, not by the domain. Ideally, we would want the site to open as either website.com/en or website.de/de.

However, that is not how Google indexes the site: I can see, for example, website.de/en or website.com/de in the results (depending on how the search is done). This is very confusing for clients, as it leads them to the wrong version of the site.

What I’m trying to figure out is how to tell Google to index one site only as website.com/en/* and the other only as website.de/de/*, and not to create a mix of both, as it does currently.

My initial thought was to tweak the sitemap. At the moment it lists all the links existing on both the en and de sites, but with the .com domain. It looks like this:

website.com/en
website.com/de
website.com/en/contact
website.com/de/kontakt
website.com/en/our-team
website.com/de/unser-team

and so on.

My question is: how can this problem be solved? Is a sitemap the best way to go? And if so, what should such a sitemap actually look like in 2020 to handle this setup?

Or is there another solution?
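One way a sitemap can express this setup is with the hreflang (`xhtml:link`) annotations from Google’s localized-versions documentation, where each `<url>` entry lists every language/domain alternate of that page. A sketch using the URLs from the question (note that cross-domain annotations generally require both domains to be verified in Search Console):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://website.com/en/contact</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://website.com/en/contact"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://website.de/de/kontakt"/>
  </url>
  <url>
    <loc>https://website.de/de/kontakt</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://website.com/en/contact"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://website.de/de/kontakt"/>
  </url>
</urlset>
```

The same relationship can alternatively be declared with `<link rel="alternate" hreflang="…">` tags in each page’s `<head>`; either way, the annotations must be reciprocal between the two domains.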

Can you request refresh of the sitemap in Google Search Console?

TL;DR: My sitemap shows a last crawl date of 2-3 days ago in Search Console, and I wondered if there is a way to request that Google re-crawl the sitemap, in the same way you can ask it to re-crawl a page.

Explanation: I have a problem with an updated page not showing correctly in search results. I have asked for several re-crawls over several days without luck. I believe this may be the result of a stale lastmod tag in my sitemap. I have corrected that issue, and a live view of the sitemap now shows the correct lastmod. I think Google was skipping my page re-crawl requests because its last index was newer than the lastmod before I fixed it. Now the lastmod is newer than the last crawl date, but Google isn’t touching the sitemap file to see the updated lastmod. I suspect it will in its own time over the next few days, but still, it’s annoying…

Is there a way of requesting a sitemap re-crawl?

google search console – Can I combine sitemap and sitemap index together?

I want to create a sitemap for Google and Bing.

Referring to sitemap protocol: https://www.sitemaps.org/protocol.html#sitemapXMLExample

I want to combine the sitemap file and the sitemap-index file into one. What I mean is as follows:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
              xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/siteindex.xsd"
              xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <sitemap>
      <loc>http://www.example.com/sitemap-index-1.xml</loc>
      <lastmod>2004-10-01T18:23:17+00:00</lastmod>
   </sitemap>
   <sitemap>
      <loc>http://www.example.com/sitemap-index-2.xml</loc>
      <lastmod>2004-10-01T18:23:17+00:00</lastmod>
   </sitemap>
</sitemapindex>

<!-- Sitemap for individual URLs -->

<urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd"
        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
   <url>
      <loc>http://www.example.com/page-1/</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
   <url>
      <loc>http://www.example.com/page-2/</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
   <url>
      <loc>http://www.example.com/page-3/</loc>
      <lastmod>2005-01-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
   </url>
</urlset>

Technically it is possible to generate this, and I will make sure to maintain the directory hierarchy. However, I am not sure whether the protocol allows it.

If I create a sitemap like this, will Google and Bing accept it?

When is it better to use multiple XML sitemaps vs one sitemap?

When is it better to use multiple XML sitemaps vs one sitemap?

web crawlers – What is difference between robots.txt, sitemap, robots meta tag, robots header tag?

So I am trying to learn SEO, and I am honestly confused. I have the following questions:

  • Do I tell a bot not to visit a certain link through the X-Robots-Tag header, the robots meta tag, or robots.txt?

  • Is it OK to include all three (robots.txt, the robots meta tag, and the X-Robots-Tag header), or should I only ever provide one?

  • Do I get penalized if I give the same information in the X-Robots-Tag header, the robots meta tag, and robots.txt?

  • Let’s say that for /test1 my robots.txt says Disallow, but my robots meta tag says follow,index and my X-Robots-Tag says nofollow,index,noarchive. Do I get penalized because those values differ?

  • In the same scenario (robots.txt says Disallow, the robots meta tag says follow,index, and the X-Robots-Tag says nofollow,index,noarchive), which rule will the bot follow? What is the order of precedence here?

  • Let’s say my robots.txt has rules saying Disallow: / and Allow: /link_one/link_two, and my X-Robots-Tag and robots meta tag for every link except /link_one/link_two say nofollow,noindex,noarchive. From what I understand, the bot will never get to /link_one/link_two, since I prevented crawling at the root level. Now, if my robots.txt points to a sitemap.xml that lists /link_one/link_two, will that URL actually end up being crawled?

  • Will a bot crawl a directory listed in the sitemap (.xml/.txt) even though it is not reachable from the home page or any page linked from it?

  • Overall, I would appreciate some clarification on the difference between robots.txt, the X-Robots-Tag header, the robots meta tag, and the sitemap (.xml/.txt). To me they seem to do exactly the same thing.

  • I have already seen questions that answer a small subset of what I am asking, but I want the whole big explanation.
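To make the distinction the questions above circle around concrete: robots.txt controls crawling (whether a URL is fetched at all), the robots meta tag and X-Robots-Tag header control indexing (what happens to a page once fetched), and a sitemap merely suggests URLs for discovery. A sketch of each mechanism (paths and values are placeholders):

```
# robots.txt — crawling: which paths bots may fetch at all,
# plus a pointer to the sitemap
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml

<!-- robots meta tag — indexing: only seen if the page is crawled -->
<meta name="robots" content="noindex, nofollow">

# X-Robots-Tag — the same directives as the meta tag, but sent as an
# HTTP response header, so it also works for non-HTML files such as PDFs
X-Robots-Tag: noindex, nofollow
```

One consequence worth noting: a page blocked by Disallow in robots.txt is never fetched, so its noindex meta tag or header is never seen, which is why a disallowed URL can still appear in search results if other sites link to it.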