google search – Should I noindex or remove the old site after moving

Mueller from Google recommends keeping redirects in place for at least a year:

A 301 is permanent – it means forever, and that’s a mighty long time – but I’m here to tell you, there’s such a thing as server maintenance.
After a few years the old URLs are often no longer accessed & you can drop those redirects.

If they’re no longer needed after a while (usually I recommend keeping them at least a year), and you don’t see traffic to them, then removing them is fine since it makes long-term maintenance easier.

Then, if you don’t have an appreciable amount of users still accessing the old domain, you can let the domain and redirects lapse.

Do not discourage search engines from indexing domain A. This might cause them to stop indexing the redirects, which could harm the site move.
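Mueller’s “you don’t see traffic to them” check is easy to automate before dropping redirects. Below is a minimal sketch that counts how often old, redirected URLs still show up in an access log; the log format and the URL set are assumptions for illustration, not anything from the quoted advice.

```python
import re
from collections import Counter

# Hypothetical set of old URLs that currently 301-redirect.
OLD_PATHS = {"/old-page", "/old-blog/post-1"}

# Minimal pattern for Apache/nginx combined-style log lines.
LOG_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def redirect_hits(log_lines):
    """Count how often each old, redirected URL is still being requested."""
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("path") in OLD_PATHS:
            hits[m.group("path")] += 1
    return hits

sample = [
    '1.2.3.4 - - [10/May/2024] "GET /old-page HTTP/1.1" 301 0 "-" "-"',
    '1.2.3.4 - - [10/May/2024] "GET /new-page HTTP/1.1" 200 512 "-" "-"',
]
print(redirect_hits(sample))  # Counter({'/old-page': 1})
```

If the counts stay at zero over a long enough window, that is the “no appreciable traffic” signal Mueller describes.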

google search – Should I noindex the old site after moving, or is it preferred to remove it?

I moved a site that was mapped to one domain to another domain about 8 months ago.

Now the mapping to the old domain has expired, but the old domain still 301-redirects to the new one.

So what should I do?

  • Remove the site permanently.

  • Extend the mapping with the domain.

  • Discourage search engines from indexing it (I enabled this option about a month ago).

  • Actually, I lost 100% of my ranking on Google. If I rolled back the move and remapped the site to the old domain (after having enabled the discourage-search-engines option), would it go back to its old rank?

drupal 8 – How to apply ‘noindex’ to all nodes and leave aliases indexable, to prevent perceived duplicates for SEO

We’re working with an SEO specialist who has found that our underlying /node/### page URLs are competing with our aliases and dragging our page ranking down, because the two are perceived as duplicate content. (Drupal 8)

I’m looking at modules that are out there, but I’m not sure any of them cover what I need. Is there a simple way to set all URLs containing /node/* to noindex, nofollow? Maybe a way in robots.txt?

Would simply adding this to robots.txt work?
Disallow: /node/*

(as a side question, what are the ramifications of making such a change? Any issues I need to be aware of, protect against?)

Or maybe there’s another way people have solved this problem?
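As a side note on the robots.txt idea: the rule can be tried offline with Python’s standard-library robots.txt parser before deploying it (a sketch; the domain and paths are hypothetical). Stdlib `RobotFileParser` does plain prefix matching, so `Disallow: /node/` is used here rather than the wildcard form.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt with the proposed rule.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /node/",
])

# /node/### URLs are blocked from crawling; aliases remain fetchable.
print(rp.can_fetch("*", "https://example.com/node/123"))  # False
print(rp.can_fetch("*", "https://example.com/my-alias"))  # True
```

One ramification to be aware of: a robots.txt Disallow blocks crawling, not indexing, so /node/ URLs that are already indexed can remain in the index (they just stop being recrawled).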

scrape noindex page

Hello everyone, I would like to know whether ScrapeBox can search within a site, extract the non-indexed pages, and return their URLs to me. Thanks.
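I can’t speak for ScrapeBox’s feature set, but the core check – whether a fetched page declares noindex – is simple to sketch. This hypothetical helper only looks for a robots meta tag in the HTML; real pages can also be noindexed via an `X-Robots-Tag` HTTP header, which this does not cover.

```python
import re

def has_noindex(html):
    """True if the page carries a robots meta tag containing 'noindex'."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
print(has_noindex("<html><head></head></html>"))  # False
```

Run over a list of crawled pages, this yields exactly the “non-indexed URLs” list being asked about (for pages that use the meta tag).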

Can having too many noindex pages affect ranking?


For example: 50,000 low quality noindex forum posts with 1,000 quality indexed posts.

Can those 50,000 low-quality noindexed forum posts affect the rankings of the 1,000 indexed quality posts?

seo – Advantage of using pretty URLs on search results with noindex content?

Google doesn't want thin content like search results pages to be indexed, so I use noindex on all of the search results pages on my website.

Is there then an advantage to using a nice URL like this:

over one like this:

The former looks better and is easier to use.

Thank you.
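For illustration, the two URL shapes being compared can be generated like this (a sketch; the `/search` path and parameter names are made up):

```python
import re
from urllib.parse import urlencode

def pretty_search_url(query):
    """Pretty path-style URL, e.g. /search/blue-widgets."""
    slug = re.sub(r"[^a-z0-9]+", "-", query.lower()).strip("-")
    return f"/search/{slug}"

def plain_search_url(query):
    """Query-string style URL, e.g. /search?q=Blue+Widgets."""
    return "/search?" + urlencode({"q": query})

print(pretty_search_url("Blue Widgets"))  # /search/blue-widgets
print(plain_search_url("Blue Widgets"))   # /search?q=Blue+Widgets
```

The first form reads better in links and shares; whether it carries any SEO weight on pages that are noindexed anyway is exactly the question being asked.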

url – I'm looking for a way to define noindex tags on all string filter parameters

I don't want to use robots.txt for this. Any PHP code or any .htaccess line would work like a charm. Below are the types of URLs I want to deindex.

Here is the robots.txt code I currently use to block these URLs:

Disallow: /*sort=
Disallow: /*filters=

But the problem is that robots.txt does not apply noindex tags; it only blocks crawling of the URLs.
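One way to get actual noindex behavior for these parameters, without robots.txt, is to send an `X-Robots-Tag` response header whenever the query string contains one of them; unlike a Disallow, crawlers can still fetch the page and see the directive. The decision logic is sketched below in Python and ports directly to PHP’s `header()` call. The parameter names come from the robots.txt rules above; everything else is hypothetical.

```python
from urllib.parse import urlparse, parse_qs

# Parameter names to deindex, taken from the robots.txt rules above.
NOINDEX_PARAMS = {"sort", "filters"}

def robots_header(url):
    """Return the X-Robots-Tag header value for a URL, or None if none is needed."""
    params = parse_qs(urlparse(url).query)
    if NOINDEX_PARAMS & params.keys():
        return "noindex, follow"
    return None

print(robots_header("https://example.com/shop?sort=price"))  # noindex, follow
print(robots_header("https://example.com/shop"))             # None
```

In PHP the equivalent would be a one-time `header('X-Robots-Tag: noindex, follow');` guarded by the same query-parameter check.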

noindex – WordPress / Yoast: How to solve the problem with pagination pages?

Before switching to Yoast, I was using All in One SEO, but I had huge problems with it; there were incompatibilities that eventually made the plugin impossible to use.

Either way, I am facing a pagination problem on my website. Here is an example:



This /page/2 URL poses SEO problems: duplicate content, duplicate h1, duplicate description, etc.

With All in One SEO, I was able to set paginated pages to nofollow / noindex. In Yoast, I can't find anything similar.

I guess it's a general WordPress thing.

I was able to find a "solution" for this, but that produces another problem.

If I change the permalink settings from “postname” to “default”, /page/2 disappears. But then the links of my pages are not pleasant to read (?p=123). See the attached examples:
(screenshot: post name in permalink settings)
(screenshot: bad links after switching to default permalinks)
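For reference, the general mechanism – regardless of which plugin provides it – is to emit a robots meta tag only on page 2 and deeper, so /page/2 can exist without competing with page 1. A minimal sketch, assuming the /page/N permalink structure from the example:

```python
import re

def robots_meta(path):
    """Return a noindex meta tag for /page/2 and deeper, else an empty string."""
    m = re.search(r"/page/(\d+)/?$", path)
    if m and int(m.group(1)) >= 2:
        return '<meta name="robots" content="noindex, follow">'
    return ""

print(robots_meta("/blog/page/2/"))  # the noindex meta tag
print(robots_meta("/blog/"))         # empty string (page 1 stays indexable)
```

This is plugin-independent: any theme or small snippet that prints the tag in the page head achieves the same effect as the old All in One SEO setting.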

Can you tell me how to fix it?

Thanks in advance and greetings,

Can noindex tags on individual articles affect global SEO or ranking on Google?

Some SEO gurus claim that noindex attributes are good for devaluing unimportant pages. But when I asked in the Moz Q&A, people there were quite sure that having most of the articles on a website noindexed can hurt the overall SEO ranking. What is your opinion on this?