According to my research, old ways of consolidating paginated URLs are no longer recommended.
rel=prev/next, but:
"Rel=prev/next is no longer an indexing signal" (Google Webmaster Blog)
noindex, follow on paginated pages, but:
"If we see the noindex there for longer than we think this page really does not want to be used in search, then we will remove it completely, and in that case we will not follow the links anyway. So noindex, follow is basically a little bit the same as a noindex, nofollow." – John Mueller
This means that these pages stop passing PageRank and may eventually be dropped from crawling entirely, which prevents new links on these pages from being discovered.
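To be clear, the pattern in question is this meta tag (a sketch, placed on every paginated page except the first):

```html
<!-- In the <head> of page 2, 3, ... of a paginated series -->
<meta name="robots" content="noindex, follow">
```

Per the Mueller quote above, a long-lived noindex ends up being treated much like noindex, nofollow.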
Use rel=canonical pointing at the first page, but:
"Googlebot will not index the pages that appear later in the chain, nor acknowledge the content linked to from these pages" (Search Engine Journal)
That looks like the same problem as noindex, nofollow above.
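For illustration, this is the discouraged pattern (example.com URLs are my placeholders):

```html
<!-- In the <head> of https://example.com/shoes?page=2,
     canonicalizing to the first page of the series -->
<link rel="canonical" href="https://example.com/shoes">
```

Because Google treats the later pages as duplicates of page 1, their content and outgoing links are effectively ignored.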
Use a "View All" pagebut:
This is not an applicable solution for large pages, which would have the effect of displaying thousands of links, for performance reasons.
So it seems that the only method for large result sets is to do nothing, apart from giving Googlebot a few hints:
- Keep rel=prev/next, since it will not hurt.
- Use standard URL parameters such as page=2 so pagination is easy to recognize.
- Use hints in the title to signal pagination, for example "Shoes - Page 2".
- Pray that everything goes well.
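Putting those hints together, the head of a paginated page might look like this (a sketch; the example.com URLs and the self-referencing canonical are my assumptions, not prescribed by the quoted sources):

```html
<!-- Hypothetical <head> of https://example.com/shoes?page=2 -->
<title>Shoes - Page 2</title>
<!-- self-referencing canonical: each paginated URL stands on its own -->
<link rel="canonical" href="https://example.com/shoes?page=2">
<!-- prev/next hints: no longer an indexing signal, but harmless -->
<link rel="prev" href="https://example.com/shoes?page=1">
<link rel="next" href="https://example.com/shoes?page=3">
```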
If a domain has 1,000 categories, each with 20 pages, the result is 20,000 indexed URLs, instead of the 1,000 we would have with the old methods. That used to be considered bad.
Is there a consensus on how to proceed today? I ask because I have seen some high-performing domains that have not adapted to these changes and still use the old methods.