How can I filter a long list of sites out of Google search results?

I have a list of sites that I want to permanently exclude from Google search results. This is the big family of sites that plagiarize (and sometimes mangle with automatic translation) Q&A content from the StackExchange network, such as qaru.site. I've created a custom search engine in Chrome with a URL like https://google.com/search?q=%s+-site:site1+-site:site2+-...+-site:siteN, with about 30 sites on the blacklist. Naturally, after putting another site on the blacklist, Google began to complain:

"site: qano.site" (and the following words) was ignored because we limit queries to 32 words.

So some of those blacklisted sites are not actually blacklisted.

I've tried using Chromium extensions to filter search results, but these usually work at a higher level, filtering only the results page, often leaving me with an almost empty page instead of presenting a more relevant results page the way the -site:domain.com token in the search query does.

So, how can I maintain a permanent blacklist for a potentially large set of sites? I would rather not have to log in to Google for this.
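
For illustration, a minimal Python sketch of how such a filtered search URL can be generated from a blacklist, assuming (as the error above suggests) that each -site: exclusion counts as one word toward the 32-word limit; the domain names and query are placeholders:

from urllib.parse import quote_plus

# Placeholder blacklist of scraper/mirror domains.
blacklist = ["site1.example", "site2.example", "site3.example"]

def build_search_url(query, excluded_sites, word_limit=32):
    # Each query word and each -site: exclusion counts toward Google's limit.
    used = len(query.split()) + len(excluded_sites)
    if used > word_limit:
        print(f"Warning: {used} words; everything past {word_limit} is ignored.")
    terms = query.split() + [f"-site:{s}" for s in excluded_sites]
    return "https://google.com/search?q=" + quote_plus(" ".join(terms))

print(build_search_url("some search terms", blacklist))

With roughly 30 exclusions, only a couple of words are left for the actual query, which is why the warning appears almost immediately.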

Sell – Cost Effective Fb Ads Agency – Working Together Long Term

Seller's notes
HOW IT WORKS


No headaches: the services offered on the site are fully outsourced. The provider handles all customer requests, and all of the provider's details are included in the sale of this website, including backup providers.

Anyone can run this business: no Facebook advertising experience, no ad-optimization experience, no sales experience needed – no problem. Manage your own Facebook advertising agency in this multi-billion dollar niche, covering the creation and optimization of Facebook advertising campaigns.

All the companies in the world will have to use Facebook ads to stay competitive. Facebook will keep advertising costs down to allow this phenomenon to reach its full potential!

What happens when a sale takes place?

You will receive an email and payment via PayPal immediately when a customer places an order on your new website. You then place the same order with the supplier to perform the required service.

You provide the supplier with the required information, which you already have from your customer since they entered it in the order form. From there, your provider takes over and delivers the service quickly. You can then put your customer on recurring billing via PayPal and make the same margin every month, hands-free.

Package 1: COST $50 – SELL $149 – PROFIT $99

Package 2: COST $80 – SELL $299 – PROFIT $219

Package 3: COST $130 – SELL $399 – PROFIT $269

Very reliable supplier with proven experience, including backup providers.

This site runs on autopilot if you have reasonable communication skills. No telephone contact with customers is required; all you have to do is exchange emails, with no Skype calls.

Do not forget that you will receive repeat orders from the same customers as well as referrals. As a result, income from 10 clients per week could quickly double from US$3,999 to US$8,000 per week, depending on repeat and referral activity.

Why are Facebook ads relevant in the current market?

Facebook has more than 2 billion active users a month!

These are affluent, ready-to-buy leads that can be targeted very accurately. Facebook puts literally millions of customers at your fingertips instantly and lets you focus on your ideal customers for optimal results.

An effective Facebook advertising campaign targets prospects based on their age, gender, location, interests, publications and pages they liked, and more.

This valuable data makes it possible to target ads at an "ideal" customer before they even know it!

A forest full of fruit is within reach of a smart advertising campaign on Facebook.

Digital marketing will ultimately rely exclusively on Facebook ads as the only way to accurately target customers.

How does it work, again?

1. You buy this reseller site:

2. You promote it with the simple marketing plan I provide (which works).

3. You get inquiries; they are easy to manage because all the information required from the potential customer is captured on the website's order form.

4. The customer is impressed … then completes a purchase via PayPal, and you immediately receive the funds in your PayPal account, e.g. $399 for Package 3.

5. You then email your supplier with the order details your customer has filled in.

6. The supplier then delivers the project – a fully managed Facebook advertising campaign – directly to your customer.

7. The campaign makes money for your customer, who continues the service after being placed on automatic billing via PayPal.

More importantly, your provider delivers this service quickly and efficiently … and your customer will be blown away by their return on investment. The provider also sets up a free Facebook page for your customer if they do not already have one. You can use this as a hook for new customers.

Best of all, you can put your customer on recurring billing via PayPal and make the same margin each month, hands-free!

This site is a cash cow for anyone with reasonable communication skills and the ability to follow a simple marketing plan to get …

So what should you do after buying this site?

You generate traffic to your site by following the marketing plan included in the sale. The marketing element of this plan is the latest strategy used by marketers to quickly get traffic that converts into sales. You will have access to all of this. And YOU DO NOT NEED TO USE YOUR PERSONAL SOCIAL MEDIA ACCOUNTS.

Get your VIRAL site up and running and you could serve more than 100 customers a month, hands-free. Your provider manages everything.

All you have to do is focus on the traffic generated by your site. There is an advanced plan for this included in the sale – and it works.

Operate from anywhere in the world.

What do you get when you buy this site?

You obtain:

Premium Domain Name

1 year of hosting

Premium design

Facebook ads made for you with personalized audiences

Marketing Plan

Social Media Marketing Guide

Operational Guide

Supplier contacts

Email Hosting

8 – paragraphs_update_8018 takes a very long time to complete

The paragraphs_update_8018 update is time-consuming; so far it has already exceeded 1h40m and is still running.

My question is: is it possible to run this update without maintenance mode? I do not want to keep the site down too long for this update.

information architecture – Short or long paths

Jakob Nielsen, in his 1999 article URL as UI, emphasizes the importance of user-friendly and hackable URLs. The 2005 and 2007 updates mention eye-tracking studies suggesting that users pay close attention to URLs.

Another NNGroup article, Navigation: You Are Here, states that:

Well-chosen, human-readable web addresses are important for sharing, credibility, recognition and recall. The web address of a page can be used to reveal part of the information architecture in order to contextualize the content.

I do not know how often your users will see the full URL instead of the link text, but I do not think it is advantageous to make the links less meaningful.

For me personally, readable links are important; I place more trust in links that carry information scent. In addition, when browsing interlinked documentation or a wiki, I often hover the mouse over a link to display the URL in the lower-left corner, to see whether I have already visited the site or already opened the link. In this scenario, a human-readable URL is also very useful.

How long does it take to verify?

I put in the DNS TXT record and it still says that I have to verify ???
How can I do it correctly?
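
As a sanity check, here is a minimal sketch, assuming the third-party dnspython package and using a placeholder domain and token, to confirm whether the TXT record is actually visible to resolvers yet; propagation can take anywhere from a few minutes up to the zone's TTL:

import dns.resolver  # third-party package: dnspython

domain = "example.com"               # placeholder: your domain
expected = "verification-token-123"  # placeholder: the TXT value you added

# Query the public DNS for TXT records and look for the expected value.
answers = dns.resolver.resolve(domain, "TXT")
found = any(expected in rdata.to_text() for rdata in answers)
print("TXT record visible to resolvers:", found)

If the record is visible here but the service still refuses to verify, the problem is usually where the record was added (wrong zone or subdomain) rather than timing.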

Should long paragraphs be justified or just left-aligned?

For better readability, which would be better for long paragraphs: justified or left-aligned?

Reason for asking: justified text gives better visual appeal, but many tutorials and learning sites use left alignment. Why?

Is there an algorithm to compress an array of strings, represented as pointers into a long string, by producing a compressed version of the long string?

In a program I am writing, I represent an array of strings as one long string, with pointers pointing to different substrings to represent my array. For example:

str_array = struct string_array
   long_str = "abcdefab"
   pointer_array = ((start = 0, len = 3), (start = 3, len = 3), (start = 6, len = 2))
end

So str_array = ("abc", "def", "ab"), but notice that I can actually compress the long string by removing the "ab" at the end. For example:

str_array2 = struct string_array
   long_str = "abcdef"
   pointer_array = ((start = 0, len = 3), (start = 3, len = 3), (start = 0, len = 2))
end

and note that str_array2 is also ("abc", "def", "ab") === str_array.

What is this type of compression called in computer science? I assume there is already literature on this type of algorithm?
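
For concreteness, a minimal Python sketch of the structure above, with a naive greedy builder (the helper name is hypothetical) that reuses an existing occurrence of each string before appending it; it only catches exact repeats, not partial overlaps:

def build_string_array(strings):
    # Pack strings into one long string plus (start, len) pointers,
    # reusing an existing occurrence of each string when possible.
    long_str = ""
    pointers = []
    for s in strings:
        pos = long_str.find(s)   # reuse if s already occurs somewhere
        if pos == -1:
            pos = len(long_str)
            long_str += s        # otherwise append at the end
        pointers.append((pos, len(s)))
    return long_str, pointers

print(build_string_array(["abc", "def", "ab"]))
# -> ('abcdef', [(0, 3), (3, 3), (0, 2)]), matching str_array2 above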

bitcoin core – Resolving a long BTC mempool chain (lots of unconfirmed transactions)

I have started building a Bitcoin exchange service, creating raw transactions and broadcasting them. The problem I have to deal with is the mempool chain getting too long (more than 25 unconfirmed transactions), for example:
64: chain too long

Raising limitancestorcount and limitdescendantcount only solves this on the local node; these late txs (the transactions after the first 25) have to wait a long time to be rebroadcast, and meanwhile show up like this:
Transaction not found in blockchain

Here is my testnet address; it has a lot of unconfirmed txs still waiting. If I search on another explorer, I cannot see the 45 unconfirmed txs, and the same goes for getrawtransaction on the node, where the response is No such mempool or blockchain transaction:

https://live.blockcypher.com/btc-testnet/address/mxHqrQBWuCndNaubTYUbcEVzeNPsT34TP6/

So, how can I avoid this problem? I can think of several methods, such as: splitting the main address into a few sub-addresses, which splits the main UTXO into several UTXOs; creating many-to-many transactions (currently 1-to-2: one target and one changeAddress for the change); or spending UTXOs with high confirmation counts first, … Which options would give better results and performance for long-term scaling? Any advice is really appreciated. And a last question: is there a way to track how much of the 25-transaction chain a UTXO has already used (for example, I spend UTXO X first, its change comes back, then Y is spent, Z comes back, so X's chain has been used 2 times)? I have no information to calculate this, because the UTXO data the node returns only contains the txId.
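
To illustrate the 25-ancestor limit mentioned above, here is a minimal sketch assuming a local testnet node with bitcoin-cli on the PATH (the txid is a placeholder); it checks a transaction's unconfirmed ancestor count before chaining another spend on top of it:

import json
import subprocess

def mempool_ancestor_count(txid):
    # getmempoolentry reports, among other fields, the number of
    # unconfirmed ancestors the transaction has in the local mempool.
    out = subprocess.check_output(
        ["bitcoin-cli", "-testnet", "getmempoolentry", txid])
    return json.loads(out)["ancestorcount"]

txid = "<unconfirmed parent txid>"  # placeholder
if mempool_ancestor_count(txid) >= 25:
    print("Chaining another spend would hit the default ancestor limit;")
    print("wait for a confirmation or spend a confirmed UTXO instead.")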

usability – Onboarding with many user-profile fields – long and detailed vs short and essential?

So, imagine a dating app; the essentials are the following: gender, location, age, name, photo.

Then say you have optional fields that are very useful, such as "looking for" – their purpose in using the application.

And then, let's say you have 20 other fields that are interesting but not required: religion, politics, languages, diet, relationship type / status, etc.

If you force the user to fill these in during onboarding, the onboarding becomes rather long and painful. If you keep it brief with just the essentials, you will get many minimal profiles, e.g. people who just upload a photo, do not complete their bio, and leave it at that.

What do you suggest?

18.04 – Long list of Hit and Get lines from sudo apt update

Why are there so many repeated fetches of similar package lists during sudo apt update? Is this normal? Previously, I had noticed only 5 to 7 Hit or Get lines in total. Recently, I noticed that this list seemed to be getting longer.

Hit:1 http://archive.ubuntu.com/ubuntu bionic InRelease                                                    
Hit:2 http://archive.canonical.com/ubuntu bionic InRelease                                                 
Get:4 http://archive.ubuntu.com/ubuntu bionic-updates InRelease (88.7 kB)                                                                                               
Get:7 http://archive.ubuntu.com/ubuntu bionic-backports InRelease (74.6 kB)     
Get:8 http://archive.ubuntu.com/ubuntu bionic-security InRelease (88.7 kB)     
Get:9 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 Packages (728 kB)
Get:10 http://archive.ubuntu.com/ubuntu bionic-updates/main i386 Packages (580 kB)
Get:11 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 DEP-11 Metadata (285 kB)
Get:12 http://archive.ubuntu.com/ubuntu bionic-updates/main DEP-11 48x48 Icons (70.9 kB)                                                                                                                          
Get:13 http://archive.ubuntu.com/ubuntu bionic-updates/main DEP-11 64x64 Icons (140 kB)                                                                                                                           
Get:14 http://archive.ubuntu.com/ubuntu bionic-updates/universe amd64 DEP-11 Metadata (253 kB)                                                                                                                    
Get:15 http://archive.ubuntu.com/ubuntu bionic-updates/universe DEP-11 48x48 Icons (209 kB)                                                                                                                       
Get:16 http://archive.ubuntu.com/ubuntu bionic-updates/universe DEP-11 64x64 Icons (452 kB)                                                                                                                       
Get:17 http://archive.ubuntu.com/ubuntu bionic-updates/multiverse amd64 DEP-11 Metadata (2,468 B)                                                                                                                 
Get:18 http://archive.ubuntu.com/ubuntu bionic-backports/universe amd64 DEP-11 Metadata (7,924 B)                                                                                                                 
Get:19 http://archive.ubuntu.com/ubuntu bionic-security/main i386 Packages (368 kB)                                                                                                                               
Get:20 http://archive.ubuntu.com/ubuntu bionic-security/main amd64 Packages (502 kB)                                                                                                                              
Get:21 http://archive.ubuntu.com/ubuntu bionic-security/main Translation-en (170 kB)                                                                                                                              
Get:22 http://archive.ubuntu.com/ubuntu bionic-security/main amd64 DEP-11 Metadata (22.6 kB)                                                                                                                      
Get:23 http://archive.ubuntu.com/ubuntu bionic-security/main DEP-11 48x48 Icons (10.4 kB)                                                                                                                         
Get:24 http://archive.ubuntu.com/ubuntu bionic-security/main DEP-11 64x64 Icons (31.7 kB)                                                                                                                         
Get:25 http://archive.ubuntu.com/ubuntu bionic-security/restricted amd64 Packages (6,600 B)                                                                                                                       
Get:26 http://archive.ubuntu.com/ubuntu bionic-security/restricted Translation-en (2,840 B)                                                                                                                       
Get:27 http://archive.ubuntu.com/ubuntu bionic-security/universe i386 Packages (590 kB)                                                                                                                           
Get:28 http://archive.ubuntu.com/ubuntu bionic-security/universe amd64 Packages (604 kB)                                                                                                                          
Get:29 http://archive.ubuntu.com/ubuntu bionic-security/universe Translation-en (201 kB)                                                                                                                          
Get:30 http://archive.ubuntu.com/ubuntu bionic-security/universe amd64 DEP-11 Metadata (42.1 kB)                                                                                                                  
Get:31 http://archive.ubuntu.com/ubuntu bionic-security/universe DEP-11 64x64 Icons (116 kB)                                                                                                                      
Get:32 http://archive.ubuntu.com/ubuntu bionic-security/multiverse amd64 DEP-11 Metadata (2,464 B)                                                                                                                
Fetched 5,652 kB in 23s (245 kB/s)                                                                                                                                                                                
Reading package lists... Done