## at.algebraic topology – obstruction cocycle for non-simple spaces using local coefficients

This question is similar to an earlier one, but I was hoping for a concrete theorem statement about the obstruction cocycle for non-simple spaces.

I hope for a theorem like this:

Let $A \subset X$ be such that $\pi_1(A) = \pi_1(X)$, and let $f : A \to Y$ be a map. Regard $\pi_n(Y)$ as a $\mathbb{Z}[\pi_1(A)]$-module, where $\pi_1(A)$ acts via $f_*$ and the usual action of $\pi_1(Y)$. Assume that $H^*(X, A; \pi_n(Y)) = 0$ for every $n \in \mathbb{N}$. Then there is an extension of $f$ to all of $X$. More generally, if $\pi_1(X) = \pi_1(A)/N$ with $N \subset \ker f_*$, and if $H^*(X, A; \pi_n(Y)) = 0$ with $\pi_n(Y)$ regarded as a $\pi_1(A)/N$-module, then there is an extension.

The slightly awkward generalization comes from my desire to prove the universal property of Quillen's plus construction as presented in these notes, Prop. 1.1.2.

A naive guess on my part would be to apply the usual obstruction theorem to the universal cover of $Y$; then, using local coefficients, we can ensure that $f : A \to \tilde{Y}$ is a lift of $f : A \to Y$, but I'm not completely sure.
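For context, here is the shape of the classical obstruction-theoretic statement I am trying to adapt (my own paraphrase of standard obstruction theory with local coefficients, not a quoted theorem):

```latex
% For a relative CW pair (X, A) and a map f defined on the n-skeleton,
% the obstruction to extending f over the (n+1)-skeleton is a cocycle
% whose class lives in cohomology with local coefficients:
\[
  \omega^{n+1}(f) \;\in\; H^{n+1}\bigl(X, A;\, \pi_n(Y)\bigr),
\]
% where \pi_n(Y) carries the \pi_1-action described above; the class
% vanishes iff f can be redefined on the n-skeleton (rel the
% (n-1)-skeleton) so as to extend over the (n+1)-skeleton.
```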

## I am selling my BLOGGER site for adults (porn videos with local subtitles) | NewProxyLists

Hi;

I am selling my adult site, which I have run for years and from which I have earned thousands of dollars.

The site shares videos with local-language subtitles.

There are hundreds of subtitled videos awaiting publication. You just have to post these videos every day.

I give the archive where the videos shared in the past are stored.

I give my cloud email account with 1 TB storage area. Lifetime storage.
https://ibb.co/1Mh3KS2

I give social media accounts matching the name of the site. Users access the site from these.

pinterest:
Tumblr:
VK:

If you wish, I give the advertising accounts with which I work.

Some of the payments I have received:

Clickaine: https://ibb.co/3zFFgJn

Openload: (unfortunately it is closed)

Exoclick: https://ibb.co/m6WwYCV

Ouo.io: https://ibb.co/Y8545j8

Vidoza:
https://ibb.co/QX8B5vc
https://ibb.co/KFgKtWj
https://ibb.co/54Y09Lk

I provide technical support. If you are interested, please PM me.

## Local version of "White Balance" in GIMP to clean up the photo of the document

I have a photograph of a document with black text on a white background.

The photo has some problems:

• The text is a bit blurry.
• Overall, there is noise, even in the white areas.
• The background does not look really white but slightly yellowish.
• Most importantly: some areas of the image, even areas that should be white, are darker than others.

I would like to clean this photo. I'm on Linux.

The "Colors> Auto> White Balance" filter in GIMP gives promising results. However, it does not match the background in different areas of the image.

But if I select only a sub-area of the image, the "White Balance" filter actually works better within that area.

So I imagine that applying "White Balance" progressively over local areas of the image would work really well.

I imagine that the "White Balance" filter is implemented like this:

1. Collect color statistics of the entire image.
2. Create a color conversion matrix and apply it globally.

So what I would rather like:

1. Collect local color statistics per area, e.g. per 100 × 100 px tile.
2. Create a local color conversion matrix per tile.
3. Interpolate these into a continuous per-pixel function of conversion matrices.
4. Apply the result locally, per pixel.
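The steps above can be sketched directly, tiles first (a minimal sketch assuming NumPy and a float RGB image in [0, 255]; the `tile` size and the 95th-percentile white estimate are my own illustrative choices, and the interpolation of step 3 is left piecewise-constant here):

```python
import numpy as np

def local_white_balance(img, tile=100):
    """Per-tile white balance: divide each pixel by a local white estimate.

    img:  (H, W, 3) float array with values in [0, 255]
    tile: side length of the local statistics window (illustrative)
    """
    h, w, _ = img.shape
    out = img.astype(float)
    gains = np.ones_like(out)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            block = out[y:y + tile, x:x + tile]
            # Step 1: local statistics; the 95th percentile per channel
            # approximates the local "white" while ignoring outliers.
            white = np.percentile(block.reshape(-1, 3), 95, axis=0)
            # Step 2: the local "conversion matrix" is just a diagonal
            # gain here; guard against division by zero.
            gains[y:y + tile, x:x + tile] = 255.0 / np.maximum(white, 1.0)
    # Steps 3-4: a real implementation would interpolate the gain map
    # smoothly between tiles; this sketch applies it piecewise-constant.
    return np.clip(out * gains, 0, 255)
```

Something similar could presumably be scripted inside GIMP via Script-Fu over selections, but the sketch above is the core idea.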

Do you know of anything like that, either in GIMP or as standalone software, for example for the Linux command line?

## New directory of local businesses

For the UK only

Local UK business directory for suggesting / viewing businesses, owners can submit the following information –

• Company logo
• Description of the activity
• Location (region / county)
• Working hours
• Contact number
• Company website (optional / nofollow attribute used)

Site visitors can –

• Search / browse regions / counties in categories.
• Share detailed company profile pages …


## ssh – local IP not accessible by a computer – all others succeed

I have a strange problem and after days of research, I have not been able to solve it.

I have a network of 4 computers and 2 mobile devices. The problem is between a computer running kubuntu 18.04 (and Windows) (the "Client") and my NAS (Synology) ("the Server").

Both machines have static IP addresses, both have the same subnet mask. I don't use any DNS and connect computers using static IP addresses from my network.

On Linux, running `arp -a`, the client (192.168.0.108) can see the server:

```
_gateway (192.168.0.1) at ac:22:05:64:77:76 (ether) on wlp3s0
? (192.168.0.136) at 04:d6:aa:97:07:60 (ether) on wlp3s0
? (192.168.0.52) at 00:11:32:20:f1:31 (ether) on wlp3s0
```

If I try to ping, ssh, or connect via NFS to the server over the local network, however, all connections fail (they time out).

I have configured my server so that ssh is accessible from the Internet via a public IP address. However, if I ssh from my client to the server's public IP, I can connect.

I can ping the client from the server on the local network, so this works the other way around.

The client runs Windows 10 as a second operating system. On Windows, some of the shared folders are mounted via SMB. Sometimes this works and sometimes it doesn't, and I can't figure out the reason.

All other machines on my network can reach the server and all of the services running on it without problems.

I have a feeling that it must be related to my client's settings. But since it happens (always) on Linux and (sometimes) on Windows, it might even be a hardware issue; perhaps some setting for my network card in the BIOS (MSI Tomahawk B450), but that is just an uneducated guess.

Does any of you have any idea where I could start looking?

Thanks, any help would be greatly appreciated.

## python – How to publish a local website on a specific server?

I have a local website that currently works perfectly, using Plotly Dash and a Flask server. It may be terrible practice, but I have just one .py file (1000 lines of code) that contains all of my HTML and Python code. It runs inside a virtual environment. What I need to do is publish this site by uploading all the necessary files to my school's server.

There is a prescribed process for doing this: my first step is to open my project in Adobe Dreamweaver, then upload the working files to the server. I created a new site in Dreamweaver and connected it to the school server. My problem is that Dreamweaver seems to expect an HTML file, and all I have is my .py file, plus the libraries inside my virtual environment that are needed to run it.

I lack experience with web development, so this is probably obvious, but I'm here to ask more experienced web developers what to do. Should I split my HTML/CSS into a separate file that my Python file points to, or is it possible to upload everything I have directly to the server?

I will detail my directory further:

```
C:\Users\Josh\Documents\dash_app\Scripts
```

Inside Scripts there are these files:

I am using the app.py file to create my entire website, and it takes its data from the Excel file named data1.

## Local SEO – JSON-LD for schema.org markup – what to include and where

I am trying to take advantage of JSON-LD for Schema.org markup over microdata, as it seems much easier to implement and maintain. Additionally, I've heard that consumers like Google and Bing now prefer it. However, since JSON-LD markup is not directly tied to the visible source content, it presents a possible problem: it is now much easier to add whatever you want to your markup, which makes me wonder whether this could be considered spam or duplicate content.

For example, take my code sample:

JSON-LD markup:

```
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "address": {
    "@type": "PostalAddress",
    "postalCode": "12345",
    "streetAddress": "123 any st."
  },
  "description": "Detailed description of the Company",
  "name": "Company Name",
  "openingHours": "Mo-Fr 08:30-17:00",
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": "42.000",
    "longitude": "-81.000"
  },
  "logo": "https://www.company-name.com/logo.png",
  "image": "https://www.company-name.com/logo.png",
  "url": "https://www.company-name.com",
  "telephone": "(xxx) xxx-xxxx",
  "faxNumber": "(xxx) xxx-xxxx",
  "foundingDate": "1900",
  "priceRange": "$$",
  "email": "info@company-name.com",
  "currenciesAccepted": "USD",
  "paymentAccepted": "Cash, Credit Card, ACH, Debit",
  "sameAs": ["https://www.facebook.com/company-name/"]
}
```

The markup above seems to satisfy the Google markup validator for a `LocalBusiness`, but the page it is attached to may contain only a few of these pieces of information.
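As an aside, one cheap local sanity check, independent of any validator, is that the block parses as JSON at all. A sketch using only Python's standard library, with the same placeholder values as the example:

```python
import json

# Placeholder LocalBusiness markup mirroring the example in this post.
markup = """
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Company Name",
  "url": "https://www.company-name.com",
  "sameAs": ["https://www.facebook.com/company-name/"]
}
"""

data = json.loads(markup)  # raises json.JSONDecodeError if malformed
assert data["@type"] == "LocalBusiness"
print(sorted(data))  # lists which properties the block actually declares
```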

My question is therefore in a few parts:

1. Would it be better to include only the markup that actually appears in the source code, or should I provide all the bits of markup information that consumers like Google and Bing want?

2. If the answer to question 1 is to include only the markup that appears in the source code, what about elements that cannot be included in the source code, such as the `geo` element? Would it be better not to include them in the markup at all?

3. Is it necessary to include the same markup on every page, or are consumers smart enough to understand that if the `LocalBusiness` type has been filled in on one page of a domain, it applies to the whole domain?

4. If the answer to question 3 is to include the `LocalBusiness` markup on only one page of the site, is there a best practice for which page to place it on? I.e. home page, contact-us page... any SEO advantage?

## ssh – How can I upload a file from a local server to a remote server via another remote server?

Here's the thing: I have two remote servers, `remote1` and `remote2`. I can only access `remote2` using the private key on `remote1`; locally, I access `remote1` by entering a password. Now I want to upload a local file to `remote2`. I know I could upload the file to `remote1` first and then copy it from `remote1` to `remote2` through `scp`, but I found ssh tunneling, which did not work for me on the first attempt.

I use this command: `ssh -l localhost:8022:remote2:22 user1@remote1`, but then it shows me
`localhost:8022:remote2:22@remote1's password:`
After entering the password for `remote1`, it shows
`Permission denied, please try again.`

Could someone help me find out where the problem is, thank you very much!

Environment: local Windows 10; remote1 and remote2 are Linux.

## rdma – Role of the local routing header (LRH) in RoCEv2

In the RoCEv2 specification (Annex 17), the BTH+ headers include the local routing header (LRH). Routing, however, is done with IP. Is the LRH useful?

Are the source and destination local IDs used somewhere or are they set to 0?

## What is the most efficient and reliable method currently for storing the Bitcoin blockchain in a local database?

I have already seen several articles on this subject and I have tried all the suggested solutions, but I cannot make any of them work.

I'm interested in analyzing transactions with the OP_RETURN opcode for a project, and I need to dump the entire Bitcoin blockchain into a database to start working on it. I have already installed Bitcoin Core and downloaded all of the data up to today.

There is a list of tools available to do this, and I have had problems with each of them:

• WebBTC: Files do not appear to be available
• BitcoinDataBaseGenerator: This seems to be the most promising solution, since it is easy to set up on Windows and an MS SQL Server database is easy to work with. However, after file `blk005**.dat` it crashed, saying there were blocks with an unknown version. I also cannot rerun it, because the queries that delete the data from that file never finish and time out after about 2 hours. Looking at the GitHub issues, other people seem to be having problems too, and the repository is no longer maintained.
• Blockparser + SQL: This only keeps the data in memory.
• BitcoinABE: I haven't tried this one yet. From what I have read it is very slow, and there are many outstanding issues. It could work, though; I will try it later.
• NBitcoin.Indexer: Designed for Azure, however, I want to have a local database.
• Blockchain2Graph: Also looks promising and under active development. It uses Neo4j, which is fine; at the moment I have no preference between SQL and graph databases. I installed a Debian virtual machine and followed all the steps, which worked. In the end, however, I can't seem to bring up the app in the browser. As I understand it, it uses RPC calls rather than the blk*.dat files directly.

Then I found one more on Google:

• Bitcoin to Neo4j: This takes the block files directly and loads them into a Neo4j database. The problem is that the database is apparently six times the size of the actual Bitcoin data, and I don't have 1.7 terabytes of free storage available. The author also mentions that it takes several weeks to process everything; I just don't have time to wait for that.

So my question is whether there are other tools currently available to easily load the Bitcoin blk*.dat files into a local database. I may have done something wrong with the solutions mentioned above, but none of them seems ideal. Would it be easy to write a custom application that extracts only the data that interests me, and are there code templates for that?
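On the last point: a custom extractor that only cares about OP_RETURN does not need to model the whole chain. As an illustrative sketch (my own, handling only the common single-push script forms; the surrounding blk*.dat block and transaction framing would still have to be parsed separately):

```python
def op_return_payload(script: bytes):
    """Return the data pushed after OP_RETURN, or None if the script
    is not a data-carrier output.

    Only the common forms are handled: OP_RETURN followed by either a
    direct push (length byte 1..75) or OP_PUSHDATA1. This is a sketch,
    not a complete Bitcoin script parser.
    """
    OP_RETURN, OP_PUSHDATA1 = 0x6A, 0x4C
    if not script or script[0] != OP_RETURN:
        return None
    if len(script) == 1:          # bare OP_RETURN, no payload
        return b""
    op = script[1]
    if 1 <= op <= 75:             # direct push: opcode byte is the length
        return script[2:2 + op]
    if op == OP_PUSHDATA1 and len(script) >= 3:
        return script[3:3 + script[2]]
    return None


# Example: a script pushing the ASCII payload "hello world"
script = bytes([0x6A, 0x0B]) + b"hello world"
print(op_return_payload(script))  # b'hello world'
```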