Oakley selling WebPros? [split from large cPanel thread]

Quote Originally Posted by chrismfz
See the article

Revenue (and market value) increased virtually right after the purchase of cPanel, thanks to the new pricing model.

Probably with reduced expenses as well (judging by what I have heard about the drop in quality of technical support, especially for WHMCS, after cPanel's price rise).

I guess DirectAdmin received a very good offer in the meantime but refused to sell out.

So now the goal is perhaps to make the books look good and sell everything off for a big profit.

The new owners could either reduce prices (or change the pricing model) and invest more in staff, or simply drain the profits to cover the cost of the purchase and leave the product(s) to whatever fate awaits them.
… or propose something completely different, or do something "between" the two extremes. We will see.

The fact is that the current owners can most likely sell it even at a low price (relative to the expected profits as they stand) and still make a fortune (relative to what they paid for it).

autofocus – What is the advantage of a large number of AF points?

With my Pentax K10D, which has only 11 AF points, I find that when I'm tracking a small subject, such as a distant bird, the bird can fall into a gap between the AF points, or outside the edge of the AF pattern entirely. As a result, the AF system "hunts" around for something to focus on, and I can no longer even see the bird to keep tracking it.

A large number of AF points allows them to be packed closely together, so that a moving target passes smoothly from one point to the next without falling into a gap.

(You could also solve the "gap" problem by making each AF point sensitive over a larger area, but that would make it harder to tell exactly what the system is focusing on: you might think it is focusing on the eyes of a portrait subject when it is actually focusing on the nose.)

ag.algebraic geometry – Constructing a very ample line bundle on a projective bundle

Let $X$ be a smooth complex projective variety and let $p: Y \to X$ be a smooth $\mathbb{CP}^k$-bundle (that is, locally trivial in the analytic topology). Suppose that there is a line bundle $L$ on $Y$ which restricts to $\mathcal{O}(1)$ on each $\mathbb{CP}^k$-fiber.

Question. Is it true that there is a very ample line bundle $L'$ on $X$ such that $p^*L' \otimes L$ is very ample on $Y$?

I think this should follow from Kodaira vanishing, but I cannot prove it at the moment.

What would make a single photo from my phone's camera have large magenta-stained areas?

[attached photo showing the magenta areas]

What would cause the pink cast in this image? All the photos taken before and after it were normal. No filter and no flash were used.

Directory-based music player for a large collection of unlabeled music

I have a large collection of music files that are only partially tagged, but carefully organized in directories, and I am looking for a music player for this scenario. I do not want the player to "scan" my collection and sort it by artist or the like; I just want to browse to a directory and play all the files it contains (but not those in subdirectories).

And I do not want to create playlists for all my directories (how would I even manage them?). I am not looking for a workaround, but for a music player dedicated to this scenario. Is there such an application?

dg.differential geometry – Large class of curves that pairwise intersect in only finitely many points

I'm trying to find a large class of piecewise-differentiable plane curves of finite length (subsets of $\mathbb{R}^2$) with the following property:

For any pair of curves $\gamma_1, \gamma_2$ in this class, their images $\Gamma_1, \Gamma_2$ are such that $\Gamma_1 \cap \Gamma_2$ has finitely many connected components.

I have tried to prove that this is the case for the following class of curves:

piecewise-smooth curves of finite length in which each smooth piece is either a line segment or a curve whose derivative is injective,

but I have not been able to produce either a proof or a counterexample to the claim that this class has the desired property.

Could anyone suggest how to prove this, or explain why they believe it might be false? If it is false, would some additional restriction produce the desired property?

Obviously, we only need to consider one piecewise component at a time, and it is easy to show that no such curve can intersect a line segment in infinitely many points (using the injectivity of the derivative). And of course, two line segments can only intersect in at most a single point or in a whole subsegment. I have failed to produce a proof for the case where both components are curved. I suspect that an infinite number of intersection points should force the derivative of one of the curves to fail to be injective at some point, but I have not been able to show it.

postgresql – Normalization of a large existing table

I have a big table. Each row has an identifier, some columns pertaining specifically to that identifier, a short (DNA) sequence, and some columns pertaining to that particular sequence. If those values have not yet been calculated (for that row), they are null, but the calculated values will always be identical for a given sequence.

If this turns out to be relevant: the sequence column is indexed and the DBMS is Postgres.

There are a lot of duplicate sequences. Obviously, this is not optimal, both because we do not want to store duplicates and because we do not want to waste time recalculating these properties. There are already properties that have been calculated in duplicate.

I want to move the sequence properties into a new table, using the sequence as a foreign key. The problem here is the size of the table: hundreds of millions of records, and the properties are also quite large.

With a small table, it would be easy enough, but I need a better strategy for a huge table.
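
For what it is worth, here is a minimal sketch of the straightforward version of that migration; big_table, seq, prop_a and prop_b are placeholder names, not taken from the question, and at hundreds of millions of rows the INSERT would likely need to be run in chunks (for example by ranges of the identifier) rather than as a single statement:

-- Rough sketch only; adjust names and types to the real schema.
-- 1. One row per distinct sequence, preferring rows whose properties
--    have already been computed (non-null values sort first).
CREATE TABLE sequence_properties (
    seq    text PRIMARY KEY,
    prop_a numeric,
    prop_b numeric
);

INSERT INTO sequence_properties (seq, prop_a, prop_b)
SELECT DISTINCT ON (seq) seq, prop_a, prop_b
FROM big_table
ORDER BY seq, prop_a NULLS LAST;

-- 2. Link the big table to the new one, then drop the duplicated columns.
--    NOT VALID followed by VALIDATE keeps the exclusive lock short.
ALTER TABLE big_table
    ADD CONSTRAINT big_table_seq_fkey
    FOREIGN KEY (seq) REFERENCES sequence_properties (seq) NOT VALID;
ALTER TABLE big_table VALIDATE CONSTRAINT big_table_seq_fkey;
ALTER TABLE big_table DROP COLUMN prop_a, DROP COLUMN prop_b;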

mysql – ALTER TABLE on a large table consumes all disk space

I'm using MySQL 8 on Ubuntu 19. I have a 543 GB table and I need to add a new column and a new index. I started with the new column:

alter table hugeTable add column newCol tinyint(1) after existingColumn;

After a few hours of running, I received an error, roughly:

... table 'hugeTable' is full

I looked around and saw nothing wrong; I still had about 294 GB of free disk space. Looking at the error log, it became clear that the operation generates a #sql temporary file that keeps growing. I started it again to get an idea of how fast it grows, and within an hour it exceeded 35 GB:

-rw-r----- 1 mysql mysql  35G Nov 19 21:53 '#sql-ib1124-819861495.ibd'

and it was still growing quickly.

So the real question is: is there a way around this? I have a backup of this table, so I can afford to risk turning off safety features and running the change without undo logs, #sql temporary files, and so on. Or should I instead use sed to modify the backup dump and reload it?
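
One thing that may be worth checking, depending on the exact 8.0 minor version: dropping the AFTER clause can let InnoDB add the column as a metadata-only change, which avoids rewriting the whole 543 GB table into a #sql temporary file. Instant ADD COLUMN exists from 8.0.12, but before 8.0.29 the new column has to go at the end of the row. A hedged sketch (the index name and indexed column are assumptions):

-- Add the column without a table rebuild (fails if the server cannot do it instantly).
ALTER TABLE hugeTable ADD COLUMN newCol TINYINT(1), ALGORITHM=INSTANT;

-- Build the secondary index online; this avoids a full table copy, though it still
-- needs temporary sort space (innodb_tmpdir can point that at a larger disk).
ALTER TABLE hugeTable ADD INDEX idx_newCol (newCol), ALGORITHM=INPLACE, LOCK=NONE;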

How to rotate a large number of images, each with a specific rotation value listed in a corresponding CSV file

I have a large batch of images (thousands) and a corresponding CSV file containing the image file names and a rotation value for each image.

I need a way to apply each rotation to the corresponding image, quickly and in bulk.

Can someone help with that?

Cheers,

Mast
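
A minimal sketch of one way to do this with Python and Pillow, assuming the CSV has header columns named filename and degrees and that the angles are counterclockwise degrees; those names are placeholders, so adjust them to match the actual file:

# rotate_images.py - apply a per-image rotation listed in a CSV file.
import csv
from PIL import Image  # pip install pillow

with open("rotations.csv", newline="") as f:
    for row in csv.DictReader(f):
        path = row["filename"]
        # Pillow rotates counterclockwise; negate the value if the CSV stores clockwise angles.
        degrees = float(row["degrees"])
        img = Image.open(path)
        # expand=True enlarges the canvas so rotated corners are not cropped.
        img.rotate(degrees, expand=True).save(path)

Overwriting in place re-encodes JPEGs, so it may be safer to write to a separate output directory first.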

Hard disk write speed drops from 150 MB/s to 2 MB/s when copying large files

When I copy 600 GB of large video files from hard drive F to hard drive G, once about 100 GB has been copied, the write speed of drive G drops from 150 MB/s to 1-2 MB/s. When I cancel the copy, drive G stays at 100% activity for a long time because it keeps writing the cached data (about 2.6 GB) at 1 MB/s.

When the cache flush is complete and drive G's activity reaches 0%, I try to copy the files again, but the write cache fills up within a few seconds and the drive runs at 1 to 2 MB/s almost from the start. So effectively, once the drive bogs down, it stays slow even after the cached writes have finished. The only way to fix the slowdown is to restart the PC. After a restart, the system works properly until I copy large files again, which triggers the same slowdown.

I ran the same copy experiment with another hard drive (L) and the same problem occurs.

I have tried swapping SATA cables and plugging the drives into different SATA ports, but that does not help.

The motherboard has 6 SATA ports, 2x SATA3 and 4x SATA2. All ports are occupied by drives. Drives F, G, and L are plugged into SATA2 ports. During the copy experiments, no other intensive processes are running.

Is this a problem with the motherboard, the RAM, or the hard drives?

PC: i7 (LGA1155), ASRock Pro4, 24 GB RAM, Seagate 4 TB drives.