Buying small, medium and large hosting companies!

I am looking to buy a hosting site or two with active clients. It does not matter, big or small. Please PM me the details, including the … | Read the rest of http://www.webhostingtalk.com/showthread.php?t=1748631&goto=newpost

Help with weird behaviour of Map[Take…] on large lists

Let's say I create a giant list:

``````tmp=RandomReal[{1,10},600000000];
``````

Now I want to take 10 random sublists of length 150000:

``````rand = RandomInteger[{1, 600000000 - 149999}, 10];
First@AbsoluteTiming@Map[Take[tmp, {#, # + 149999}] &, rand]
(*0.007*)
``````

This works fine and is speedy. However, if I want to extract 100 sublists of 150000, I get:

``````rand = RandomInteger[{1, 600000000 - 149999}, 100];
First@AbsoluteTiming@Map[Take[tmp, {#, # + 149999}] &, rand]
(*87.707*)
``````

An increase of several orders of magnitude in time! If I use `Do[]` instead of `Map[]` I can speed it up substantially:

``````foo = Table[{}, 100];
First@AbsoluteTiming@
Do[foo[[i]]=Take[tmp, {rand[[i]], rand[[i]] + 149999}], {i, 1, 100}]
(*0.05*)
``````

Does anyone know what could be causing this?

Max

Windows 10 – Software to transfer files from a large hard drive to multiple smaller hard drives / flash?

Software suggestions for copying files from a large disk to multiple smaller disks, preserving all file attributes, such as the creation / modification date?

I'm looking for something that fills a drive, then asks for another destination to continue copying the remaining files, and so on until all the files are copied.

FastCopy almost works, except that it does not allow changing the destination without resetting what has already been copied (I may be wrong about that, however).
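As a fallback, the sequential "fill this drive, then move to the next" behavior is simple enough to script. A minimal Python sketch (the function names and the plan/copy split are my own; note that `shutil.copy2` preserves modification times, but Windows creation dates generally need extra tooling):

```python
import os
import shutil

def plan_spill(files, dest_capacities):
    """Assign files to destinations sequentially: fill the current
    destination until the next file no longer fits, then move on.
    `files` is a list of (path, size_in_bytes); `dest_capacities` is a
    list of (dest_dir, free_bytes). Returns {dest_dir: [path, ...]}."""
    plan = {dest: [] for dest, _ in dest_capacities}
    caps = list(dest_capacities)
    i = 0
    for path, size in files:
        while i < len(caps) and size > caps[i][1]:
            i += 1  # current destination is full; switch to the next one
        if i == len(caps):
            raise RuntimeError('not enough total space on the destinations')
        dest, free = caps[i]
        plan[dest].append(path)
        caps[i] = (dest, free - size)
    return plan

def run_plan(plan):
    """Execute the plan; copy2 keeps timestamps and permission bits."""
    for dest, paths in plan.items():
        os.makedirs(dest, exist_ok=True)
        for path in paths:
            shutil.copy2(path, dest)
```

For example, `plan_spill([('a.mkv', 3), ('b.mkv', 4), ('c.mkv', 2)], [('d1', 5), ('d2', 10)])` puts `a.mkv` on `d1` and the other two files on `d2`, mirroring the "fill, then switch" behavior described above.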

python – Testing a class that returns a large dictionary

I have a simple class with a public `build` method that I want to test. Currently, I assert every value it returns in each test. Is this good practice, or should I write one test for the static values and, in the other tests, check only the values that change according to the inputs?

Implementation

``````class FiltersAttachment:
    TYPE_OPTIONS = [
        {"text": "All types", "value": "all"},
    ]

    STATUS_OPTIONS = [
        {"text": "Available / Unavailable", "value": "all"},
        {"text": ":white_circle: Available", "value": "available"},
        {"text": ":red_circle: Unavailable", "value": "unavailable"}
    ]

    @classmethod
    def _filter_options(cls, options, selected):
        return list(filter(lambda t: t['value'] == selected, options))

    @classmethod
    def build(cls, check_type='', status=''):
        return {
            'fallback': 'Filters',
            'callback_id': 'resource_filters',
            'color': '#d2dde1',
            'mrkdwn_in': ['text'],
            'actions': [
                {
                    'name': 'resource_type',
                    'text': 'Type',
                    'type': 'select',
                    'options': cls.TYPE_OPTIONS,
                    'selected_options': cls._filter_options(
                        cls.TYPE_OPTIONS, check_type)
                },
                {
                    'name': 'resource_status',
                    'text': 'Status',
                    'type': 'select',
                    'options': cls.STATUS_OPTIONS,
                    'selected_options': cls._filter_options(
                        cls.STATUS_OPTIONS, status)
                }
            ]
        }
``````

Tests

``````class TestFiltersAttachment(TestCase):
    def assert_attachment(self, attachment):
        self.assertEqual(attachment['fallback'], 'Filters')
        self.assertEqual(attachment['callback_id'], 'resource_filters')
        self.assertEqual(attachment['color'], '#d2dde1')
        self.assertEqual(attachment['mrkdwn_in'], ['text'])

        type_action = attachment['actions'][0]

        self.assertEqual(type_action['name'], 'resource_type')
        self.assertEqual(type_action['text'], 'Type')
        self.assertEqual(type_action['type'], 'select')
        self.assertEqual(type_action['options'][0]['text'], 'All types')
        self.assertEqual(type_action['options'][0]['value'], 'all')
        self.assertEqual(type_action['options'][1]['value'], 'webpages')

        status_action = attachment['actions'][1]

        self.assertEqual(status_action['name'], 'resource_status')
        self.assertEqual(status_action['text'], 'Status')
        self.assertEqual(status_action['type'], 'select')
        self.assertEqual(status_action['options'][0]['text'], 'Available / Unavailable')
        self.assertEqual(status_action['options'][0]['value'], 'all')
        self.assertEqual(status_action['options'][1]['text'], ':white_circle: Available')
        self.assertEqual(status_action['options'][1]['value'], 'available')
        self.assertEqual(status_action['options'][2]['text'], ':red_circle: Unavailable')
        self.assertEqual(status_action['options'][2]['value'], 'unavailable')

    def test_all_type_selected(self):
        attachment = FiltersAttachment.build(check_type='all')
        self.assert_attachment(attachment)

        selected_type = attachment['actions'][0]['selected_options'][0]

        self.assertEqual(selected_type['text'], 'All types')
        self.assertEqual(selected_type['value'], 'all')

    def test_all_status_selected(self):
        attachment = FiltersAttachment.build(status='all')
        self.assert_attachment(attachment)

        selected_status = attachment['actions'][1]['selected_options'][0]

        self.assertEqual(selected_status['text'], 'Available / Unavailable')
        self.assertEqual(selected_status['value'], 'all')
    ...
``````
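One common answer to the question asked here is to assert the static structure in exactly one test and parameterize only the input-dependent assertions, e.g. with `unittest`'s `subTest`. A minimal sketch of that pattern (the `build` stub below is a hypothetical stand-in, not the class above, just to keep the example self-contained):

```python
import unittest

def build(status=''):
    """Hypothetical stand-in for a FiltersAttachment-style build method."""
    options = [
        {'text': 'Available / Unavailable', 'value': 'all'},
        {'text': ':white_circle: Available', 'value': 'available'},
    ]
    return {
        'fallback': 'Filters',
        'selected_options': [o for o in options if o['value'] == status],
    }

class TestBuild(unittest.TestCase):
    def test_static_values(self):
        # the static part of the dictionary is asserted in exactly one test
        self.assertEqual(build()['fallback'], 'Filters')

    def test_selected_option_follows_input(self):
        # only the input-dependent values are asserted, one subTest per case
        cases = {
            'all': 'Available / Unavailable',
            'available': ':white_circle: Available',
        }
        for status, expected_text in cases.items():
            with self.subTest(status=status):
                selected = build(status=status)['selected_options'][0]
                self.assertEqual(selected['text'], expected_text)
```

This keeps each test focused on one behavior, and a change to a static value breaks exactly one test instead of all of them.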

Trying to reuse my old Dell Optiplex 745 as a media server. Ubuntu 16.04.5 LTS is loaded, and the four expansion drives are set up with SnapRAID and Samba. I'm trying to transfer movie and TV files to it from my new computer running Windows 10. The server crashes on every third to eighth file sent, with varying messages: some begin with "BUG: unable to handle kernel NULL pointer dereference at 00000000000001", some with "general protection fault: 0000 [#1] SMP". Sometimes the server PC restarts spontaneously. The files are between 1 and 5 GB and transfer very quickly (about 100 MB/s) when they work.

I've tried it with the Windows firewall enabled and disabled. I've tried it with 8 GB of memory, and with 2 DIMMs removed to get 4 GB. I've tried FTP using FileZilla on Windows, and dragging and dropping with the file manager. None of this changes anything. I have dmesg logs from 3 of the crashes, which I can provide if you can tell me how to attach them; they seem too long to paste here. I would appreciate it if someone could help.


Hard Disk – How can I create the large disks in Azure that are currently in public preview?

Azure began supporting very large disk sizes in September 2018 (as described here) as a public preview, currently available only in the `West Central US` region.

I'm trying to create a new `S60` (4-8 TiB HDD) disk in a resource group located in the region above. However, the disk creation dialog still limits me to a disk size of at most 4095 GB, even though it tells me that the larger sizes are currently in public preview.

How can I create the disks described in the preview?

dynamic programming – is MCTS an appropriate method for this problem size (large action / state space)?

I'm doing research on a decision problem over time with $$t = 1, \dots, 40$$ periods. At every time step $$t$$, the (single) agent must choose an action $$a(t) \in A(t)$$ while in state $$s(t) \in S(t)$$. The chosen action $$a(t)$$ in state $$s(t)$$ affects the transition to the next state $$s(t+1)$$. It is therefore a Markov decision problem with a finite horizon.

In my case, the following holds: $$A(t) = A$$ and $$S(t) = S$$, where the size of $$A$$ is $$6 \cdot 10^{6}$$ and the size of $$S$$ is $$10^{8}$$. In addition, the transition function is stochastic.

Since I am relatively new to the theory of Monte Carlo tree search (MCTS), I am wondering: is MCTS an appropriate method for my problem (due in part to the large sizes of $$A$$ and $$S$$, and the stochastic transition function)?

I've already read many articles on MCTS (for example, on progressive widening and double progressive widening, which seem very promising), but perhaps someone can share their experience applying MCTS to similar problems, or point me to methods suited to a large state/action space and a stochastic transition function.
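For reference, the action-side idea behind progressive widening is compact enough to sketch. Below is a minimal, illustrative Python sketch on a toy MDP (all names here — `Node`, `simulate`, `mcts`, `step` — are my own, not from any library), not a production implementation: a node only admits a new sampled action while `len(children) < k * N**alpha`, so the huge action set is never enumerated.

```python
import math
import random

class Node:
    """One tree node; children are keyed by action, then by sampled next
    state (a common way to handle stochastic transitions in MCTS)."""
    def __init__(self):
        self.N = 0          # visit count of this node
        self.children = {}  # action -> {next_state: Node}
        self.Q = {}         # action -> running mean of returns
        self.Na = {}        # action -> visit count of that action

def simulate(node, state, step, actions, depth, c=1.4, k=1.0, alpha=0.5):
    """One MCTS iteration from `node`; `step(state, action)` returns
    (next_state, reward) and may be stochastic."""
    if depth == 0:
        return 0.0
    node.N += 1
    # Progressive widening: with |A| ~ 6e6 you cannot enumerate A, so only
    # admit a newly sampled action while the widening condition allows it.
    if len(node.children) < k * node.N ** alpha:
        a = random.choice(actions)  # in practice: sample from the real A
        node.children.setdefault(a, {})
        node.Q.setdefault(a, 0.0)
        node.Na.setdefault(a, 0)
    # UCB1 selection over the admitted actions only
    a = max(node.children, key=lambda b: node.Q[b]
            + c * math.sqrt(math.log(node.N) / (node.Na[b] + 1)))
    next_state, reward = step(state, a)
    child = node.children[a].setdefault(next_state, Node())
    ret = reward + simulate(child, next_state, step, actions,
                            depth - 1, c, k, alpha)
    node.Na[a] += 1
    node.Q[a] += (ret - node.Q[a]) / node.Na[a]  # incremental mean
    return ret

def mcts(root_state, step, actions, horizon, iters=500):
    root = Node()
    for _ in range(iters):
        simulate(root, root_state, step, actions, horizon)
    return max(root.Na, key=root.Na.get)  # most-visited root action
```

Double progressive widening additionally caps the number of sampled next states per action, which matters when $$S$$ is as large as $$10^{8}$$; whether any of this works well ultimately depends on how informative the reward signal along a rollout is.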


postgresql – Creating an index on a large column of text in postgres

We are talking about PostgreSQL 9.5 on Windows, with a UTF8-encoded database and "English_United States.1252" (the default) as LC_COLLATE and LC_CTYPE.

The column is of type `text` and is used for `LIKE '%zzz%'` queries (searching log messages).

Trying to create a btree index on the column

``````CREATE INDEX tbl_col_btree_idx ON tbl USING btree (col);
``````

results:

``````ERROR: index row requires 8456 bytes, maximum size is 8191
``````

I've tried following the suggestion here: https://stackoverflow.com/questions/1566717/postgresql-like-query-performance-variations/13452528#13452528 and creating a gin/gist trigram index.

The attempt to create a gist index

``````CREATE INDEX tbl_col_gist_trgm_idx ON tbl USING gist (col gist_trgm_ops);
``````

also failed with a maximum size problem.

The attempt to create a gin index

``````CREATE INDEX tbl_col_gin_trgm_idx ON tbl USING gin (col gin_trgm_ops);
``````

failed with the error:

``````ERROR: invalid multibyte character for locale
HINT: The server's LC_CTYPE locale is probably incompatible with the database encoding.
``````

This seems too complicated for a plain, ordinary need to run text searches.
I just want to improve the performance of `LIKE '%zzz%'` queries. Is there a "simple" way to do it?