numpy – Get rid of slow loops in Python

I have created a small Python script that generates test sets for my project.

The script generates two datasets with the same dimensions n*m. One contains binary (0/1) values and the other contains floats.

import random

import numpy as np
import pandas as pd

n, m = 100, 100  # dataset dimensions (values assumed here so the example runs)


def get_10_20():
    # Assumed helper: uniform draw from [10, 20)
    return random.uniform(10, 20)


def get_20_30():
    # Assumed helper: uniform draw from [20, 30)
    return random.uniform(20, 30)


# Probabilities must sum to 1
AMOUNT1 = {0.6: get_10_20,
           0.4: get_20_30}

AMOUNT2 = {0.4: get_10_20,
           0.6: get_20_30}

OUTCOMES = (AMOUNT1, AMOUNT2)


def pick_random(prob_dict):
    '''
    Given a dictionary mapping probabilities to callables,
    call one of the callables chosen with its probability.
    '''
    r, s = random.random(), 0
    for prob in prob_dict:
        s += prob
        if s >= r:
            return prob_dict[prob]()  # dict lookup with [], not ()


def compute_trade_amount(action):
    '''
    Pick an amount at random, with probabilities that depend on the action.
    '''
    return pick_random(OUTCOMES[action])  # tuple indexing, not a call


ACTIONS = pd.DataFrame(np.random.randint(2, size=(n, m)))
AMOUNTS = ACTIONS.applymap(compute_trade_amount)  # was CLIENT_ACTIONS, which is undefined

The script runs correctly and generates the output I need, but if I want to scale it to many dimensions, the for loop in pick_random() slows down my computation.
How can I get rid of it? Maybe with some kind of array comprehension using numpy?

What trips up my reasoning is the if statement, because the sampling has to happen with a given probability.
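
For what it's worth, the per-cell work can be done entirely in numpy. A minimal sketch, assuming get_10_20 and get_20_30 draw uniformly from [10, 20) and [20, 30): decide each cell's range with one uniform draw against a per-cell probability, then fill both ranges in bulk and select with np.where:

import numpy as np

n, m = 1000, 1000
rng = np.random.default_rng()

# 0/1 action per cell, as in ACTIONS above
actions = rng.integers(2, size=(n, m))

# P(draw from the 10-20 range) per cell: 0.6 where action == 0, 0.4 where action == 1
p_low = np.where(actions == 0, 0.6, 0.4)

# One vectorized uniform draw replaces the loop (and the if) in pick_random()
use_low = rng.random((n, m)) < p_low

amounts = np.where(use_low,
                   rng.uniform(10, 20, size=(n, m)),
                   rng.uniform(20, 30, size=(n, m)))

This draws 2*n*m candidate values and discards half of them, trading a little extra memory for removing the Python-level loop entirely.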

Nikon – Slow shutter speed in aperture mode

You said "all night", which suggests the scene is too dark for your settings. The ways to get a faster shutter speed are:

  1. Move to a brighter area with enough light, where the automation can do better. Photography is difficult without enough light. One way to provide more light is to use a flash.

  2. Open the aperture and/or increase the ISO so that a faster shutter speed can work. If you are seeing a shutter speed of around 1 second now, it seems you need at least 5 or 6 more stops – still slow, but maybe enough (see the worked example after this list).

  3. You can use S or M mode to set a faster shutter speed directly. The automation will then raise the Auto ISO and/or open the aperture further, if there is still room for it to open.
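
A rough worked example, assuming the 1 second reading above: each stop halves the shutter time, so 6 stops faster gives 1 s × 2⁻⁶ = 1/64 s, roughly the camera's 1/60 s setting – fast enough to hand-hold in many situations.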

Frankly, what you really need to learn about is camera exposure. In Google searches this is often called the exposure triangle, which is not a great name (there is no triangle, just the three factors), but it is an extremely important idea for knowing anything about using the camera. It is how shutter speed, aperture, and ISO combine to produce an exposure, and in particular which settings you need for a given situation, such as stopping motion or increasing depth of field. You can find plenty on this subject on Google to learn about exposure. It is the first thing a photographer has to learn.

Slow January Sales

Is January always this slow? In November and December we made 20 to 25 new registrations per month. This month we have made 3. Has anyone else … | Read the rest of https://www.webhostingtalk.com/showthread.php?t=1794463&goto=newpost

seo – How can I get rid of these outgoing links that lower my YSlow score?

I am trying to make my website faster, and GTmetrix tells me that, among other things, these links lower my YSlow score because of insufficient browser caching. They are all third-party links, and although I have disabled and uninstalled the plugins they come from, they continue to slow down my site, since I cannot set an expiry on them given their external nature. I have looked through cPanel and every index.php, and removed them from Google Tag Manager, but they still persist. What should I do? Where can I find them, or how can I give them a longer expiry? One of them I need to delete outright, because it blocks the first load of my site. When I look in the Chrome element inspector, the links appear inside a script in the page's index.

Here are the links ("leverage browser caching for the following cacheable resources"):

https://serve.albacross.com/track.js (expiration not specified)
https://js.hs-scripts.com/4992870.js (1 minute)
https://js.hs-analytics.net/analytics/1579265100000/4992870.js (5 minutes)
https://www.google.com/recaptcha/api.js?render=6LeTjcEUAAAAAGHEgVExfcfx9p8ABN9Lck5wv9wa&ver=3.0 (5 minutes)
https://www.google.com/recaptcha/api2/webworker.js?hl=en&v=A1Aard-wURuGsXRGA7JMOqVO (5 minutes)
https://js.hsadspixel.net/fb.js (10 minutes)
https://www.googletagmanager.com/gtm.js?id=GTM-KDR5T9R (15 minutes)
https://www.google-analytics.com/plugins/ua/linkid.js (1 hour)
https://www.google-analytics.com/analytics.js (2 hours)
https://snap.licdn.com/li.lms-analytics/insight.min.js (8 hours 47 minutes)
https://www.linkedinbranding.es/cdn-cgi/scripts/5c5dd728/cloudflare-static/email-decode.min.js (2 days)

I have my .htaccess configured so that all files have an expiry of at least 1 month (I think the lowest expiration was 2 weeks), but I cannot reach these links.

sql – Slow execution of a subquery

This query runs very slowly – can it be improved?

We have an Access database split into front end / back end (FE/BE), with about 50 users. The BE sits in a network folder and the FE on users' hard drives.

Data entered in the FE is stored in FE tables until the user has finished entering everything required for a record. They then click a button to send the data to the BE in one go, into tables identical to those in the FE. In the SQL below, the BE table is suffixed with '_Share'.

The table contains two keys: QuoteID and OptionID. There is a one-to-many relationship between them, for example:

QuoteID   OptionID

1234      1
1234      2

3333      1
3333      2
3333      3

As they work, users create data for new options to accompany existing quotes. The code checks whether a QuoteID on the BE already has the OptionID the user created on the FE; if not, the data for that OptionID is appended to the BE.

INSERT INTO T_Option_Category_Benefits_Share
SELECT T_Option_Category_Benefits.*
FROM T_Option_Category_Benefits
WHERE (((T_Option_Category_Benefits.QuoteID)=1971)
AND ((T_Option_Category_Benefits.OptionID)
NOT IN (SELECT T_Option_Category_Benefits_Share.OptionID FROM T_Option_Category_Benefits_Share WHERE T_Option_Category_Benefits_Share.QuoteID=1971)));

The key fields are not indexed. The BE table has 18 columns and approximately 100,000 rows. The network is generally quite slow at peak times. We are using Office 365 on Windows 10.
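
For comparison, a common rewrite for this pattern – a sketch only, not the original query, and untested against Access – replaces the NOT IN subquery with a frustrated LEFT JOIN, which the Access/JET engine can usually satisfy efficiently if indexes are added on QuoteID and OptionID:

INSERT INTO T_Option_Category_Benefits_Share
SELECT b.*
FROM T_Option_Category_Benefits AS b
LEFT JOIN T_Option_Category_Benefits_Share AS s
    ON (b.QuoteID = s.QuoteID) AND (b.OptionID = s.OptionID)
WHERE b.QuoteID = 1971
AND s.OptionID IS NULL;

The IS NULL test keeps only the FE rows with no matching BE row, which is the same set the NOT IN version selects.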

design – How to keep user input consistent with the input assumed by slow back-end calculations?

Context:

I am writing a clinical trial simulator. The user defines future trial options, for example a trial with 100 placebo patients, 200 treated patients, an "optimistic" outcome scenario, and so on. There may be 1 to 20,000 of these options. For each option, 10,000 to 100,000 trial results are simulated. The simulated data is then used for analyses, per assumed option.

Implementation:

It is an Angular/Electron desktop application (in the future there will be a web extension). The front end sends a REST API request with the trial options. The Python back end runs the simulations in parallel and stores the results in a PostgreSQL database, one row per option. The database sits on the user's laptop or on external storage. A single user works on a given simulation.

Performance:

Simulations can take a long time and their results require a lot of memory. Therefore, once the user has added trial options, I simulate only those additions rather than every option. I also allow interrupting a simulation in progress.

Data consistency:

The inputs the user sees – the trial options – must coincide with the inputs of the last recorded simulations (and of the subsequent analyses). I therefore always want to inform the user of any difference between the front end and the database, and analyses are disabled while such a difference exists.

Question:

How can I guarantee this consistency? For example, the front end could track three sets of trial options: (1) for completed simulations, (2) for running simulations, and (3) for current user inputs with no simulation launched. But that seems fragile. Also, I am not comfortable using the front end, rather than the back end and the database, as the source of truth. Does this front-end approach make sense, should I go with the back end, or what else would work better? (One possible mechanism is sketched below.)
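
For illustration, a minimal sketch of one back-end-centric mechanism (the function name and option structure are hypothetical, not from the question): store a canonical fingerprint of the option set alongside the simulation results, and enable analyses only when the fingerprint of the current front-end inputs matches the stored one.

import hashlib
import json


def options_fingerprint(options):
    # Canonical JSON -> stable hash; assumes the trial options are
    # JSON-serializable dicts (hypothetical structure).
    canonical = json.dumps(options, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


# Hypothetical usage: the back end records the fingerprint with each
# simulation run; before enabling analyses, it compares that record
# against the fingerprint of whatever the front end currently holds.
options = [{"placebo": 100, "treated": 200, "scenario": "optimistic"}]
stored = options_fingerprint(options)           # saved with the results
assert stored == options_fingerprint(options)   # unchanged inputs -> analyses allowed

This keeps the database as the single source of truth: the front end never tracks the three option sets itself, it only asks whether its current inputs match what was actually simulated.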

Note: the question "Architecture of predictive modeling software" deals with similar software, albeit with stable user input and a focus on performance. Beyond that, I have not found much relevant information on SE.

IT architecture – Which of these devices could slow down the processor?

I have a test question.

Which devices inside the processor are used to speed up work indirectly
(i.e. the program does not execute code for that device)?

Possible answers: DRAM | Cache | Pipeline | GPU | RAM | ARM | Battery | FPU

I think we can immediately say that DRAM, GPU and RAM are bad choices, because they are not inside the processor – they are separate parts of the computer.
The battery is likewise elsewhere in the machine, not in the CPU.
The remaining answers would therefore be cache, pipeline, ARM and FPU?
I am also unsure about the floating-point unit.

optimization – What is the best way to host a site with many images without it being too slow?

I have a friend who wants me to help with their WordPress site, which contains around 150 images in its gallery. The site is quite slow even after optimizing the images with the ImageOptim app. They use SiteGround as their web host, but only on the StartUp plan because it is the most affordable. I know the site is slow to load because of the images, but I don't know the best approach to speed it up without paying for a more expensive hosting plan.

I think maybe the best way would be to just put the images on Google Images and use that as the gallery instead, or maybe an Instagram feed plugin like Smash Balloon.

What would you all suggest?

Thank you

Slow WHMCS support? | Web Hosting Talk

Has anyone else seen slow support from WHMCS recently? After upgrading to 7.9, their upgrade script deleted a database table, which broke our installation, and in 24 hours we had only two responses, neither informative nor useful.

I guess this is a sign of the times, with them being picked up by WebPros and then resold again? It is frustrating when you are paying for a maintenance contract and the support is far from helpful or responsive.

acegen – Slow calculations with Mathematica 12.0 + Ace 7.006

I upgraded Mathematica to version 12.0 and Ace to version 7.006. I am surprised that calculations now take much longer than with the old Ace version 6.804. For example, using the built-in example:

AceGen -> Help -> AceFEM Manual -> AceFEM Examples -> Cyclic tension test, advanced post-processing, animations

I have the following total absolute times:

Ace 7.006: 122.9 s

Ace 6.804: 14.3 s

Can anyone explain the reason? The calculations are obviously run on the same computer with the same software; the only difference is the Ace version. At the moment I cannot compute anything advanced – it hangs at the first step.