Why does the average Liberal continue to believe story after story that has turned out to be fake? You would think they would have learned after being duped so many times.

You are stupid idiots who ignore reality.

The reality of the situation... does the nation have any meaning? Has the unity of the people become stronger or weaker since this so-called "nationalist" took office?

Once you have accepted the truth about him, you can see how these stories, even if they are not true, reflect his character and that of his associates. Guilt by association.

It's like the police asking me: if you do not sell drugs, why do you hang around drug traffickers?

Moms have always said: if a teacher calls home for any reason, I will not believe you. Give them no reason to call. Your excuses will not be accepted.

I hold Trump to the same standard.

Average TTFB (Time To First Byte) for a shared reseller?

Hi all,

My host has recently moved all resellers to its new high performance cloud infrastructure. At least that's how it was announced to us.

Since then, some of my most serious clients have complained that their websites' Time To First Byte (TTFB) was well over 10 seconds when they tested their sites with GTmetrix.

This ultimately harms their SEO because, as we know, Google penalizes slow sites. (Google it.)

So, when I checked on my side, they were absolutely right. The TTFB was atrocious! I immediately contacted my host to report it, and they quickly started troubleshooting.

Since I reported it, they have worked closely with their vendor to make configuration changes, migrate hypervisors, restart the cloud infrastructure, resynchronize disks, and so on.

All this work has improved the TTFB to about 3 seconds, but I STILL feel that it is not good at all.

I've compared this to other hosts where I also have accounts, and their TTFB is less than 200 ms.

I've researched this, and everything I've read indicates that an optimal TTFB should be 200 ms or below, which is what I see with my other hosting providers. (Again, Google it.)
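
For anyone who wants a quick second opinion outside GTmetrix, the sketch below times how long the response headers take to arrive, which roughly corresponds to TTFB. The hostnames are placeholders, and DNS, TLS, and network latency are all included, so treat the numbers as ballpark figures only.

import http.client
import time

def measure_ttfb(host, path="/", timeout=30):
    """Time from sending the request until the status line and headers arrive."""
    conn = http.client.HTTPSConnection(host, timeout=timeout)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()    # returns once the first response bytes are in
    ttfb = time.perf_counter() - start
    resp.read()                  # drain the body before closing the connection
    conn.close()
    return ttfb

for host in ("example.com", "example.org"):    # placeholders: use your own sites here
    print(host, round(measure_ttfb(host) * 1000), "ms")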

I sent follow-up emails to my host and I have not heard from them yet. It's been almost 3 weeks now and the problem persists.

Needless to say, my clients are not happy at all and they plan to leave. How can I stop this? I can't really, because in my opinion they have a valid reason.

My question is: what is your average TTFB on your reseller host? I would like to get an idea of what others are experiencing and what the average TTFB should be on shared reseller servers.

Thank you

SQL Server – more efficient calculation of moving average over 3 periods

I have a table of numerical answers to survey questions. The goal is to call a stored procedure, with a @PeriodType parameter ('Quarter', 'Month' or 'Year'), that returns this output:

Year    Period    Score    WeightedAverage
2019    1         85.7     85.7
2019    2         87.6     85.9
2019    3         90.5     88.6
...    ...        ....     ....  

Couple of notes:

  • WeightedAverage should be a 3-period moving average, but NOT simply the average of the current period's average and the previous two. It must take every individual response into account, so that a period with 1,000 responses has a greater effect than a period with 50 responses. Window functions such as LEAD/LAG or AVG() OVER() do not seem to be the solution (see the sketch after this list for the calculation I'm after).
  • The setup is shown below, and the #answers table contains approximately 302,000 rows. The last query takes 35 seconds. I would like this optimized as much as possible.
  • The scan of #answers (alias c) creates an eager index spool and then a lazy table spool in the execution plan. Estimated rows 34; estimated executions 2,705; actual rows 160,773,768; actual executions 33,037 (yikes). There must be a faster way that my feeble brain cannot work out.
  • Running SQL Server 2017 Enterprise, but compatibility level 100 (SQL 2008)
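
To pin down the arithmetic being asked for (independently of the eventual T-SQL), here is a small pandas sketch of that weighted 3-period average, with illustrative column names and toy data: for each period, take the SUM and COUNT of all individual answers in that period and the two before it, then divide, so a 1,000-response period naturally outweighs a 50-response one.

import pandas as pd

# One row per individual survey response (toy data; dateID is the period id, assumed contiguous).
answers = pd.DataFrame({
    "dateID": [1, 1, 2, 2, 2, 3, 3],
    "Answer": [8.5, 8.6, 8.8, 8.7, 8.8, 9.1, 9.0],
})

# Pre-aggregate each period once: total of answers and number of answers.
per_period = answers.groupby("dateID")["Answer"].agg(total="sum", n="count")
per_period["Score"] = 10 * per_period["total"] / per_period["n"]

# Weighted 3-period average = sum of answers over the last 3 periods / count over the same window.
window = per_period[["total", "n"]].rolling(window=3, min_periods=1).sum()
per_period["WeightedAverage"] = 10 * window["total"] / window["n"]

print(per_period[["Score", "WeightedAverage"]].round(1))

The same shape should carry over to T-SQL: aggregate SUM(Answer) and COUNT(*) per dateID first, then combine each period with the two preceding per-period totals, which avoids joining the 302,000 raw rows to themselves and should get rid of the spools.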

CREATE TABLE #answers (
    ResponseID int,
    QuestionID int,
    Answer float,
    CompletedDate datetime,
    dateID int
)

-- the code to insert 302,000 answers would go here

-- create a table of all possible dates (there are no gaps), with an identity for each
SELECT id = IDENTITY(int, 1, 1),
       [year] = DATEPART(year, CompletedDate),
       period = CASE @PeriodType WHEN 'Month' THEN DATEPART(month, CompletedDate) WHEN 'Quarter' THEN DATEPART(quarter, CompletedDate) ELSE NULL END
INTO #dates
FROM #answers
GROUP BY DATEPART(year, CompletedDate), CASE @PeriodType WHEN 'Month' THEN DATEPART(month, CompletedDate) WHEN 'Quarter' THEN DATEPART(quarter, CompletedDate) ELSE NULL END
ORDER BY DATEPART(year, CompletedDate), CASE @PeriodType WHEN 'Month' THEN DATEPART(month, CompletedDate) WHEN 'Quarter' THEN DATEPART(quarter, CompletedDate) ELSE NULL END

-- update the answer table with the assigned dateID
UPDATE a SET dateID = b.id
FROM #answers a
INNER JOIN #dates b ON DATEPART(year, a.CompletedDate) = b.[year]
    AND ((CASE @PeriodType WHEN 'Month' THEN DATEPART(month, a.CompletedDate) WHEN 'Quarter' THEN DATEPART(quarter, a.CompletedDate) ELSE NULL END) = b.period
         OR (b.period IS NULL AND @PeriodType NOT IN ('Month', 'Quarter')))

-- join the answers to themselves based on (dateID - 2) for a 3-period score
SELECT #dates.[year], #dates.[period], internal.score, internal.weightedaverage
FROM #dates
LEFT OUTER JOIN (
    SELECT b.[year], b.period,
           score = CAST(10 * AVG(a.Answer) AS decimal(10, 1)),
           weightedaverage = CAST(10 * AVG(c.Answer) AS decimal(10, 1))
    FROM #answers a
    INNER JOIN #dates b ON a.dateID = b.id
    INNER JOIN #answers c ON c.dateID >= a.dateID - 2 AND c.dateID <= a.dateID
    GROUP BY b.[year], b.period
) internal ON #dates.[year] = internal.[year] AND #dates.[period] = internal.[period]
ORDER BY #dates.[year], #dates.[period]

My apologies for the formatting; I could not see how to indent the T-SQL and make it display cleanly. Thank you for your help.

[ Politics ] Open question: Is Briscoe Cain an average Republican without self-control?

python – Average π: Archimedes versus Gauss – computing π by generalized means

I wrote this simplified code to calculate Pi for educational / demonstration purposes.

These methods are based on generalized means: see a presentation on Pi and the AGM.

The Archimedean method gives linear convergence, meaning you gain roughly two more bits of precision per iteration.

The Gaussian method gives quadratic convergence, meaning the number of correct digits roughly doubles each iteration.

One can wrap these methods in timers or print intermediate results to see their convergence (a small timing sketch follows the code below).

The method of Archimedes could have been used to refute several millennia of false claims about Pi. Alas, history.

import decimal


def pi_arc():
    """Archimedes c. ~230 B.C.E."""
    a, b = D(3).sqrt() / D(6), D(1) / D(3)
    pi = 0
    while True:                      # iterate until the value stops changing at the working precision
        an = (a + b) / 2
        b = (an * b).sqrt()
        a = an
        piold = pi
        pi = 2 / (a + b)
        if pi == piold:
            break
    return D(str(pi)[:-3])           # drop the guard digits added in __main__


def pi_agm():
    """Gauss AGM Method c. ~1800 A.D. """
    a, b, t = 1, D(0.5).sqrt(), 1 / D(2)
    p, pi, k = 2, 0, 0
    while True:                      # iterate until the value stops changing at the working precision
        an = (a + b) / 2
        b = (a * b).sqrt()
        t -= p * (a - an)**2
        a, p = an, 2**(k + 2)
        piold = pi
        pi = ((a + b)**2) / (2 * t)
        k += 1
        if pi == piold:
            break
    return D(str(pi)[:-3])           # drop the guard digits added in __main__


if __name__ == "__main__":
    prec = int(input('Precision for Pi: '))
    # a few extra guard digits to absorb rounding error
    decimal.getcontext().prec = prec + 3
    D = decimal.Decimal
    print(pi_arc())
    print(pi_agm())
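
To actually see the convergence gap described above, a small timing sketch like the following can be pasted into the same module (it assumes pi_arc, pi_agm and the decimal setup above are available; timings will of course vary by machine).

import timeit

D = decimal.Decimal   # the shorthand both functions rely on

def compare(digits=1000, repeats=3):
    """Time both methods at the same precision and report the best of a few runs."""
    decimal.getcontext().prec = digits + 3
    for fn in (pi_arc, pi_agm):
        best = min(timeit.repeat(fn, number=1, repeat=repeats))
        print(f"{fn.__name__}: {best:.3f} s for {digits} digits")

# compare()   # pi_agm should need far fewer iterations than pi_arc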

How to get the average value for each second in MySQL?

I wonder how to get the average of the data for every second.
My dataset has millisecond timestamps, and I can output the raw millisecond rows with:

SELECT * FROM some_table 
where RealTime > '2019-09-10 23:00:00' LIMIT 200;

Below is the data format:

AcX       RealTime
-15836   2019-09-05 15:02:37.502
-16666   2019-09-05 15:02:37.508
-17645   2019-09-05 15:02:37.512
...

But how can I get the average of the data for every second? Thanks!
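
Not a MySQL answer as such, but here is a pandas sketch of the per-second average being asked for, using the AcX/RealTime columns from the post (the CSV file name is a placeholder for however the rows get exported or fetched): each millisecond timestamp is truncated to its second and the readings within that second are averaged. In MySQL itself the same idea applies, namely grouping rows by RealTime truncated to one-second resolution and taking AVG(AcX).

import pandas as pd

rows = pd.read_csv("accel_export.csv", parse_dates=["RealTime"])

# Bucket the millisecond timestamps into whole seconds and average AcX within each bucket.
per_second = (
    rows.set_index("RealTime")["AcX"]
        .resample("1s")
        .mean()
        .dropna()
)
print(per_second.head())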

calculus – special condition in the average cost function

It is written here that the absolute minimum of an average cost function will occur at its critical point, when $\overline{C}'(x) = 0$.
Since $\overline{C}(x) = \frac{C(x)}{x}$, the derivative is
$$\overline{C}'(x) = \frac{xC'(x) - C(x)}{x^2}$$
and in this case the marginal cost (the derivative of the original cost function) will be equal to the average cost ($C(x)/x$).
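
To make the quoted claim concrete, setting that derivative to zero (assuming $x > 0$) gives, in one line,
$$\overline{C}'(x) = 0 \iff \frac{xC'(x) - C(x)}{x^2} = 0 \iff xC'(x) = C(x) \iff C'(x) = \frac{C(x)}{x} = \overline{C}(x),$$
which is exactly the "marginal cost equals average cost" condition stated above.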

But I do not know how to handle that. I mean, if this happens at a critical point, doesn't that mean that the marginal cost (the first derivative) is zero at that point, since $x$ is a critical point?
Then the equation above would turn into
$$0 = \frac{0 - C(x)}{x^2}$$
which would force $C(x)$ to be zero as well, and that makes no sense. I hope someone can tell me where I went wrong.

Why can't I find the average?

Each student in a course must submit 3 lab assignments and take 2 tests. Design a program to do the following: ask the user to enter 3 lab grades and 2 test grades, calculate and display the lab average and the test average, and also calculate and display the course mark, which is 55% of the lab average plus 45% of the test average.

lab1 = int(input("Lab 1: "))
lab2 = int(input("Lab 2: "))
lab3 = int(input("Lab 3: "))

labTotal = (lab1 + lab2 + lab3) / 3 * .55
print(labTotal)

test1 = int(input("Test 1: "))
test2 = int(input("Test 2: "))

testTotal = (test1 + test2) / 2 * .45
print(testTotal)

total = labTotal + testTotal
print(total)

When I run the program, it does not average the scores.

Please help!
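
For comparison, here is a minimal sketch of what the assignment describes: print the plain lab and test averages first, and apply the 55%/45% weights only when combining them into the course mark. (The code above prints the already-weighted pieces rather than the averages, which may be why no plain average ever appears.)

# A sketch of the assignment as described above, not the original code:
# show the plain averages, then weight them only for the course mark.
lab_scores = [int(input(f"Lab {i}: ")) for i in (1, 2, 3)]
test_scores = [int(input(f"Test {i}: ")) for i in (1, 2)]

lab_average = sum(lab_scores) / len(lab_scores)
test_average = sum(test_scores) / len(test_scores)
course_mark = lab_average * 0.55 + test_average * 0.45

print("Lab average:", lab_average)
print("Test average:", test_average)
print("Course mark:", course_mark)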

On average, how long do Adventurer's League sessions last?

Just curious how long AL games tend to last.

What is the average (if there is one) of the irrational numbers between 0 and 1?

Intuition tells me that it exists and equals 0.5, but when infinities are involved my intuition gets a little fuzzy.

I imagine it would equal 0.5 if the set of irrational numbers is evenly distributed, which I suspect is true, but I'm not sure. I do not have the tools to even start proving it.

That said – here's my stab at it:

OK, we know that the only way two irrational numbers can add up to a rational number is if each is the other's additive complement with respect to some rational number.

That is, it holds if and only if the irrational numbers have the following form: (r1 + i1) + (r2 - i1) = r1 + r2, where r1 and r2 are rational and i1 is irrational.

Now suppose ni is a list of the irrational numbers between 0 and 1, and take r1 = 0, r2 = 1.

Then the average will be SUM(ni) / n. Here is the clever part: take the formula above and do a little substitution.

mean = SUM((r1 + ni) + (r2 - ni)) / 2n* = SUM((0 + ni) + (1 - ni)) / 2n

= SUM(1) / 2n, and SUM(1) taken n times is just n, so

average = n / 2n = 1/2

Now the question is: is this list of irrational pairs of the form i1 + i2 = r1 + r2 complete? I do not know.

I think it is. Does anyone see holes? Am I (as I suspect) just reasoning in a circle?

* 2n because we go through the list of irrational numbers twice: once as ni (from 0 to 1) and once as 1 - ni (from 1 to 0).
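
For what it's worth, the intuition can be made precise if "average" is read as the mean with respect to Lebesgue measure: the rationals in $(0,1)$ form a set of measure zero, and the map $x \mapsto 1 - x$ sends the irrationals in $(0,1)$ onto themselves, which is essentially the pairing argument above. In symbols, with $I = (0,1)\setminus\mathbb{Q}$ and $\lambda(I) = 1$,
$$\text{mean} = \frac{1}{\lambda(I)}\int_{I} x \, dx = \int_{0}^{1} x \, dx = \frac{1}{2}.$$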