# Reference request: expected value of $$m$$ parallel games

Easy example to start: you throw an $$n$$-sided die until your lucky number shows up. This is a Bernoulli process with $$p = 1/n$$, so the expected number of throws is $$E = n$$.
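Not part of the question, but a quick Monte-Carlo sketch (my own code, function names are mine) confirms the geometric mean $$E = n$$:

```python
import random

def throws_until_lucky(n, lucky=1, rng=random):
    """Throw a fair n-sided die until `lucky` appears; return the throw count."""
    count = 0
    while True:
        count += 1
        if rng.randint(1, n) == lucky:
            return count

n, trials = 6, 200_000
estimate = sum(throws_until_lucky(n) for _ in range(trials)) / trials
print(estimate)  # should be close to E = n = 6
```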
Now imagine that you play this game with $$m$$ dice in parallel: stop as soon as any die shows your lucky number, i.e. the first win ends the whole game. (Feel free to also discuss the variant where the *last* win ends the game 🙂) It's still a Bernoulli process, now with $$p' = 1 - (1 - 1/n)^m$$ and $$E' = 1/p'$$.
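The parallel case can be checked the same way; the sketch below (my own code) compares the closed form $$E' = 1/(1-(1-1/n)^m)$$ against a simulation:

```python
import random

def rounds_until_first_win(n, m, lucky=1, rng=random):
    """Roll m fair n-sided dice per round; stop when any die shows `lucky`."""
    rounds = 0
    while True:
        rounds += 1
        if any(rng.randint(1, n) == lucky for _ in range(m)):
            return rounds

n, m, trials = 6, 3, 100_000
p_prime = 1 - (1 - 1 / n) ** m   # per-round win probability
expected = 1 / p_prime           # closed-form E' (216/91, about 2.37, here)
estimate = sum(rounds_until_first_win(n, m) for _ in range(trials)) / trials
print(expected, estimate)        # the two values should be close
```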
Now generalize. A game $$G$$ is defined by fixing a winning probability $$p(i)$$ for each move $$i$$; we only assume that $$E$$ is finite. We play $$m$$ copies of $$G$$ in parallel, and again the first win in any copy ends the game. What can we say about $$E'$$?
1. $$E' \le E$$. (Trivial)
2. $$E' \ge E/m$$? (tempting 🙂)
3. Let $$p(i)$$ come from a well-known probability distribution, e.g. a Poisson distribution or the like instead of the geometric one above. Surely $$E'$$ has already been computed for many such distributions?
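For experimenting with the general question, a small Monte-Carlo sketch (my own code; the particular $$p(i)$$ below is an arbitrary choice of mine, not from the question) compares $$E'$$ against $$E$$ and $$E/m$$:

```python
import random

def rounds_to_win(p, m, rng=random):
    """Play m parallel copies of G, where move i wins a copy with
    probability p(i); return the move index of the first win."""
    i = 0
    while True:
        i += 1
        if any(rng.random() < p(i) for _ in range(m)):
            return i

# Arbitrary example of a move-dependent win probability p(i).
p = lambda i: min(1.0, 0.05 * i)
m, trials = 4, 50_000
E = sum(rounds_to_win(p, 1) for _ in range(trials)) / trials
E_prime = sum(rounds_to_win(p, m) for _ in range(trials)) / trials
# E' <= E always holds; how E' relates to E/m is exactly the open question.
print(E_prime, E, E / m)
```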