Consider the following exercise, from Lawden, *Stochastic Calculus*:

Let $X_1, X_2, \dots$ be independent, identically distributed random variables with

$$
P\{X_j = 1\} = q, \qquad P\{X_j = -1\} = 1 - q.
$$

Let $S_0 = 0$ and, for $n \ge 1$, $S_n = X_1 + X_2 + \cdots + X_n$. Let $Y_n = e^{S_n}$.

1. For what value of $q$ is $Y_n$ a martingale? (For the remaining parts of the exercise, assume $q$ takes this value.)
2. Explain why $Y_n$ satisfies the conditions of the martingale convergence theorem.
3. Let $Y_\infty = \lim_{n \to \infty} Y_n$. Explain why $Y_\infty = 0$. (Hint: there are at least two ways to show this. One is to consider $\log Y_n$ and use the law of large numbers. Another is to note that with probability one $Y_{n+1}/Y_n$ does not converge.)
4. Use the optional sampling theorem to determine the probability that $Y_n$ never reaches a value greater than 100.
5. Is there a $C < \infty$ such that $E[Y_n^2] \le C$ for all $n$?

For the first question, we have to check that $E(Y_{n+1} \mid \mathcal{F}_n) = Y_n$, where $\mathcal{F}_n$ is the filtration generated by $X_1, \dots, X_n$.

Clearly,

$$
\begin{align}
E(Y_{n+1} \mid \mathcal{F}_n) &= E(Y_n e^{X_{n+1}} \mid \mathcal{F}_n) = Y_n \, E(e^{X_{n+1}} \mid \mathcal{F}_n) \\
&= Y_n \left( q e + (1 - q) e^{-1} \right).
\end{align}
$$

Therefore $Y_n$ is a martingale if and only if

$$

q e + (1 - q) e^{-1} = 1,

$$

which gives

$$

q = \frac{1 - e^{-1}}{e - e^{-1}}.

$$
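Numerically, this value of $q$ can be sanity-checked with a short script (a hypothetical check, not part of the exercise):

```python
import math

# Solve the martingale condition q*e + (1-q)*e^{-1} = 1 for q
e = math.e
q = (1 - math.exp(-1)) / (e - math.exp(-1))

# At this q, the one-step factor E(e^{X_1}) should equal 1
step_mean = q * e + (1 - q) * math.exp(-1)
print(q)          # ~0.2689
print(step_mean)  # 1.0 up to rounding
```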

For Part 2, we need to verify that $E(|Y_n|) \le C$ for all $n$. My first thought was the law of the unconscious statistician:

$$
E(|e^{S_n}|) = E(e^{S_n}) \overset{?}{=} e^{E(S_n)},
$$

but the last equality is false: by Jensen's inequality we only get $E(e^{S_n}) \ge e^{E(S_n)}$. (Note also $E(S_n) = n(2q - 1)$, which depends on $n$, since $2q - 1$ is not zero.) The clean argument is that $Y_n \ge 0$ and $Y_n$ is a martingale, so $E(|Y_n|) = E(Y_n) = E(Y_0) = 1$ for every $n$.
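Since $E(e^{X_1}) = 1$ at the martingale value of $q$, independence gives $E(Y_n) = 1$ for every $n$; this can be verified directly from the binomial expansion (a hypothetical check, not part of the exercise):

```python
import math

e = math.e
q = (1 - math.exp(-1)) / (e - math.exp(-1))  # martingale value of q

def expected_Y(n):
    # E(e^{S_n}) = sum_k C(n,k) q^k (1-q)^{n-k} e^{2k-n},
    # where k counts the +1 steps among the first n
    return sum(
        math.comb(n, k) * q**k * (1 - q)**(n - k) * math.exp(2 * k - n)
        for k in range(n + 1)
    )

for n in (1, 5, 50):
    print(expected_Y(n))  # each is 1.0 up to rounding
```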

For Part 3, consider $\log Y_n = S_n$. By the strong law of large numbers, $S_n / n \to E(X_1) = 2q - 1$ almost surely, and $2q - 1 < 0$ at the martingale value of $q$, so $S_n \to -\infty$ and $Y_n = e^{S_n} \to 0$ almost surely.

Alternatively, the distribution of $S_n$ is encoded in the generating function of $X_i$:

$$
E(t^{S_n}) = E(t^{X_i})^n = \left( q t + (1 - q) t^{-1} \right)^n,
$$

and for $t$ strictly between $1$ and $e$ we have $q t + (1 - q) t^{-1} < 1$ (the two roots of $q t + (1 - q) t^{-1} = 1$ are $t = 1$ and $t = (1-q)/q = e$), so $E(t^{S_n}) \to 0$ as $n \to \infty$. By Markov's inequality, $P\{S_n \ge m\} \le t^{-m} E(t^{S_n}) \to 0$ for any level $m$, though the law-of-large-numbers route gives the almost sure statement more directly.
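The law-of-large-numbers route is easy to illustrate numerically: with the martingale $q$, the drift $2q - 1 \approx -0.46$ is negative, so $S_n/n$ settles near the drift and $Y_n = e^{S_n}$ collapses (a hypothetical simulation, not part of the exercise):

```python
import math
import random

random.seed(0)
e = math.e
q = (1 - math.exp(-1)) / (e - math.exp(-1))
drift = 2 * q - 1  # E(X_1), about -0.46

# One long sample path of the walk
n = 10_000
S = sum(1 if random.random() < q else -1 for _ in range(n))
print(S / n)        # close to the drift, by the law of large numbers
print(math.exp(S))  # Y_n, astronomically small
```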

For point 4, I think it is easier to rephrase the question equivalently: what is the probability that $S_n$ never reaches the value $\lceil \log 100 \rceil = 5$?

Since $S_n$ itself is not a martingale (its drift $2q - 1$ is nonzero), apply the optional sampling theorem to the martingale $Y_n$. Let $T$ be the first time $S_n$ hits $5$ or $-a$. Then

$$
\begin{align}
1 = E(Y_0) = E(Y_T) &= e^{5} \, P\{S_T = 5\} + e^{-a} \left( 1 - P\{S_T = 5\} \right).
\end{align}
$$

This gives:

$$
P\{S_T = 5\} = \frac{1 - e^{-a}}{e^{5} - e^{-a}},
$$

so, letting $a \to \infty$, the probability that $S_n$ ever reaches $5$ (and that $Y_n$ ever reaches a value greater than 100) is $e^{-5}$. The probability that $Y_n$ never reaches a value greater than 100 is therefore

$$
1 - e^{-5} \approx 0.993,
$$

which is consistent with $Y_n \to 0$: with high probability the process never climbs that far.
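To double-check this part numerically, one can estimate by simulation the probability that the walk ever hits level $5$ (a hypothetical Monte Carlo sketch; truncating each path at 1000 steps is justified by the negative drift):

```python
import math
import random

random.seed(1)
e = math.e
q = (1 - math.exp(-1)) / (e - math.exp(-1))  # martingale value of q

def hits_five(max_steps=1000):
    # Does this path reach level 5 before drifting away downward?
    s = 0
    for _ in range(max_steps):
        s += 1 if random.random() < q else -1
        if s == 5:
            return True
    return False  # later hits are vanishingly unlikely under the drift

trials = 5_000
est = sum(hits_five() for _ in range(trials)) / trials
print(est)  # Monte Carlo estimate of P(S_n ever reaches 5)
```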

As for point 5, reasoning as in Part 2 suggests no such constant exists, and indeed: by independence, $E(Y_n^2) = E(e^{2 S_n}) = \left( q e^2 + (1 - q) e^{-2} \right)^n$, and $q e^2 + (1 - q) e^{-2} = E(e^{2 X_1}) \ge \left( E(e^{X_1}) \right)^2 = 1$, with strict inequality since $X_1$ is not constant. So $E(Y_n^2)$ grows geometrically and no such $C$ exists.
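The per-step factor in $E(Y_n^2)$ is easy to evaluate numerically (a hypothetical check, not part of the exercise):

```python
import math

e = math.e
q = (1 - math.exp(-1)) / (e - math.exp(-1))  # martingale value of q

# Per-step factor E(e^{2 X_1}) in E(Y_n^2) = E(e^{2 S_n})
m2 = q * math.exp(2) + (1 - q) * math.exp(-2)
print(m2)        # about 2.09, strictly greater than 1
print(m2 ** 20)  # E(Y_20^2) already exceeds a million: no uniform bound C
```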

Any help on this problem would be much appreciated (especially if there are faster or more standard ways to do it).