






The solutions of problems 1, 2, 3, 4, 5, 6, and 11 are written down. The rest will come soon.
3.1 Let ξ_j, j = 1, 2, ..., be i.i.d. random variables with common distribution
P(ξ_j = +1) = p,  P(ξ_j = −1) = q := 1 − p,
and F_n = σ(ξ_j, 0 ≤ j ≤ n), n ≥ 0, their natural filtration. Denote S_n := ∑_{j=1}^n ξ_j, n ≥ 0.
(a) Prove that M_n := (q/p)^{S_n} is an (F_n)_{n≥0}-martingale.
(b) For λ > 0 determine C = C(λ) so that
Z_n^λ := C^n λ^{S_n}
is an (F_n)_{n≥0}-martingale.
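A possible one-line verification for part (a), assuming 0 < p < 1 so that q/p is well defined (a sketch, not the official solution): since ξ_{n+1} is independent of F_n,

E[M_{n+1} | F_n] = M_n · E[(q/p)^{ξ_{n+1}}] = M_n · (p·(q/p) + q·(p/q)) = M_n · (q + p) = M_n.

The same computation in part (b) gives E[Z_{n+1}^λ | F_n] = Z_n^λ · C · (pλ + qλ^{−1}), which suggests the choice C(λ) = (pλ + qλ^{−1})^{−1}.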
3.2 Gambler's Ruin, 1 A gambler wins or loses one pound in each round of betting, with equal chances and independently of the past events. She starts betting with the firm determination that she will stop gambling when either she has won a pounds or she has lost b pounds.
(a) What is the probability that she will be winning when she stops playing?
(b) What is the expected number of betting rounds before she stops playing?
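For orientation only (a sketch in the fair case p = q = 1/2 given in the problem; the stopping time T below is my notation and is taken for granted to be almost surely finite): the optional stopping theorem applied to the martingales S_n and S_n² − n, with T := min{n : S_n = a or S_n = −b}, gives

0 = E[S_T] = a·P(S_T = a) − b·(1 − P(S_T = a)),  hence  P(she stops as a winner) = b/(a + b),
E[T] = E[S_T²] = a²·b/(a + b) + b²·a/(a + b) = ab.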
3.4 Let ξ_j, j = 1, 2, 3, ..., be independent and identically distributed random variables and F_n := σ(ξ_j, 0 ≤ j ≤ n), n ≥ 0, the natural filtration generated by them. Assume that for some γ ∈ R the exponential moment m(γ) := E[e^{γξ_j}] < ∞ exists. Denote S_0 := 0, S_n := ∑_{j=1}^n ξ_j, n ≥ 1. Prove that the process
M_n := m(γ)^{−n} exp{γS_n}, n ∈ N,
is an (F_n)_{n≥0}-martingale.
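A sketch of the martingale property, under the stated assumption m(γ) < ∞: M_n is F_n-measurable and integrable (indeed E[M_n] = 1), and by independence of ξ_{n+1} from F_n,

E[M_{n+1} | F_n] = m(γ)^{−(n+1)} exp{γS_n} · E[e^{γξ_{n+1}}] = m(γ)^{−(n+1)} exp{γS_n} · m(γ) = M_n.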
3.5 Let (Ω, F, (F_n)_{n≥0}, P) be a filtered probability space and Y_n, n ≥ 0, a sequence of absolutely integrable random variables adapted to the filtration (F_n)_{n≥0}. Assume that there exist real numbers u_n, v_n, n ≥ 0, such that
E[Y_{n+1} | F_n] = u_n Y_n + v_n.
Find two real sequences a_n and b_n, n ≥ 0, so that the sequence of random variables M_n := a_n Y_n + b_n, n ≥ 0, is a martingale w.r.t. the same filtration.
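One natural route (a sketch; it additionally assumes u_n ≠ 0 for every n, which the problem does not state explicitly) is to match coefficients in the martingale condition E[M_{n+1} | F_n] = M_n:

a_{n+1}(u_n Y_n + v_n) + b_{n+1} = a_n Y_n + b_n  ⟺  a_{n+1} = a_n / u_n  and  b_{n+1} = b_n − a_{n+1} v_n,

so one may start from a_0 = 1, b_0 = 0 and define the two sequences recursively.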
3.6 We place N balls in K urns (in whatever way) and perform the following discrete time process. At each time unit we choose one of the balls uniformly at random (that is: each ball is chosen with probability 1/N) and place it in one of the urns, also chosen uniformly at random (that is: each urn is chosen with probability 1/K). Denote by X_n the number of balls in the first urn at time n and let F_n := σ(X_j, 1 ≤ j ≤ n), n ≥ 0, be the natural filtration generated by the process n ↦ X_n.
(a) Compute E[X_{n+1} | F_n].
(b) Using the result from problem 3.5, find real numbers a_n, b_n, n ≥ 0, such that Z_n := a_n X_n + b_n is a martingale with respect to the filtration (F_n)_{n≥0}.
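A sketch of the computation in (a): in one step, X_n decreases by one precisely when the chosen ball lies in the first urn (probability X_n/N) and increases by one precisely when the chosen urn is the first one (probability 1/K), the two choices being independent of each other and of F_n, so

E[X_{n+1} | F_n] = X_n − X_n/N + 1/K = (1 − 1/N)·X_n + 1/K,

which is exactly of the form u_n X_n + v_n required in problem 3.5, with u_n = 1 − 1/N and v_n = 1/K.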
3.7 Let X_j, j ≥ 1, be absolutely integrable random variables and F_n := σ(X_j, 1 ≤ j ≤ n), n ≥ 0, their natural filtration. Define the new random variables
Z_0 := 0,  Z_n := ∑_{j=0}^{n−1} ( X_{j+1} − E[X_{j+1} | F_j] ), n ≥ 1.
Prove that the process n ↦ Z_n is an (F_n)_{n≥0}-martingale.
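A sketch of the argument: each Z_n is F_n-measurable (it only involves X_1, ..., X_n and their conditional expectations) and integrable as a finite sum of integrable variables, and the increments are centred:

E[Z_{n+1} − Z_n | F_n] = E[ X_{n+1} − E[X_{n+1} | F_n] | F_n ] = E[X_{n+1} | F_n] − E[X_{n+1} | F_n] = 0.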
3.8 A biased coin shows HEAD=1 with probability θ ∈ (0, 1), and TAIL=0 with probability 1 − θ. The value θ of the bias is not known. For t ∈ [0, 1] and n ∈ N we define p_{n,t} : {0, 1}^n → [0, 1] by
p_{n,t}(x_1, ..., x_n) := t^{∑_{j=1}^n x_j} (1 − t)^{n − ∑_{j=1}^n x_j}.
We make two hypotheses about the possible value of θ: either θ = a, or θ = b, where a, b ∈ [0, 1] and a ≠ b. We toss the coin repeatedly and form the sequence of random variables
Z_n := p_{n,a}(ξ_1, ..., ξ_n) / p_{n,b}(ξ_1, ..., ξ_n),
where ξ_j, j = 1, 2, ..., are the results of the successive trials (HEAD=1, TAIL=0). Prove that the process n ↦ Z_n is a martingale (with respect to the natural filtration generated by the coin tosses) if and only if the true bias of the coin is θ = b.
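For orientation (a sketch, additionally assuming b ∈ (0, 1) so that the likelihood ratio is well defined): Z_{n+1} = Z_n · (a/b)^{ξ_{n+1}} · ((1 − a)/(1 − b))^{1 − ξ_{n+1}}, so if the true bias is θ,

E[Z_{n+1} | F_n] = Z_n · ( θ·(a/b) + (1 − θ)·(1 − a)/(1 − b) ),

and the factor in brackets is an affine function of θ with slope (a − b)/(b(1 − b)) ≠ 0, so it equals 1 exactly when θ = b.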
Prove that Z_n := Q(η_n) is an (F_n)_{n≥0}-martingale.
Galton-Watson Branching Process Let ξ_{n,k}, n = 1, 2, ..., k = 1, 2, ..., be independent and identically distributed random variables which take values in N = {0, 1, 2, ...}. Assume that they have finite second moment and denote μ := E[ξ_{n,k}], σ² := Var[ξ_{n,k}]. Define the Galton-Watson branching process
Z_0 := 1,  Z_{n+1} := ∑_{k=1}^{Z_n} ξ_{n+1,k},
and let G_n := σ(Z_j : 0 ≤ j ≤ n), n ≥ 0, be its natural filtration.
(a) Prove that M_n := μ^{−n} Z_n, n = 0, 1, 2, ..., is a (G_n)_{n≥0}-martingale.
(b) Prove that E[Z_{n+1}² | G_n] = μ² Z_n² + σ² Z_n.
(c) Using the result from (b) prove that
N_n := M_n² − (σ²/μ^{n+1}) · ((μ^n − 1)/(μ − 1)) · M_n   if μ ≠ 1,
N_n := M_n² − n σ² M_n   if μ = 1,
is also a (G_n)_{n≥0}-martingale.
(d) Using the result from (c) prove that if μ > 1 then sup_{0≤n<∞} E[M_n²] < ∞ (that is: the martingale M_n is uniformly bounded in L²), while if μ ≤ 1 then lim_{n→∞} E[M_n²] = ∞.
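A sketch of (a) and (b), using that, given G_n, Z_{n+1} is a sum of Z_n i.i.d. copies of the offspring variable, independent of G_n:

E[Z_{n+1} | G_n] = μ Z_n,   Var(Z_{n+1} | G_n) = σ² Z_n,

hence E[Z_{n+1}² | G_n] = (μZ_n)² + σ²Z_n = μ²Z_n² + σ²Z_n. Dividing the first identity by μ^{n+1} gives the martingale property of M_n, and dividing the second by μ^{2(n+1)} gives E[M_{n+1}² | G_n] = M_n² + σ²μ^{−n−2}M_n, the identity used in (c) and (d).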
3.12 Bonus Pólya Urn, 1 At time n = 0, an urn contains B_0 = 1 blue and R_0 = 1 red ball. At each time n = 1, 2, 3, ..., a ball is chosen at random from the urn and returned to the urn, together with a new ball of the same colour. We denote by B_n and R_n the number of blue, respectively red, balls in the urn after the n-th turn of this procedure. (Note that B_n + R_n = n + 2.) Denote by F_n := σ(B_j, 0 ≤ j ≤ n) = σ(R_j, 0 ≤ j ≤ n), n ≥ 0, the natural filtration of the process. Let M_n := B_n/(B_n + R_n) be the proportion of blue balls in the urn just after time n.
(a) Show that n ↦ M_n is an (F_n)_{n≥0}-martingale.
(b) Show that P(B_n = k + 1) = 1/(n + 1) for 0 ≤ k ≤ n. (Hint: Write down the probability of choosing k blue and n − k red balls in whatever fixed order.)
(c) We will prove soon that M_∞ := lim M_n exists almost surely. What is the distribution of M_∞? (Hint: What is the limit of the distribution of M_n, identified in the previous point, as n → ∞?)
(d) (To be done after learning about the Optional Stopping Theorem.) Let T be the number of balls drawn until the first blue ball is chosen. Use the optional stopping theorem to show that E[1/(T + 2)] = 1/4.
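A sketch of (a): given F_n, a blue ball is added with probability B_n/(n + 2) and a red one otherwise, so

E[M_{n+1} | F_n] = ((B_n + 1)/(n + 3))·(B_n/(n + 2)) + (B_n/(n + 3))·(1 − B_n/(n + 2)) = B_n(n + 3)/((n + 3)(n + 2)) = M_n.

For (c), the uniform distribution found in (b) means M_n is uniform on {1/(n + 2), ..., (n + 1)/(n + 2)}, which suggests that M_∞ is uniform on [0, 1].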
(with B_n, R_n defined in this problem) is an (F_n)_{n≥0}-martingale.
(c) Prove that
N_n(θ) := ((B_n + R_n − 1)! / ((B_n − 1)! (R_n − 1)!)) · θ^{B_n − 1} (1 − θ)^{R_n − 1}
(with B_n, R_n defined in this problem) is exactly the (regular) conditional density function of the random variable Θ, given F_n.
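For orientation (a remark, not part of the problem text): since Γ(m) = (m − 1)! for integer m, the expression N_n(θ) is precisely the density of the Beta(B_n, R_n) distribution, so the claim in (c) identifies the conditional law of Θ given F_n as Beta(B_n, R_n).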