Plan

Martingales
1. Basic Definitions
2. Examples
3. Overview of Results

Reading: G&S Sections 12.1-12.4
Next Time: More Martingales
Midterm Exam: Tuesday 28 March in class
Sample exam problems (Homework 5) available tomorrow at the latest

A Peculiar Etymology 1.

1. Horse Riding. A strap or arrangement of straps fastened at one end to the noseband, bit, or reins of a horse and at the other to its girth, in order to prevent it from rearing or throwing its head back, or to strengthen the action of the bit.
2. Nautical. A stay which holds down the jib-boom of a square-rigged ship, running from the boom to the dolphin-striker.
3. Gambling. Any of various gambling systems in which a losing player repeatedly doubles or otherwise increases a stake such that any win would cover losses accrued from preceding bets.
4. Probability and Statistics. Stay tuned...

The Basic Definition 2.

(From which the more general definition below will spring, and which will be the most common case. But why be specific when we can be general, eh?)

A stochastic process (X_n)_{n >= 0} is a martingale if
1. E|X_n| < \infty for all n >= 0, and
2. E(X_{n+1} | X_0, ..., X_n) = X_n.

This definition will be superseded below but is the base case.

Useful metaphor: Let X_n be a gambler's wealth after the nth bet. This definition of a martingale captures a notion of a fair game. How?

The Original Martingale 3.

Suppose you start with a large fortune w_0. (How large it needs to be, we will see.) You are offered a series of bets, each with probability of winning equal to 1/2. (Yeah, right.) The original martingale strategy is to bet $d, say, on the first bet. If you win, stop. If you lose, bet $2d on the second bet. Continue this way, stopping play as soon as you win and betting $2^n d on the (n+1)st bet. You will win eventually, say at bet t >= 1, at which point you will have won

    2^{t-1} d - (d + 2d + 2^2 d + \cdots + 2^{t-2} d) = d    (1)

dollars, where the second term is zero when t = 1. Seems like a sure thing, eh?
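The sure-thing bookkeeping in (1) is easy to check by simulation. Here is a minimal sketch (the function name and parameters are my own illustration, not from the notes): no matter when the first win arrives, the player ends exactly $d ahead.

```python
import random

def martingale_play(w0=1000.0, d=1.0, p=0.5, rng=random.Random(0)):
    """Play the doubling strategy once: bet d, 2d, 4d, ... until the first win.
    Returns (final wealth, number of bets played)."""
    wealth, stake, bets = w0, d, 0
    while True:
        bets += 1
        if rng.random() < p:      # win: collect the stake and stop
            wealth += stake
            return wealth, bets
        wealth -= stake           # lose: debt grows, stake doubles
        stake *= 2

# Per equation (1), every play ends with exactly d dollars of profit.
for seed in range(5):
    w, t = martingale_play(rng=random.Random(seed))
    assert abs(w - 1001.0) < 1e-9
```

Of course, the simulation hides the catch: the interim debt before the winning bet can be enormous, which is exactly the point made below.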
Let W_n denote your wealth after the nth bet in a series of bets like the above but without the stopping rule. Then W_0 = w_0 and

    W_n = w_0 + \sum_{k=1}^{n} \Xi_k d 2^{k-1},    (2)

where the \Xi_k are iid Bernoulli(1/2), taking the values +1 and -1 with probability 1/2 each.

21 Mar 2006
Now, (W_n) is a random walk, with E|W_n| < \infty and

    E(W_{n+1} | W_0, ..., W_n) = E(W_{n+1} | \Xi_1, ..., \Xi_n) = W_n + E(d 2^n \Xi_{n+1}) = W_n.    (3)

Hence, (W_n)_{n >= 0} is a martingale by the above definition. Still, it's not quite what we want.

Let T = min{n >= 1 : \Xi_n = 1} be the first bet that you win. What can we say about T? Well, first off, it's Geometric(1/2), that is, P{T = n} = 2^{-n} for n >= 1. But wait, is there more? There's something familiar about this time....

Set X_0 = W_0 = w_0. This is your initial wealth. Define X_n = W_{T \wedge n} for n >= 1. This is your wealth under the martingale system. Any questions?

Loosely, if n >= T, then X_{n+1} = X_n. If n < T, then X_{n+1} = W_{n+1} and X_n = W_n. So, (X_n) should be a martingale. Formally, we need to show that E|X_n| < \infty and that E(X_{n+1} | X_0, ..., X_n) = X_n. Note that

    X_{n+1} 1{T <= n} = X_n 1{T <= n}    (4)
    X_{n+1} 1{T > n} = (X_n + \Xi_{n+1} d 2^n) 1{T > n}.    (5)

That is,

    X_{n+1} = X_n 1{T <= n} + (X_n + \Xi_{n+1} d 2^n) 1{T > n}.    (6)

What do we need to bring this home? ...the class steps in... and thus we get that (X_n)_{n >= 0} is a martingale as defined above.

But would you want to use this strategy? Let D be the biggest debt you owed prior to winning, that is, the total losses accrued through bet T - 1, so D = w_0 - X_{T-1}. Then,

    E D = \sum_{n=1}^{\infty} 2^{-n} d \sum_{k=0}^{n-2} 2^k = d \sum_{n=1}^{\infty} 2^{-n} (2^{n-1} - 1),    (7)

which diverges, since each term approaches d/2. Even Bill Gates should think twice about using this approach.
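The divergence in (7) is visible numerically: each term of the series tends to d/2, so truncated sums grow without bound. A quick sketch (the function name is my own):

```python
def expected_debt_partial(N, d=1.0):
    """Partial sum of equation (7): sum_{n=1}^{N} 2^{-n} * d * (2^{n-1} - 1).
    Each term equals d*(1/2 - 2^{-n}), so the partial sums grow like d*N/2."""
    return sum(2.0**-n * d * (2.0**(n - 1) - 1) for n in range(1, N + 1))

print(expected_debt_partial(10))   # approx 4.0
print(expected_debt_partial(100))  # approx 49.0
```

Doubling the number of terms roughly doubles the partial sum: the expected maximum debt is infinite even though the eventual profit is a fixed $d.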
Example 4. Another Approach to Gambler's Ruin

Let S_n be a simple random walk with S_0 = w \in {0, ..., N}. Suppose we stop the walk when it first hits either 0 or N. Which will it hit first?

Write S_n = w + \sum_{k=1}^{n} \Xi_k, where the \Xi_k are iid Bernoulli: +1 with probability p and -1 with probability q = 1 - p. Define Y_n = (q/p)^{S_n}. Let T be the first hitting time of {0, N}. We want to understand the limiting behavior of X_n = Y_{T \wedge n}. As before,

    X_{n+1} = X_n 1{T <= n} + (q/p)^{S_n + \Xi_{n+1}} 1{T > n}.    (8)

Note that T is a stopping time with respect to \Xi_1, ..., \Xi_n. Hence,

    E(X_{n+1} | \Xi_1, ..., \Xi_n)
      = E(X_n 1{T <= n} | \Xi_1, ..., \Xi_n) + E((q/p)^{S_n + \Xi_{n+1}} 1{T > n} | \Xi_1, ..., \Xi_n)    (9)
      = X_n 1{T <= n} + (q/p)^{S_n} E((q/p)^{\Xi_{n+1}} | \Xi_1, ..., \Xi_n) 1{T > n}    (10)
      = X_n 1{T <= n} + (q/p)^{S_n} E(q/p)^{\Xi_{n+1}} 1{T > n}    (11)
      = X_n 1{T <= n} + (q/p)^{S_n} (p(q/p) + q(p/q)) 1{T > n}    (12)
      = X_n 1{T <= n} + X_n 1{T > n}    (13)
      = X_n,    (14)

since p(q/p) + q(p/q) = q + p = 1. This isn't quite the definition, though. Perhaps we could work out that the information in \Xi_1, ..., \Xi_n is the same as the information in X_0, ..., X_n. A good strategy. But what a bother. Take this as motivation for the general definition below.

Now, E Y_n = (q/p)^w for all n, so doesn't it make sense that E Y_T would be the same? And thus E X_n = (q/p)^w = E X_T for all n. Hmmm... let's suppose it's true. Take this as motivation for one of the main results later. If true, then

    E X_T = r_w (q/p)^0 + (1 - r_w) (q/p)^N = (q/p)^w,    (15)

where r_w is the ruin probability with initial wealth w. Then,

    r_w = ((q/p)^w - (q/p)^N) / (1 - (q/p)^N)    (16)

as long as p != 1/2. That's what we got before. Nice. We've seen a martingale argument for the p = 1/2 case as well. A lot of power in that simple assumption.
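Formula (16) can be compared against a direct simulation of the stopped walk. A minimal sketch (function names and the particular w, N, p are my own illustration):

```python
import random

def ruin_prob_formula(w, N, p):
    """Equation (16): probability of hitting 0 before N, starting from w, p != 1/2."""
    r = (1 - p) / p                      # r = q/p
    return (r**w - r**N) / (1 - r**N)

def ruin_prob_mc(w, N, p, trials=20_000, rng=random.Random(0)):
    """Monte Carlo estimate: run the walk until it hits 0 or N, count ruins."""
    ruined = 0
    for _ in range(trials):
        s = w
        while 0 < s < N:
            s += 1 if rng.random() < p else -1
        ruined += (s == 0)
    return ruined / trials

# Formula vs simulation, e.g. w=5, N=10, p=0.45:
print(ruin_prob_formula(5, 10, 0.45))   # approx 0.732
print(ruin_prob_mc(5, 10, 0.45))        # approx 0.73
```

The agreement (to within Monte Carlo error) is the numerical shadow of the optional stopping argument the notes are building toward.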
Intermediate Definition 5.

A sequence of random variables (Y_n)_{n >= 0} is a martingale with respect to a sequence (X_n)_{n >= 0} if for all n >= 0,
1. E|Y_n| < \infty
2. E(Y_{n+1} | X_0, ..., X_n) = Y_n.

If X_n = Y_n for all n, this reduces to the previous definition.

Definition 6.

A filtration in a \sigma-field F is a sequence (F_n)_{n >= 0} of sub-\sigma-fields of F such that F_n \subseteq F_{n+1} for all n >= 0. Write F_\infty = lim_n F_n for the smallest \sigma-field containing all the F_n's.

A sequence of random variables (X_n)_{n >= 0} is adapted to the filtration if X_n is F_n-measurable for all n >= 0. That is, events {X_n \in A} \in F_n.

Intuitive unpacking follows.

General Definition 7.

Let (F_n)_{n >= 0} be a filtration in F and let (Y_n)_{n >= 0} be a sequence of random variables adapted to that filtration. Then, (Y_n) is a martingale (with respect to (F_n)) if
1. E|Y_n| < \infty
2. E(Y_{n+1} | F_n) = Y_n.

If F_n = \sigma(Y_0, ..., Y_n), we get the basic definition. If F_n = \sigma(X_0, ..., X_n), we get the intermediate definition. More intuitive unpacking....

Definition 8.

Let (F_n)_{n >= 0} be a filtration in F and let (Y_n)_{n >= 0} be a sequence of random variables adapted to that filtration. Then, (Y_n) is a sub-martingale (with respect to (F_n)) if
1. E max(Y_n, 0) < \infty
2. Y_n <= E(Y_{n+1} | F_n).

And (Y_n) is a super-martingale (with respect to (F_n)) if
1. E max(-Y_n, 0) < \infty
2. Y_n >= E(Y_{n+1} | F_n).

Questions 9.

a. Show that (Y_n) is a martingale if and only if it is both a sub-martingale and a super-martingale.
b. Suppose that (Y_n) is a sub-martingale; what can you say about (-Y_n)?
c. Find three examples of a sub- or super-martingale.
Example 10. Simple Random Walk

Let S_n be a simple random walk as we have defined so often. Then, E|S_n| < \infty for all n because |S_n| <= n, and

    E(S_{n+1} | S_0, ..., S_n) = S_n + (p - q).    (17)

Note that \sigma(S_0, ..., S_n) = \sigma(S_0, \Xi_1, ..., \Xi_n). Define X_n = S_n - n(p - q). Then X_n is a martingale. (Why?)

Example 11. Sums of Random Variables

The same trick works with more general sums. Suppose that (X_n)_{n >= 0} are independent random variables with E|X_n| < \infty and E X_n = 0 (center them if not). Let Y_n = X_1 + ... + X_n for n >= 0. Then, (Y_n) is a martingale by the same argument. (Make it.)

We can generalize this. Let (X_n)_{n >= 0} be a sequence of real-valued random variables. Let g_k and h_k be functions on R^k with |h_k| <= H_k < \infty for some constants H_k. Let f be a function so that E|f(g_{k+1}(X_0, ..., X_k))| < \infty, and let Z_k = f(g_{k+1}(X_0, ..., X_k)). Define Y_n by

    Y_n = \sum_{k=0}^{n} (Z_k - E(Z_k | X_0, ..., X_{k-1})) h_k(X_0, ..., X_{k-1}).    (18)

Then, (Y_n)_{n >= 0} is a martingale. Why?

This example is not so interesting by itself, but it is quite a general mechanism for constructing martingales. And it helps you unpack complicated expressions.
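One consequence of the martingale property is that E X_n is constant in n, and that is easy to check numerically for the centered walk of Example 10. A sketch (names and numbers are my own illustration):

```python
import random

def simulate_centered_walk(n_steps, p=0.7, trials=50_000, rng=random.Random(0)):
    """Monte Carlo average of X_n = S_n - n(p - q) over walks started at S_0 = 0."""
    drift = p - (1 - p)                    # p - q
    total = 0.0
    for _ in range(trials):
        s = 0
        for _ in range(n_steps):
            s += 1 if rng.random() < p else -1
        total += s - n_steps * drift       # X_n for this walk
    return total / trials

# Martingales have constant mean, so E X_n = S_0 = 0 for every n:
print(simulate_centered_walk(20))   # close to 0.0
```

Without the -n(p - q) correction the average would sit near n(p - q) instead, which is exactly why the centering is needed when p != q.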
Example 12. Variance of a Sum

Let (X_n)_{n >= 0} be iid random variables with X_0 = 0, E X_n = 0, and E X_n^2 = \sigma^2 for n >= 1. Define

    Y_n = (\sum_{k=1}^{n} X_k)^2 - n \sigma^2.    (19)

Then (Y_n) is a martingale. Show this.

What can you say about M_n = (\frac{1}{n} \sum_{k=1}^{n} X_k)^2 - \frac{\sigma^2}{n}?

Example 13. The Doob Process

Let X be a random variable with E|X| < \infty. Let (Z_n)_{n >= 0} be an arbitrary sequence of random variables. Define X_n = E(X | Z_0, ..., Z_n). Note that E|X_n| <= E|X|. (Why?) And by the Mighty Conditioning Identity,

    E(X_{n+1} | Z_0, ..., Z_n) = E(E(X | Z_0, ..., Z_{n+1}) | Z_0, ..., Z_n)    (20)
                              = E(X | Z_0, ..., Z_n)    (21)
                              = X_n.    (22)

Example 14. Harmonic Functions on Markov Chains

Let X be a countable-state Markov chain on S with transition probabilities P. Let \Delta be the drift operator: \Delta V = P V - V. Recall that any V for which \Delta V = 0 we called harmonic. This is a function for which

    E(V(X_{n+1}) | X_n) = V(X_n).    (23)

Hmmm... Define Y_n = V(X_n) for a harmonic V. As long as E|V(X_n)| is finite, we've got a martingale. This will hold, for example, if we choose a bounded harmonic function. This is a useful mechanism for discovering martingales in Markov chains.
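As a sanity check on Example 12 (the function name, the choice of X_k uniform on {-sigma, +sigma}, and the numbers are my own illustration), the Monte Carlo average of Y_n should hover near zero for every n, since a martingale's mean is constant and Y_0 = 0:

```python
import random

def avg_Y(n, sigma=2.0, trials=50_000, rng=random.Random(0)):
    """Monte Carlo average of Y_n = (sum_{k<=n} X_k)^2 - n*sigma^2,
    with X_k = +/- sigma each with probability 1/2 (so EX = 0, EX^2 = sigma^2)."""
    total = 0.0
    for _ in range(trials):
        s = sum(sigma if rng.random() < 0.5 else -sigma for _ in range(n))
        total += s * s - n * sigma * sigma
    return total / trials

for n in (1, 5, 25):
    print(n, avg_Y(n))   # each close to 0 (within Monte Carlo error)
```

Note the n = 1 case is exact here: X_1^2 = sigma^2 always for this two-point distribution, so Y_1 = 0 identically.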
Example 15. Eigenvector-Induced Martingales for Markov Chains

A slight generalization. Now, let V be an eigenfunction (eigenvector) of P with eigenvalue \lambda. That is, for all s \in S,

    \sum_{s' \in S} P(s, s') V(s') = \lambda V(s),    (24)

or equivalently,

    E(V(X_{n+1}) | X_n) = \lambda V(X_n).    (25)

So, define

    Y_n = \lambda^{-n} V(X_n)    (26)

for such a V and n >= 0. If E|V(X_n)| < \infty, we have

    E(Y_{n+1} | X_0, ..., X_n) = E(\lambda^{-(n+1)} V(X_{n+1}) | X_0, ..., X_n)    (27)
                               = \lambda^{-n} \lambda^{-1} E(V(X_{n+1}) | X_n)    (28)
                               = \lambda^{-n} \lambda^{-1} \lambda V(X_n)    (29)
                               = Y_n,    (30)

so (Y_n) is a martingale with respect to X. This has a direct application to Branching Processes, which we'll see in the near future.

Example 16. Discretization and Derivatives

Let U be a Uniform(0, 1) random variable. Define X_n = k 2^{-n} for the unique k such that k 2^{-n} <= U < (k + 1) 2^{-n}. As n increases, X_n gives finer and finer information about U. Let f be a bounded function on [0, 1] and define

    Y_n = 2^n (f(X_n + 2^{-n}) - f(X_n)).    (31)

What is Y_n approximating here as n -> \infty? What is the distribution of U given X_0, ..., X_n? Show that Y_n is a martingale with respect to X_n.
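Example 15 can be made concrete with a small chain. The two-state transition matrix below is my own illustration (not from the notes): it has eigenvalue lambda = 0.7 with eigenfunction V = (1, -2), and we can verify both equation (24) and the one-step martingale identity (27)-(30) directly, since E(Y_{n+1} | X_n = s) = lambda^{-(n+1)} (PV)(s).

```python
# Two-state chain (my own example): P V = lam * V with lam = 0.7, V = (1, -2).
P = [[0.9, 0.1],
     [0.2, 0.8]]
V = [1.0, -2.0]
lam = 0.7

# Check equation (24): (PV)(s) = lam * V(s) for each state s.
for s in range(2):
    PV_s = sum(P[s][t] * V[t] for t in range(2))
    assert abs(PV_s - lam * V[s]) < 1e-9

# One-step martingale check for Y_n = lam^{-n} V(X_n), equation (26):
# E(Y_{n+1} | X_n = s) = lam^{-(n+1)} * (PV)(s) = lam^{-n} * V(s) = Y_n.
n = 3
for s in range(2):
    lhs = lam ** -(n + 1) * sum(P[s][t] * V[t] for t in range(2))
    assert abs(lhs - lam ** -n * V[s]) < 1e-9

print("eigenfunction martingale check passed")
```

The lambda = 1 eigenfunctions are exactly the harmonic functions of Example 14, so this really is a generalization.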
Past Example 17. Likelihood Ratios

Example from homework. Very important in some applications, such as sequential analysis.

Future Example 18. False Discovery Rates

Next time we will show how a martingale argument proves the result of Benjamini and Hochberg (1995).

Outline 19. Main Results for (sub- and super-) martingales

1. Decomposition into a martingale plus a predictable process
2. Strong convergence theorems
3. Upcrossing Inequalities
4. Large Deviation Bounds
5. Maximal Inequalities
6. Optional Sampling of Process at Stopping Times
7. Optional Stopping of Process at Stopping Times