MS&E 321 Spring 12-13 Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 17


Section 0: Martingales

Contents
0.1 Martingales in Discrete Time
0.2 Optional Sampling for Discrete-Time Martingales
0.3 Martingales for Discrete-Time Markov Chains
0.4 The Strong Law for Martingales
0.5 The Central Limit Theorem for Martingales

0.1 Martingales in Discrete Time

A fundamental tool in the analysis of DTMCs and continuous-time Markov processes is the notion of a martingale. Martingales also underlie the definition we will adopt for stochastic integrals with respect to Brownian motion. A martingale is basically a real-valued sequence that is a suitable generalization of a random walk with independent, mean-zero increments.

Definition 0.1.1 Let (M_n : n ≥ 0) be a sequence of real-valued random variables. Then (M_n : n ≥ 0) is said to be a martingale (with respect to the sequence of random elements (Z_n : n ≥ 0)) if:

(i) E|M_n| < ∞ for n ≥ 0;

(ii) for each n ≥ 0, there exists a deterministic function g_n(·) such that M_n = g_n(Z_0, Z_1, ..., Z_n);

(iii) E[M_{n+1} | Z_0, Z_1, ..., Z_n] = M_n for n ≥ 0.

Remark 0.1.1 When a process (M_n : n ≥ 0) satisfies condition (ii), one says that (M_n : n ≥ 0) is adapted to (Z_n : n ≥ 0).

The critical component of the martingale definition is condition (iii). If we view M_n as the fortune of a gambler at time n, then condition (iii) asserts that the gambler is playing a fair game, in which he/she has no propensity (in expectation) to either win or lose on any given gamble.

As we asserted earlier, a random walk with independent mean-zero increments is a martingale. To see this, let S_0, X_1, X_2, ... be independent random variables with finite mean, and suppose that EX_i = 0 for i ≥ 1. Set Z_n = S_n = S_0 + X_1 + ... + X_n. Then conditions (i) and (ii) of Definition 0.1.1 are trivial to verify. For condition (iii), observe that

E[S_{n+1} | S_0, ..., S_n] = E[S_n + X_{n+1} | S_0, ..., S_n] = S_n + E[X_{n+1} | S_0, ..., S_n] = S_n + EX_{n+1} = S_n.
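As a quick numerical sanity check of the fair-game property, one can simulate a mean-zero random walk and verify that ES_n stays at ES_0, and that future increments remain mean zero even after conditioning on the past. Uniform(−1, 1) increments are an arbitrary choice for this sketch:

```python
import numpy as np

rng = np.random.default_rng(7)

# Random walk with independent mean-zero increments (Uniform(-1,1), an
# arbitrary illustrative choice), S_0 = 0.
reps, n = 100_000, 50
X = rng.uniform(-1.0, 1.0, size=(reps, n))
S = X.cumsum(axis=1)

# Martingale property in expectation: ES_n = ES_0 = 0 for every n.
print(np.abs(S.mean(axis=0)).max())    # near 0 at every time step

# Fairness given the past: even on paths where S_10 > 1, the future
# increment S_20 - S_10 still has mean zero.
sel = S[:, 9] > 1.0
cond = (S[sel, 19] - S[sel, 9]).mean()
print(cond)                            # near 0
```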
Martingales inherit many of the properties of mean-zero random walks. In view of the analogy with random walks, it is natural to consider the increments

D_i = M_i − M_{i−1}, i ≥ 1,

namely, the martingale differences. The following proposition is a clear generalization of two of the most important properties of mean-zero random walks.

Proposition 0.1.1 Let (M_n : n ≥ 0) be a martingale with respect to (Z_n : n ≥ 0). Then

EM_n = EM_0, n ≥ 0. (0.1.1)

In addition, if EM_n² < ∞ for n ≥ 0, then

Cov(D_i, D_j) = 0, i ≠ j, (0.1.2)

so that

Var[M_n] = Var[M_0] + Σ_{i=1}^n Var[D_i]. (0.1.3)

Proof: Relation (0.1.1) is immediate from condition (iii) of the martingale definition. For (0.1.2), note that (0.1.1) implies ED_i = 0, so that (0.1.2) is equivalent to asserting that E[D_i D_j] = 0 for i < j. But

E[D_i D_j | Z_0, ..., Z_{j−1}] = D_i E[D_j | Z_0, ..., Z_{j−1}] = 0,

where condition (ii) of the martingale definition was used for the first equality, and condition (iii) for the final step. Taking expectations with respect to (Z_0, ..., Z_{j−1}), we get (0.1.2). Finally, (0.1.3) is immediate from (0.1.2).

Definition 0.1.2 A martingale (M_n : n ≥ 0) for which EM_n² < ∞ for n ≥ 0 is called a square-integrable martingale.

Before we turn to exploring further properties of martingales, let us develop some additional examples of martingales in the random walk setting.

Example 0.1.1 Let (X_n : n ≥ 1) be a sequence of iid mean-zero random variables with finite variance σ². Let S_n = X_1 + ... + X_n and let M_n = S_n² − nσ². Then (M_n : n ≥ 0) is a martingale with respect to (S_n : n ≥ 0). The critical property to verify is (iii). Note that

E[M_{n+1} | S_0, ..., S_n] = E[(S_n + X_{n+1})² − (n+1)σ² | S_0, ..., S_n]
= E[S_n² + 2 S_n X_{n+1} + X_{n+1}² − (n+1)σ² | S_0, ..., S_n]
= S_n² + 2 S_n E[X_{n+1} | S_0, ..., S_n] + E[X_{n+1}² | S_0, ..., S_n] − (n+1)σ²
= S_n² + σ² − (n+1)σ² = M_n.

Example 0.1.2 Let (X_n : n ≥ 1) be a sequence of iid random variables with common density g. Suppose that f is another density with the property that whenever g(x) = 0, then f(x) = 0. Set L_0 = 1 and

L_n = Π_{i=1}^n f(X_i)/g(X_i), n ≥ 1.

Then, (L_n : n ≥ 0) is a martingale with respect to (X_n : n ≥ 1). Again, the critical property to verify is (iii). Here,

E[L_{n+1} | X_1, ..., X_n] = E[L_n f(X_{n+1})/g(X_{n+1}) | X_1, ..., X_n] = L_n E[f(X_{n+1})/g(X_{n+1})] = L_n ∫ (f(x)/g(x)) g(x) dx = L_n,

since f is a density that integrates to 1. This is known as a likelihood ratio martingale.

To show why the likelihood ratio martingale arises naturally, suppose that we have observed an iid sample from a population, yielding observations X_1, X_2, ..., X_n. Assume that the underlying population is known to be iid, either with common density f or with common density g. To test the hypothesis that the X_i's have common density f (the "f-hypothesis") against the hypothesis that the X_i's have common density g (the "g-hypothesis"), the Neyman-Pearson lemma asserts that we should accept the f-hypothesis if the relative likelihood

f(X_1) ··· f(X_n) / (g(X_1) ··· g(X_n)) (0.1.4)

is sufficiently large, and reject it otherwise. So, studying L_n in the case where the X_i's have common density g corresponds to studying the test statistic (0.1.4) when the state of nature is that the g-hypothesis is true. Given this interpretation, it seems natural to expect that L_n converges to zero as the sample size n goes to infinity: for a large sample size n, it is extremely unlikely that such a sample will be better explained by the f-hypothesis than by the other one. The fact that L_n ought to go to zero as n → ∞ is perhaps a bit surprising, given that EL_n = 1 for n ≥ 0.

To prove that L_n → 0 almost surely as n → ∞, note that

log L_n = Σ_{i=1}^n log(f(X_i)/g(X_i)).

Then, the strong law of large numbers guarantees that

n^{−1} log L_n → E log(f(X_1)/g(X_1)) a.s.

as n → ∞. In other words,

n^{−1} log L_n → ∫ log(f(x)/g(x)) g(x) dx. (0.1.5)

(The negative of the right-hand side of (0.1.5) is what is known as a relative entropy.)
Since log is strictly concave, Jensen's inequality asserts that if f ≠ g,

E log(f(X_1)/g(X_1)) < log E[f(X_1)/g(X_1)] = log 1 = 0. (0.1.6)

As a consequence, not only does L_n converge to zero a.s. as n → ∞, but the rate of convergence is exponentially fast. It is worth noting that this is an example of a sequence of random variables (L_n : n ≥ 0) for which L_n → 0 a.s. and yet EL_n = 1 for every n, so EL_n does not converge to 0 (in other words, passing limits through expectations is not always valid).
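The behavior of the likelihood ratio martingale — L_n → 0 a.s. at the exponential rate (0.1.5), while EL_n = 1 for every n — is easy to see numerically. A sketch with the arbitrary choices g = N(0, 1) and f = N(1, 1), for which log(f(x)/g(x)) = x − 1/2 and E log(f(X_1)/g(X_1)) = −1/2:

```python
import numpy as np

rng = np.random.default_rng(0)

mu = 1.0                             # f = N(mu, 1), g = N(0, 1): illustrative choices
n = 100_000
x = rng.standard_normal(n)           # sample from the true density g
log_ratio = mu * x - mu**2 / 2       # log(f(X_i)/g(X_i)) for these Gaussians
avg = log_ratio.cumsum() / np.arange(1, n + 1)   # n^{-1} log L_n

print(avg[-1])   # close to E log(f/g) = -mu^2/2 = -0.5, so L_n -> 0 exponentially

# Yet EL_n = 1 for every n: Monte Carlo average of L_5 over many replications.
reps = rng.standard_normal((200_000, 5))
L5 = np.exp((mu * reps - mu**2 / 2).sum(axis=1))
print(L5.mean())  # close to 1
```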

Example 0.1.3 In this example, we specialize the likelihood ratio martingale a bit. Suppose that the X_i's are iid with common density g, and suppose that the moment generating function m_X(θ) = E e^{θX_i} converges in some neighborhood of the origin. For θ within the domain of convergence of m_X(·), let

f(x) = e^{θx} g(x) / m_X(θ),

or, equivalently, f(x) = e^{θx − ψ(θ)} g(x), where ψ(θ) = log m_X(θ). In this case,

L_n = Π_{i=1}^n f(X_i)/g(X_i) = e^{θ S_n − n ψ(θ)}. (0.1.7)

The martingale (L_n : n ≥ 0) defined by (0.1.7) is known as an exponential martingale. Because the random walk (S_n : n ≥ 0) appears explicitly in the exponent of the martingale, (L_n : n ≥ 0) is well-suited to studying random walks. Some indication of the power of this martingale should be apparent if we explicitly display the dependence of L_n on θ as follows:

L_n(θ) = e^{θ S_n − n ψ(θ)}.

The defining property (iii) of a martingale asserts that E[L_{n+1}(θ) | S_0, ..., S_n] = L_n(θ). For θ inside the domain of convergence of m_X(·), one can interchange the derivative and expectation, yielding

E[L'_{n+1}(θ) | S_0, ..., S_n] = L'_n(θ).

In particular, (L'_n(0) : n ≥ 0) is a martingale. But L'_n(0) = S_n − n ψ'(0). It turns out that ψ'(0) = EX_1. So, by differentiating our exponential martingale, we retrieve the random walk martingale. And by differentiating a second time, it turns out that L''_n(0) is the martingale of Example 0.1.1. Through successive differentiation, we can obtain a whole infinite family of such martingales.

Exercise 0.1.1
(a) Prove that ψ(·) is convex.
(b) Prove that ψ'(0) = EX_1.
(c) Prove that ψ''(0) = Var[X_1].
(d) Prove that L''_n(0) = (S_n − nμ)² − nσ², where μ = EX_1 and σ² = Var[X_1].
(e) Compute L'''_n(0).

We now turn to a fundamental result in the theory of martingales known as the Martingale Convergence Theorem.

Theorem 0.1.1 (Martingale Convergence Theorem in L²) Let (M_n : n ≥ 0) be a martingale with respect to (Z_n : n ≥ 0). If sup_{n≥0} EM_n² < ∞, then there exists a square-integrable random variable M_∞ such that E[(M_n − M_∞)²] → 0 as n → ∞, i.e. M_n converges to M_∞ in mean square.

Proof: The space L² of square-integrable random variables is a Hilbert space under the inner product ⟨X, Y⟩ = E[XY]. Since EM_n² = EM_0² + Σ_{i=1}^n ED_i², it follows that Σ_{i=1}^∞ ED_i² < ∞. For ε > 0, choose m = m(ε) so that Σ_{i=m}^∞ ED_i² < ε. Then, for n_2 > n_1 ≥ m,

E(M_{n_2} − M_{n_1})² = Σ_{j=n_1+1}^{n_2} ED_j² < ε,

so that (M_n : n ≥ 0) is a Cauchy sequence in L². The completeness of L² then yields the conclusion of the theorem.

Actually, one does not need square integrability in order for the Martingale Convergence Theorem to hold.

Theorem 0.1.2 (Martingale Convergence Theorem) Let (M_n : n ≥ 0) be a martingale with respect to (Z_n : n ≥ 0). If sup_{n≥0} E|M_n| < ∞, then there exists a finite-valued random variable M_∞ such that M_n → M_∞ a.s. as n → ∞.

For a proof, see p. 233 of Probability: Theory and Examples, 3rd ed., by R. Durrett.

We conclude this section with a brief discussion of stochastic integrals in discrete time. Let (M_n : n ≥ 0) be a square-integrable martingale with respect to (Z_n : n ≥ 0). Suppose that (W_n : n ≥ 0) is a sequence of random variables that is adapted to (Z_n : n ≥ 0). We define the stochastic integral of (W_n : n ≥ 0) with respect to (M_n : n ≥ 0) as the sequence

V_n = Σ_{i=1}^n W_{i−1} D_i = Σ_{i=1}^n W_{i−1} (M_i − M_{i−1}).

We could also have defined the stochastic integral here as Σ_{i=1}^n W_i (M_i − M_{i−1}). But in that case, we would lose the nice properties listed below.

Exercise 0.1.2 Let (M_n : n ≥ 0) be a square-integrable martingale with respect to (Z_n : n ≥ 0), with M_0 = 0. Suppose (W_n : n ≥ 0) is a square-integrable sequence that is adapted to (Z_n : n ≥ 0).
(a) Prove that if V_0 = 0 and V_n = Σ_{i=1}^n W_{i−1} (M_i − M_{i−1}) for n ≥ 1, then (V_n : n ≥ 0) is a martingale with respect to (Z_n : n ≥ 0).
(b) Suppose that the martingale differences (D_i : i ≥ 1) are a stationary sequence of independent random variables. Show that EV_n² = σ² Σ_{i=0}^{n−1} EW_i², where σ² = Var[D_i].

0.2 Optional Sampling for Discrete-Time Martingales

An important property of martingales is

EM_n = EM_0, n ≥ 0. (0.2.1)

The theory of optional sampling is concerned with extending (0.2.1) from deterministic n to random times T. As in the discussion of the strong Markov property, it is natural to restrict ourselves to stopping times. However,

EM_T = EM_0 (0.2.2)

fails to hold for all finite-valued stopping times T.

Example 0.2.1 Let (S_n : n ≥ 0) be a random walk with S_0 = 0 and iid increments (X_n : n ≥ 1) defined by P(X_n = 1) = P(X_n = −1) = 1/2. Put T = inf{n ≥ 0 : S_n = 1}. Since (S_n : n ≥ 0) is null recurrent, T < ∞ a.s. and S_T = 1. Therefore ES_T = 1 and ES_0 = 0, and so ES_T ≠ ES_0. Hence, the class of stopping times needs to be restricted somewhat.

Theorem 0.2.1 Let (M_n : n ≥ 0) be a martingale with respect to (Z_n : n ≥ 0). Suppose that T is a bounded random variable that is a stopping time with respect to (Z_n : n ≥ 0). Then EM_T = EM_0.

Proof: Let m be such that P(T ≤ m) = 1. Then M_T = M_0 + Σ_{i=1}^m D_i I(T ≥ i), and thus

EM_T = EM_0 + E Σ_{i=1}^m D_i I(T ≥ i). (0.2.3)

Because T is a stopping time, E[D_i I(T ≥ i) | Z_0, ..., Z_{i−1}] = I(T ≥ i) E[D_i | Z_0, ..., Z_{i−1}] = 0, and so

E Σ_{i=1}^m D_i I(T ≥ i) = 0.

If T is a stopping time, then T ∧ n is a stopping time for n ≥ 0 (and is clearly bounded). So, optional sampling applies at T ∧ n (see Theorem 0.2.1), i.e. EM_{T∧n} = EM_0 for n ≥ 0. If T < ∞ a.s., then M_{T∧n} → M_T a.s. as n → ∞. Hence, if

E lim_{n→∞} M_{T∧n} = lim_{n→∞} EM_{T∧n}, (0.2.4)

then (0.2.2) holds, since

EM_T = E lim_{n→∞} M_{T∧n} = lim_{n→∞} EM_{T∧n} = lim_{n→∞} EM_0 = EM_0.

Therefore, the key to establishing (0.2.2) is (0.2.4). There are various results which one can invoke to justify (0.2.4); the most powerful of these results is the Dominated Convergence Theorem. To apply this result, we need to find a random variable W having finite mean, such that |M_{T∧n}| ≤ W for n ≥ 0. The obvious candidate for W is

W = |M_0| + Σ_{i=1}^T |D_i|. (0.2.5)

So, if EW < ∞, we conclude that (0.2.4) is valid.

Proposition 0.2.1 Suppose that there exists c < ∞ such that P(|D_i| ≤ c) = 1 for i ≥ 1. If ET < ∞, then EM_T = EM_0.

Proof: Note that W ≤ |M_0| + cT. Since ET < ∞, EW < ∞. Then, the Dominated Convergence Theorem implies that EM_{T∧n} → EM_T as n → ∞, yielding the result.

Now, let's turn to an application of optional sampling.

Application 0.2.1 Let (S_n : n ≥ 0) be a random walk with S_0 = 0 and iid increments (X_n : n ≥ 1) defined by P(X_n = 1) = P(X_n = −1) = 1/2. Let T = inf{n ≥ 0 : S_n ≤ −a or S_n ≥ b} be the exit time from (−a, b). Suppose that we wish to compute P(S_T = −a), the probability that the random walk exits at the left boundary. (This is basically the gambler's ruin computation for the probability of ruin.) Note that |D_i| = 1 and ET < ∞ (see Exercise 0.2.1). Hence, Proposition 0.2.1 applies and ES_T = 0. However,

ES_T = −a P(S_T = −a) + b P(S_T = b) = −a P(S_T = −a) + b[1 − P(S_T = −a)].

Therefore, P(S_T = −a) = b/(a + b).

Exercise 0.2.1
(a) Prove that ET < ∞ in Application 0.2.1.
(b) Compute the value of P(S_T = −a) by setting up a suitable system of linear equations involving the unknowns P_x(S_T = −a) and solving them. (This is an alternative approach to computing the exit probability.)

Application 0.2.2 In this continuation of Application 0.2.1, we wish to compute ET. (In the gambler's ruin setting, this is the mean duration of the game.) Let M_n = S_n² − nσ², where σ² = Var X_i = 1. Assuming that (0.2.2) holds,

ES_T² = σ² ET = ET. (0.2.6)

Solving for ES_T², we have

ES_T² = a² P(S_T = −a) + b² P(S_T = b) = (a²b + ab²)/(a + b) = ab,

so ET = ab. Does Proposition 0.2.1 apply? Here,

D_i = S_i² − S_{i−1}² − σ² = (S_i + S_{i−1}) X_i − 1.

Clearly, the D_i do not satisfy the hypotheses of Proposition 0.2.1, so something else is needed here.

Proposition 0.2.2 Suppose that there exists c < ∞ for which

E[|D_i| | Z_0, Z_1, ..., Z_{i−1}] ≤ c on {T ≥ i} for i ≥ 1.

If ET < ∞, then EM_T = EM_0.

Proof: Note that

EW = E|M_0| + E Σ_{i=1}^T |D_i| = E|M_0| + Σ_{i=1}^∞ E[|D_i| I(T ≥ i)].

However, E[|D_i| I(T ≥ i) | Z_0, Z_1, ..., Z_{i−1}] = I(T ≥ i) E[|D_i| | Z_0, Z_1, ..., Z_{i−1}] ≤ c I(T ≥ i). Thus,

EW ≤ E|M_0| + c Σ_{i=1}^∞ E I(T ≥ i) = E|M_0| + c ET < ∞,

and consequently the Dominated Convergence Theorem applies.
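The conclusions of Applications 0.2.1 and 0.2.2 — P(S_T = −a) = b/(a + b) and ET = ab — can be checked by simulation; a = 3 and b = 5 below are arbitrary illustrative boundaries:

```python
import numpy as np

a, b = 3, 5          # illustrative boundaries for the exit time from (-a, b)
rng = np.random.default_rng(1)

def run(rng):
    """One simple symmetric random walk, run until it exits (-a, b)."""
    s, n = 0, 0
    while -a < s < b:
        s += 1 if rng.random() < 0.5 else -1
        n += 1
    return n, s

paths = [run(rng) for _ in range(20_000)]
times = np.array([t for t, _ in paths])
ruin = np.array([s == -a for _, s in paths])

print(ruin.mean())   # ~ b/(a + b) = 0.625
print(times.mean())  # ~ a*b = 15
```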

Application 0.2.2 (continued) Here, |D_i| ≤ (|S_i| + |S_{i−1}|) + 1. So, on {T ≥ i}, |D_i| ≤ 2 max(a, b) + 2, validating the hypotheses of Proposition 0.2.2, and thus completing the desired computation.

How do we perform corresponding calculations if the random walk does not have mean zero? Specifically, suppose that (S_n : n ≥ 0) is a random walk with S_0 = 0 and iid increments (X_n : n ≥ 1) given by P(X_n = 1) = p = 1 − P(X_n = −1). Here, the key is to switch to our exponential martingale.

Application 0.2.3 Here, m_X(θ) = p e^θ + (1 − p) e^{−θ}, so ψ(θ) = log(p e^θ + (1 − p) e^{−θ}). Then, the martingale of interest is L_n(θ) = e^{θ S_n − n ψ(θ)}. Assuming that optional sampling applies at time T, we arrive at EL_T(θ) = 1, or, in other words,

E e^{θ S_T − T ψ(θ)} = 1. (0.2.7)

To compute the exit probabilities from (−a, b), it is desirable to eliminate the term T ψ(θ) from the exponent of (0.2.7). Recall that ψ is convex (see Exercise 0.1.1). There exists a unique θ* ≠ 0 such that ψ(θ*) = 0, given by

θ* = log((1 − p)/p).

Substituting θ = θ* into (0.2.7), we get E e^{θ* S_T} = 1. But

E e^{θ* S_T} = e^{−θ* a} P(S_T = −a) + e^{θ* b} P(S_T = b).

Hence,

P(S_T = −a) = (((1 − p)/p)^b − 1) / (((1 − p)/p)^b − ((1 − p)/p)^{−a}).

(This is basically the probability of ruin in a gambler's ruin problem that is not fair.)

Exercise 0.2.2 Rigorously apply the optional sampling theorem in Application 0.2.3.

Application 0.2.4 Let (S_n : n ≥ 0) be a random walk with S_0 = 0 and iid increments (X_n : n ≥ 1) given by P(X_n = 1) = p = 1 − P(X_n = −1), with p > 1/2. This is a walk with positive drift, so that T < ∞ a.s. if we set T = inf{n ≥ 0 : S_n ≥ b}. Our goal here is to compute the moment generating function of T, using martingale methods. Assuming that we can invoke the optional sampling theorem at T,

E e^{θ S_T − T ψ(θ)} = 1. (0.2.8)

For T as described above, S_T = b. (This is a consequence of the "continuity" of the nearest-neighbor random walk. If X_i could take on values greater than or equal to 2, then S_T would not be deterministic, and this calculation would become much harder.) Relation (0.2.8) yields E e^{−T ψ(θ)} = e^{−θb}. Set γ = ψ(θ), so that θ = ψ^{−1}(γ).
Then,

E e^{−γ T} = e^{−ψ^{−1}(γ) b}

is the moment generating function (Laplace transform) of T. (In computing ψ^{−1}(γ), one may find multiple roots; to formally determine the appropriate root, note that the function E e^{−γT} of the non-negative random variable T must be non-increasing in γ.)

To make this result rigorous, note that if p > 1/2, then ψ'(0) > 0. The convexity of ψ(·) then guarantees that ψ(θ) > 0 for θ > 0. Consequently, for θ > 0,

e^{θ S_{T∧n} − ψ(θ)(T∧n)} ≤ e^{θ S_{T∧n}} ≤ e^{θ b},

so the Dominated Convergence Theorem ensures that (0.2.8) holds for θ > 0. Then, for γ > 0, let η = ψ^{−1}(γ) be the non-negative root of

ψ(η) = γ. (0.2.9)

Relation (0.2.9) yields the expression E e^{−γT} = e^{−ηb}, where ψ^{−1} is defined as above. Note that a rigorous application of optional sampling theory has led us to the correct choice of root for the equation (0.2.9).

A similar analysis is possible for the one-sided hitting time T_− = inf{n ≥ 0 : S_n ≤ −a} with a > 0. Since p > 1/2, T_− is infinite with positive probability in this case. Again, consider the sequence e^{θ S_{T_−∧n} − ψ(θ)(T_−∧n)}. Note that if θ < 0 and ψ(θ) > 0, this sequence is bounded above by e^{−θa}. Hence, we may interchange limits and expectations in the expression

e^{−θa} E[e^{−T_− ψ(θ)} I(T_− ≤ n)] + E[e^{θ S_n − n ψ(θ)} I(T_− > n)] = 1,

thereby yielding the identity

E[e^{−T_− ψ(θ)}; T_− < ∞] = e^{θa}

for θ < 0 satisfying ψ(θ) > 0. So, for γ ≥ 0, let η = ψ^{−1}_−(γ) be the root less than or equal to θ* = log((1 − p)/p) < 0 defined by ψ(η) = γ. For the root defined as above, we then have

E[e^{−γ T_−}; T_− < ∞] = e^{ψ^{−1}_−(γ) a}.

Note that by setting γ = 0, we obtain the identity P(T_− < ∞) = e^{θ* a} = ((1 − p)/p)^a. In other words, we have computed the probability that a positive-drift nearest-neighbor random walk ever drops below −a.

The theory of optional sampling extends beyond the martingale setting to supermartingales and submartingales.

Definition 0.2.1 Let (M_n : n ≥ 0) be an integrable sequence of random variables that is adapted to (Z_n : n ≥ 0). If, for n ≥ 0,

E[M_{n+1} | Z_0, ..., Z_n] ≤ M_n,

then (M_n : n ≥ 0) is said to be a supermartingale with respect to (Z_n : n ≥ 0).
On the other hand, if

E[M_{n+1} | Z_0, ..., Z_n] ≥ M_n,

then (M_n : n ≥ 0) is said to be a submartingale with respect to (Z_n : n ≥ 0).
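Before proceeding, the biased-walk formulas above can be checked numerically: the exit probability from Application 0.2.3 and the hitting probability P(T_− < ∞) = ((1 − p)/p)^a from Application 0.2.4. The parameters p = 0.6, a = 3, b = 5 are arbitrary illustrative choices, and the event {T_− < ∞} is approximated by stopping once the walk climbs to a high level (harmless here, since the walk almost never returns from there):

```python
import numpy as np

p, a, b = 0.6, 3, 5          # illustrative parameters, p > 1/2
r = (1 - p) / p
rng = np.random.default_rng(2)

def exit_left(rng):
    """One biased walk, run until it leaves (-a, b); True if it exits at -a."""
    s = 0
    while -a < s < b:
        s += 1 if rng.random() < p else -1
    return s == -a

est = np.mean([exit_left(rng) for _ in range(10_000)])
exact = (r**b - 1) / (r**b - r**(-a))
print(est, exact)            # both about 0.268

def ever_hits_minus_a(rng, cap=40):
    """Approximate {T_- < infinity} by stopping once the walk reaches cap."""
    s = 0
    while -a < s < cap:
        s += 1 if rng.random() < p else -1
    return s == -a

est2 = np.mean([ever_hits_minus_a(rng) for _ in range(10_000)])
print(est2, r**a)            # both about (2/3)^3 = 0.296
```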

If M_n corresponds to the fortune of a gambler at time n, then a supermartingale indicates that the game is unfavorable to the gambler, whereas a submartingale indicates that the game is favorable.

Proposition 0.2.3 Let T be a stopping time with respect to (Z_n : n ≥ 0). If (M_n : n ≥ 0) is a supermartingale with respect to (Z_n : n ≥ 0), then

EM_{T∧n} ≤ EM_0, n ≥ 0.

On the other hand, if (M_n : n ≥ 0) is a submartingale with respect to (Z_n : n ≥ 0), then

EM_{T∧n} ≥ EM_0, n ≥ 0.

Exercise 0.2.3 Prove Proposition 0.2.3.

Exercise 0.2.4 Let (M_n : n ≥ 0) be a martingale with respect to (Z_n : n ≥ 0). Suppose that φ : R → R is a convex function for which E|φ(M_n)| < ∞ for n ≥ 0. Prove that (φ(M_n) : n ≥ 0) is a submartingale with respect to (Z_n : n ≥ 0).

0.3 Martingales for Discrete-Time Markov Chains

In this section, we show how the random walk martingales introduced earlier generalize to the DTMC setting. Each of the martingales constructed here will have natural analogs in the SDE context.

Let (Y_n : n ≥ 0) be a real-valued sequence of random variables, not necessarily Markov. A standard trick for constructing a martingale in this very general setting is to set

D_i = Y_i − E[Y_i | Y_0, ..., Y_{i−1}]

for i ≥ 1. Assuming that the Y_i's are integrable, the D_i's are martingale differences with respect to the Y_i's. Hence,

M_n = Σ_{i=1}^n (Y_i − E[Y_i | Y_0, ..., Y_{i−1}])

is a martingale.

The same kind of idea works nicely in the DTMC setting. For f : S → R that is bounded, note that

D_i = f(X_i) − E[f(X_i) | X_0, ..., X_{i−1}] = f(X_i) − E[f(X_i) | X_{i−1}] = f(X_i) − (Pf)(X_{i−1})

is a martingale difference with respect to (X_i : i ≥ 0). Hence,

M_n = Σ_{i=1}^n [f(X_i) − (Pf)(X_{i−1})]

is a mean-zero martingale. But

M_n = Σ_{i=1}^n [f(X_i) − (Pf)(X_{i−1})]
    = Σ_{i=0}^{n−1} [f(X_i) − (Pf)(X_i)] + f(X_n) − f(X_0)
    = f(X_n) − f(X_0) − Σ_{i=0}^{n−1} (Af)(X_i),

where A = P − I.

It follows easily that M_n = f(X_n) − Σ_{i=0}^{n−1} (Af)(X_i) is a martingale whenever f is bounded. We have proved the following result.

Proposition 0.3.1 For f : S → R bounded, M_n = f(X_n) − Σ_{i=0}^{n−1} (Af)(X_i) is a martingale with respect to (X_n : n ≥ 0).

This martingale is known as the Dynkin martingale. Viewing (Af)(X_i) as the increment of a random-walk-type process, this is clearly the DTMC analog of the random walk martingale.

Suppose that Af = 0. Then Proposition 0.3.1 implies that (f(X_n) : n ≥ 0) is a martingale with respect to (X_n : n ≥ 0).

Definition 0.3.1 A function f : S → R for which Af = 0 is called a harmonic function.

The term "harmonic function" is widely used in the analysis literature. It refers to functions f : R^d → R for which Δf = 0, where

Δ = ∂²/∂x_1² + ∂²/∂x_2² + ... + ∂²/∂x_d².

(The operator Δ is known as the Laplacian operator.) Note that if the Markov chain X corresponds (for example) to simple random walk on the lattice plane, then

P((x_1, y_1), (x_2, y_2)) = 1/4 if (x_2, y_2) ∈ {(x_1 + 1, y_1), (x_1 − 1, y_1), (x_1, y_1 + 1), (x_1, y_1 − 1)}, and 0 otherwise.

Requiring that f be harmonic in this setting forces f to satisfy

[f(x_1 + 1, y_1) + f(x_1 − 1, y_1) + f(x_1, y_1 + 1) + f(x_1, y_1 − 1) − 4 f(x_1, y_1)] / 4 = 0. (0.3.1)

The left-hand side turns out to be a finite-difference approximation to Δf in two dimensions. Thus, Definition 0.3.1 legitimately extends the classical notion of harmonic functions.

Proposition 0.3.2
(a) If f is a bounded function for which Af ≤ 0, then (f(X_n) : n ≥ 0) is a supermartingale with respect to (X_n : n ≥ 0).
(b) If f is a bounded function for which Af ≥ 0, then (f(X_n) : n ≥ 0) is a submartingale with respect to (X_n : n ≥ 0).

Exercise 0.3.1 Prove Proposition 0.3.2.

Definition 0.3.2 A function f for which Af ≤ 0 is said to be superharmonic. If instead Af ≥ 0, then f is said to be subharmonic.

Again, this definition extends the classical usage, which states that f is superharmonic if Δf ≤ 0 and subharmonic if Δf ≥ 0.
It is in order to remain consistent with this classical usage that we apply the term "supermartingale" (rather than "submartingale") to an unfavorable game, in which M_n has a tendency to decrease in expectation. There is a nice connection between harmonic functions and recurrence.
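The Dynkin martingale is easy to test numerically: for any transition matrix P and function f, the sample average of M_n = f(X_n) − Σ_{i=0}^{n−1}(Af)(X_i) should match EM_0 = f(X_0). The 4-state chain and f below are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# A small made-up 4-state transition matrix and test function (illustrative only).
P = np.array([[0.10, 0.40, 0.30, 0.20],
              [0.30, 0.10, 0.20, 0.40],
              [0.25, 0.25, 0.25, 0.25],
              [0.20, 0.30, 0.40, 0.10]])
f = np.array([1.0, -2.0, 0.5, 3.0])
Af = P @ f - f                     # generator A = P - I applied to f

def dynkin(x0, n, rng):
    """Sample M_n = f(X_n) - sum_{i=0}^{n-1} (Af)(X_i) along one path."""
    x, comp = x0, 0.0
    for _ in range(n):
        comp += Af[x]
        x = rng.choice(4, p=P[x])  # one step of the chain
    return f[x] - comp

vals = [dynkin(0, 10, rng) for _ in range(20_000)]
print(np.mean(vals))               # ~ EM_10 = EM_0 = f(0) = 1.0
```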

Exercise 0.3.2 Suppose that X is an irreducible DTMC.
(a) If X is recurrent, prove that all the bounded harmonic functions are constant. (Hint: This is easy if |S| < ∞. To prove the general case, use Theorem 0.1.2.)
(b) If X is transient, show that there always exists at least one non-constant bounded harmonic function.

To apply martingale theory to additive processes of the form

Σ_{j=0}^{n−1} g(X_j) (0.3.2)

with X Markov, the obvious device to apply is Proposition 0.3.1. So, note that if we could find f such that

Af = −g, (0.3.3)

then we effectively would have our desired martingale for (0.3.2), namely

M_n = f(X_n) + Σ_{j=0}^{n−1} g(X_j). (0.3.4)

(In the Markov setting, one cannot expect (0.3.2) itself to be a martingale — it just isn't. But (0.3.4) shows that it can be represented as a martingale if one adds on the correction term f(X_n).) Because (0.3.3) plays a key role in representing (0.3.2) as a martingale, this equation has an important place in the theory of Markov processes. Equation (0.3.3) is called Poisson's equation. (In the symmetric simple random walk setting, (0.3.3) is just a finite-difference approximation to Δf = −g, which is Poisson's equation in the partial differential equations setting.)

Poisson's equation need not have a solution for arbitrary g.

Exercise 0.3.3 Suppose that X is an irreducible transient DTMC. If g has finite support (i.e. {x ∈ S : g(x) ≠ 0} has finite cardinality), show that Poisson's equation has a solution.

Exercise 0.3.4 Suppose that X is an irreducible finite-state DTMC. Let π be the stationary distribution of X. Let Π be the matrix in which all rows are identical to π.
(a) Prove that ΠP = PΠ = Π².
(b) Prove that (P − Π)^n = P^n − Π for n ≥ 1.
(c) Prove that if X is aperiodic, then Σ_{n=0}^∞ (P − Π)^n converges absolutely.
(d) Prove that if X is aperiodic, then (I − P + Π)^{−1} exists.
(e) Extend (d) to the periodic case.
(f) Prove that if g is such that πg = 0, then f = (Π − A)^{−1} g solves Poisson's equation Af = −g.
(g) Prove that if g is such that πg ≠ 0, then Af = −g has no solution.
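Exercise 0.3.4(f) can be verified numerically: for a randomly generated irreducible chain and a centered g, the vector f = (Π − A)^{−1} g = (I − P + Π)^{−1} g satisfies Af = −g. A sketch:

```python
import numpy as np

rng = np.random.default_rng(4)

# Made-up irreducible 5-state chain (all entries positive, rows normalized).
P = rng.random((5, 5))
P /= P.sum(axis=1, keepdims=True)

# Stationary distribution pi: left eigenvector of P for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi /= pi.sum()
Pi = np.tile(pi, (5, 1))           # matrix with all rows equal to pi

g = rng.random(5)
g = g - pi @ g                     # center g so that pi g = 0

# f = (Pi - A)^{-1} g = (I - P + Pi)^{-1} g, with A = P - I.
f = np.linalg.solve(np.eye(5) - P + Pi, g)
A = P - np.eye(5)
print(np.max(np.abs(A @ f + g)))   # ~ 0: f solves Poisson's equation Af = -g
```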

Exercise 0.3.5 We extend here the existence of solutions to Poisson's equation to infinite-state irreducible positive recurrent Markov chains X = (X_n : n ≥ 0). Let f : S → R be such that Σ_x π(x)|f(x)| < ∞. Set f_c(x) = f(x) − Σ_y π(y)f(y), and put

u*(x) = E_x Σ_{n=0}^{τ(z)−1} f_c(X_n),

where τ(z) = inf{n ≥ 1 : X_n = z}.
(a) Prove that E_x Σ_{n=0}^{τ(z)−1} |f_c(X_n)| < ∞ for each x ∈ S (so that u*(·) is finite-valued).
(b) Prove that u*(x) = f_c(x) + Σ_{y∈S} P(x, y) u*(y), so that u* is a solution of Poisson's equation.

We now turn to developing an analog of the likelihood ratio martingale that was discussed in the random walk setting. Let X = (X_n : n ≥ 0) be an S-valued DTMC with initial distribution ν and (one-step) transition matrix Q = (Q(x, y) : x, y ∈ S). Suppose that we select a stochastic vector μ and transition matrix P such that:

(i) μ(x) = 0 whenever ν(x) = 0, for x ∈ S;
(ii) P(x, y) = 0 whenever Q(x, y) = 0, for x, y ∈ S.

Proposition 0.3.3 The sequence (L_n : n ≥ 0) is a martingale with respect to (X_n : n ≥ 0), where

L_n = (μ(X_0)/ν(X_0)) Π_{j=0}^{n−1} P(X_j, X_{j+1})/Q(X_j, X_{j+1}), n ≥ 0.

Exercise 0.3.6 Prove Proposition 0.3.3.

We close this section with a discussion of the exponential martingale's extension to the DTMC setting. Suppose that we wish to study an additive process of the form Σ_{j=0}^{n−1} g(X_j), where (X_n : n ≥ 0) is an irreducible finite-state DTMC. In the random walk setting, the moment generating function of the random walk played a critical role in constructing the exponential martingale. This suggests considering

u_n(θ, x, y) = E_x[e^{θ Σ_{j=0}^{n−1} g(X_j)}; X_n = y]

for x, y ∈ S. Observe that

u_n(θ, x, y) = Σ_{x_1,...,x_{n−1}} e^{θg(x)} P(x, x_1) e^{θg(x_1)} P(x_1, x_2) ··· e^{θg(x_{n−1})} P(x_{n−1}, y) = K^n(θ, x, y),

where K^n(θ, x, y) is the (x, y)'th component of the nth power of the matrix K(θ), where

K(θ, x, y) = e^{θg(x)} P(x, y). (0.3.5)

Note that K(θ) is a non-negative finite irreducible matrix. Then, the Perron-Frobenius theorem for non-negative matrices implies that there exists a positive eigenvalue λ(θ) and a corresponding positive column eigenvector r(θ) such that

K(θ) r(θ) = λ(θ) r(θ). (0.3.6)

Let ψ(θ) = log λ(θ). We can rewrite (0.3.6) as

e^{−ψ(θ)} Σ_y K(θ, x, y) r(θ, y)/r(θ, x) = 1, x ∈ S. (0.3.7)

Substituting (0.3.5) into (0.3.7), we obtain

Σ_y e^{θg(x) − ψ(θ)} (r(θ, y)/r(θ, x)) P(x, y) = 1,

or equivalently,

E_x[e^{θg(x) − ψ(θ)} r(θ, X_1)/r(θ, X_0)] = 1.

Proposition 0.3.4 For each θ ∈ R,

L_n(θ) = e^{θ Σ_{j=0}^{n−1} g(X_j) − nψ(θ)} r(θ, X_n)/r(θ, X_0)

is a martingale with respect to (X_n : n ≥ 0).

Proof: The critical verification involves showing that E[L_{n+1}(θ) | X_0, ..., X_n] = L_n(θ). But

E[L_{n+1}(θ) | X_0, ..., X_n] = L_n(θ) E[e^{θg(X_n) − ψ(θ)} r(θ, X_{n+1})/r(θ, X_n) | X_0, ..., X_n] = L_n(θ).

We can rewrite this martingale as follows. Set h(θ, x) = log r(θ, x). Then, Proposition 0.3.4 asserts that

e^{h(θ, X_n) − h(θ, X_0) + θ Σ_{j=0}^{n−1} g(X_j) − nψ(θ)}

is a martingale. This exponential martingale can be used in a manner identical to the random walk setting to study Σ_{j=0}^{n−1} g(X_j).

0.4 The Strong Law for Martingales

As for sums of independent mean-zero random variables, we expect that, in great generality,

n^{−1} Σ_{i=1}^n D_i → 0 a.s. (0.4.1)

as n → ∞. This is easy to establish if we weaken the a.s. convergence to convergence in probability, since, by Chebyshev's inequality and the orthogonality of the martingale differences,

P(|n^{−1} Σ_{i=1}^n D_i| > ε) ≤ ε^{−2} E(n^{−1} Σ_{i=1}^n D_i)² = n^{−2} ε^{−2} Σ_{i=1}^n ED_i²,

so that if

sup_n ED_n² < ∞, (0.4.2)

it clearly follows that

n^{−1} Σ_{i=1}^n D_i → 0 in probability (0.4.3)

as n → ∞. To strengthen (0.4.3) to a.s. convergence, we need to apply the Martingale Convergence Theorem. Since (n^{−1} Σ_{i=1}^n D_i : n ≥ 1) is not a martingale, we need to use something to bridge the gap between n^{−1} Σ_{i=1}^n D_i and the world of martingales. The appropriate bridge is Kronecker's lemma.

Kronecker's Lemma: If (x_n : n ≥ 1) and (a_n : n ≥ 1) are two real-valued sequences for which (a_n : n ≥ 1) is non-negative and increasing to infinity, then the existence of a finite-valued z such that

Σ_{j=1}^n (x_j/a_j) → z as n → ∞

implies that

a_n^{−1} Σ_{j=1}^n x_j → 0

as n → ∞.

To apply this result in our martingale setting, let

M̃_n = Σ_{j=1}^n (D_j/j),

and observe that (M̃_n : n ≥ 0) is a martingale for which

E M̃_n² = Σ_{j=1}^n ED_j²/j² ≤ Σ_{j=1}^∞ ED_j²/j²,

so that in the presence of (0.4.2), the Martingale Convergence Theorem can be applied, yielding the conclusion that there exists a finite-valued M̃_∞ for which

M̃_n → M̃_∞ a.s.

as n → ∞. An application of Kronecker's lemma path-by-path (with a_n = n, x_n = D_n) then yields

n^{−1} Σ_{i=1}^n D_i → 0 a.s.

as n → ∞.
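A quick numerical illustration of this strong law, using martingale differences that are deliberately not iid. The construction D_i = sin(M_{i−1}) ξ_i, with iid symmetric signs ξ_i, is our own illustrative choice: E[D_i | past] = 0 and |D_i| ≤ 1, so (0.4.2) holds:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Martingale differences that are NOT iid: D_i = sin(M_{i-1}) * xi_i, with
# xi_i iid symmetric signs, so E[D_i | past] = 0 and |D_i| <= 1.
# (An illustrative construction, not from the notes.)
xi = rng.choice((-1.0, 1.0), size=n)
M = 1.0
D = np.empty(n)
for i in range(n):
    D[i] = np.sin(M) * xi[i]
    M += D[i]

avg = D.cumsum() / np.arange(1, n + 1)
print(abs(avg[-1]))   # small: n^{-1} sum of the D_i tends to 0 a.s.
```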

Exercise 0.4.1 Use the above argument to prove the strong law

n^{−1} Σ_{i=0}^{n−1} f(X_i) → Σ_z π(z) f(z) a.s.

as n → ∞, for a given finite-state irreducible Markov chain (with equilibrium distribution π = (π(x) : x ∈ S)).

0.5 The Central Limit Theorem for Martingales

We discuss here general conditions under which

n^{−1/2} Σ_{i=1}^n D_i ⇒ σ N(0, 1)

as n → ∞ in discrete time, or under which

t^{−1/2} M(t) ⇒ σ N(0, 1) (0.5.1)

as t → ∞ in continuous time. Since discrete-time martingales are just a special case of continuous-time martingales, we focus on (0.5.1). Note that

M(t) = M(0) + Σ_{i=1}^n (M(it/n) − M((i−1)t/n)),

so that, in the presence of square integrability,

EM²(t) = EM²(0) + Σ_{i=1}^n E(M(it/n) − M((i−1)t/n))².

For a given square-integrable martingale (M(t) : t ≥ 0), define the quadratic variation of M to be

[M](t) = lim_{n→∞} Σ_{i=1}^n (M(it/n) − M((i−1)t/n))².

Theorem 0.5.1 Let (M(t) : t ≥ 0) be a square-integrable martingale with right-continuous paths with left limits. If either

t^{−1} E sup_{0≤s≤t} |M(s) − M(s−)| → 0 and t^{−1} [M](t) → σ² in probability

as t → ∞, or

t^{−1} E sup_{0≤s≤t} |M(s) − M(s−)|² → 0, t^{−1} E sup_{0≤s≤t} |⟨M⟩(s) − ⟨M⟩(s−)| → 0,

and

t^{−1} ⟨M⟩(t) → σ² in probability

as t → ∞, then

t^{−1/2} M(t) ⇒ σ N(0, 1)

as t → ∞. (Here ⟨M⟩ denotes the predictable quadratic variation of M.)

Remark 0.5.1 Note that Markov jump processes have right-continuous paths with left limits, so this result applies in the Markov jump process setting.

Remark 0.5.2 When specialized to discrete time,

[M](n) = M²(0) + Σ_{i=1}^n D_i²  and  ⟨M⟩(n) = Σ_{i=1}^n E[D_i² | Z_0, ..., Z_{i−1}].

Exercise 0.5.1 Use the Martingale CLT to prove that there exists σ for which

n^{−1/2} (Σ_{i=0}^{n−1} f(X_i) − n Σ_z π(z) f(z)) ⇒ σ N(0, 1)

as n → ∞, provided that (X_n : n ≥ 0) is a finite-state irreducible Markov chain.
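For a two-state chain, one can compare a simulation of Exercise 0.5.1 against a known limiting variance. With P(0→1) = α and P(1→0) = β (illustrative parameters below) and f the indicator of state 1, π(1) = α/(α+β), and the standard two-state closed form σ² = αβ(2−α−β)/(α+β)³ is quoted from the general Markov chain CLT literature, not derived in these notes:

```python
import numpy as np

# Two-state chain: P(0->1) = alpha, P(1->0) = beta (illustrative parameters).
alpha, beta = 0.3, 0.2
pi1 = alpha / (alpha + beta)                              # pi(1) = 0.6
sigma2 = alpha * beta * (2 - alpha - beta) / (alpha + beta) ** 3   # = 0.72

rng = np.random.default_rng(6)
reps, n = 4_000, 4_000
x = (rng.random(reps) < pi1).astype(int)   # start each path in stationarity
sums = np.zeros(reps)
for _ in range(n):
    sums += x                               # accumulate f(X_i) = X_i
    u = rng.random(reps)
    x = np.where(x == 0, (u < alpha).astype(int), (u >= beta).astype(int))

# Scaled, centered additive functional: should be ~ N(0, sigma^2).
y = (sums - n * pi1) / np.sqrt(n)
print(y.var(), sigma2)                      # sample variance ~ 0.72
```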


More information

Lecture 4. Finite difference and finite element methods

Lecture 4. Finite difference and finite element methods Finite difference and finite element methods Lecture 4 Outline Black-Scholes equation From expectation to PDE Goal: compute the value of European option with payoff g which is the conditional expectation

More information

Lecture 1: Lévy processes

Lecture 1: Lévy processes Lecture 1: Lévy processes A. E. Kyprianou Department of Mathematical Sciences, University of Bath 1/ 22 Lévy processes 2/ 22 Lévy processes A process X = {X t : t 0} defined on a probability space (Ω,

More information

Homework Assignments

Homework Assignments Homework Assignments Week 1 (p. 57) #4.1, 4., 4.3 Week (pp 58 6) #4.5, 4.6, 4.8(a), 4.13, 4.0, 4.6(b), 4.8, 4.31, 4.34 Week 3 (pp 15 19) #1.9, 1.1, 1.13, 1.15, 1.18 (pp 9 31) #.,.6,.9 Week 4 (pp 36 37)

More information

Stochastic Dynamical Systems and SDE s. An Informal Introduction

Stochastic Dynamical Systems and SDE s. An Informal Introduction Stochastic Dynamical Systems and SDE s An Informal Introduction Olav Kallenberg Graduate Student Seminar, April 18, 2012 1 / 33 2 / 33 Simple recursion: Deterministic system, discrete time x n+1 = f (x

More information

1 IEOR 4701: Notes on Brownian Motion

1 IEOR 4701: Notes on Brownian Motion Copyright c 26 by Karl Sigman IEOR 47: Notes on Brownian Motion We present an introduction to Brownian motion, an important continuous-time stochastic process that serves as a continuous-time analog to

More information

Stochastic Calculus - An Introduction

Stochastic Calculus - An Introduction Stochastic Calculus - An Introduction M. Kazim Khan Kent State University. UET, Taxila August 15-16, 17 Outline 1 From R.W. to B.M. B.M. 3 Stochastic Integration 4 Ito s Formula 5 Recap Random Walk Consider

More information

Optimal Stopping. Nick Hay (presentation follows Thomas Ferguson s Optimal Stopping and Applications) November 6, 2008

Optimal Stopping. Nick Hay (presentation follows Thomas Ferguson s Optimal Stopping and Applications) November 6, 2008 (presentation follows Thomas Ferguson s and Applications) November 6, 2008 1 / 35 Contents: Introduction Problems Markov Models Monotone Stopping Problems Summary 2 / 35 The Secretary problem You have

More information

RMSC 4005 Stochastic Calculus for Finance and Risk. 1 Exercises. (c) Let X = {X n } n=0 be a {F n }-supermartingale. Show that.

RMSC 4005 Stochastic Calculus for Finance and Risk. 1 Exercises. (c) Let X = {X n } n=0 be a {F n }-supermartingale. Show that. 1. EXERCISES RMSC 45 Stochastic Calculus for Finance and Risk Exercises 1 Exercises 1. (a) Let X = {X n } n= be a {F n }-martingale. Show that E(X n ) = E(X ) n N (b) Let X = {X n } n= be a {F n }-submartingale.

More information

Definition 9.1 A point estimate is any function T (X 1,..., X n ) of a random sample. We often write an estimator of the parameter θ as ˆθ.

Definition 9.1 A point estimate is any function T (X 1,..., X n ) of a random sample. We often write an estimator of the parameter θ as ˆθ. 9 Point estimation 9.1 Rationale behind point estimation When sampling from a population described by a pdf f(x θ) or probability function P [X = x θ] knowledge of θ gives knowledge of the entire population.

More information

BROWNIAN MOTION Antonella Basso, Martina Nardon

BROWNIAN MOTION Antonella Basso, Martina Nardon BROWNIAN MOTION Antonella Basso, Martina Nardon basso@unive.it, mnardon@unive.it Department of Applied Mathematics University Ca Foscari Venice Brownian motion p. 1 Brownian motion Brownian motion plays

More information

Probability without Measure!

Probability without Measure! Probability without Measure! Mark Saroufim University of California San Diego msaroufi@cs.ucsd.edu February 18, 2014 Mark Saroufim (UCSD) It s only a Game! February 18, 2014 1 / 25 Overview 1 History of

More information

Brownian Motion. Richard Lockhart. Simon Fraser University. STAT 870 Summer 2011

Brownian Motion. Richard Lockhart. Simon Fraser University. STAT 870 Summer 2011 Brownian Motion Richard Lockhart Simon Fraser University STAT 870 Summer 2011 Richard Lockhart (Simon Fraser University) Brownian Motion STAT 870 Summer 2011 1 / 33 Purposes of Today s Lecture Describe

More information

STOCHASTIC CALCULUS AND BLACK-SCHOLES MODEL

STOCHASTIC CALCULUS AND BLACK-SCHOLES MODEL STOCHASTIC CALCULUS AND BLACK-SCHOLES MODEL YOUNGGEUN YOO Abstract. Ito s lemma is often used in Ito calculus to find the differentials of a stochastic process that depends on time. This paper will introduce

More information

Midterm Exam: Tuesday 28 March in class Sample exam problems ( Homework 5 ) available tomorrow at the latest

Midterm Exam: Tuesday 28 March in class Sample exam problems ( Homework 5 ) available tomorrow at the latest Plan Martingales 1. Basic Definitions 2. Examles 3. Overview of Results Reading: G&S Section 12.1-12.4 Next Time: More Martingales Midterm Exam: Tuesday 28 March in class Samle exam roblems ( Homework

More information

1 Geometric Brownian motion

1 Geometric Brownian motion Copyright c 05 by Karl Sigman Geometric Brownian motion Note that since BM can take on negative values, using it directly for modeling stock prices is questionable. There are other reasons too why BM is

More information

Definition 4.1. In a stochastic process T is called a stopping time if you can tell when it happens.

Definition 4.1. In a stochastic process T is called a stopping time if you can tell when it happens. 102 OPTIMAL STOPPING TIME 4. Optimal Stopping Time 4.1. Definitions. On the first day I explained the basic problem using one example in the book. On the second day I explained how the solution to the

More information

BROWNIAN MOTION II. D.Majumdar

BROWNIAN MOTION II. D.Majumdar BROWNIAN MOTION II D.Majumdar DEFINITION Let (Ω, F, P) be a probability space. For each ω Ω, suppose there is a continuous function W(t) of t 0 that satisfies W(0) = 0 and that depends on ω. Then W(t),

More information

Math 489/Math 889 Stochastic Processes and Advanced Mathematical Finance Dunbar, Fall 2007

Math 489/Math 889 Stochastic Processes and Advanced Mathematical Finance Dunbar, Fall 2007 Steven R. Dunbar Department of Mathematics 203 Avery Hall University of Nebraska-Lincoln Lincoln, NE 68588-0130 http://www.math.unl.edu Voice: 402-472-3731 Fax: 402-472-8466 Math 489/Math 889 Stochastic

More information

Log-linear Dynamics and Local Potential

Log-linear Dynamics and Local Potential Log-linear Dynamics and Local Potential Daijiro Okada and Olivier Tercieux [This version: November 28, 2008] Abstract We show that local potential maximizer ([15]) with constant weights is stochastically

More information

Arbitrages and pricing of stock options

Arbitrages and pricing of stock options Arbitrages and pricing of stock options Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ November

More information

The stochastic calculus

The stochastic calculus Gdansk A schedule of the lecture Stochastic differential equations Ito calculus, Ito process Ornstein - Uhlenbeck (OU) process Heston model Stopping time for OU process Stochastic differential equations

More information

STAT 830 Convergence in Distribution

STAT 830 Convergence in Distribution STAT 830 Convergence in Distribution Richard Lockhart Simon Fraser University STAT 830 Fall 2013 Richard Lockhart (Simon Fraser University) STAT 830 Convergence in Distribution STAT 830 Fall 2013 1 / 31

More information

Stochastic calculus Introduction I. Stochastic Finance. C. Azizieh VUB 1/91. C. Azizieh VUB Stochastic Finance

Stochastic calculus Introduction I. Stochastic Finance. C. Azizieh VUB 1/91. C. Azizieh VUB Stochastic Finance Stochastic Finance C. Azizieh VUB C. Azizieh VUB Stochastic Finance 1/91 Agenda of the course Stochastic calculus : introduction Black-Scholes model Interest rates models C. Azizieh VUB Stochastic Finance

More information

FE 5204 Stochastic Differential Equations

FE 5204 Stochastic Differential Equations Instructor: Jim Zhu e-mail:zhu@wmich.edu http://homepages.wmich.edu/ zhu/ January 13, 2009 Stochastic differential equations deal with continuous random processes. They are idealization of discrete stochastic

More information

M5MF6. Advanced Methods in Derivatives Pricing

M5MF6. Advanced Methods in Derivatives Pricing Course: Setter: M5MF6 Dr Antoine Jacquier MSc EXAMINATIONS IN MATHEMATICS AND FINANCE DEPARTMENT OF MATHEMATICS April 2016 M5MF6 Advanced Methods in Derivatives Pricing Setter s signature...........................................

More information

MTH The theory of martingales in discrete time Summary

MTH The theory of martingales in discrete time Summary MTH 5220 - The theory of martingales in discrete time Summary This document is in three sections, with the first dealing with the basic theory of discrete-time martingales, the second giving a number of

More information

Lecture Notes for Chapter 6. 1 Prototype model: a one-step binomial tree

Lecture Notes for Chapter 6. 1 Prototype model: a one-step binomial tree Lecture Notes for Chapter 6 This is the chapter that brings together the mathematical tools (Brownian motion, Itô calculus) and the financial justifications (no-arbitrage pricing) to produce the derivative

More information

Class Notes on Financial Mathematics. No-Arbitrage Pricing Model

Class Notes on Financial Mathematics. No-Arbitrage Pricing Model Class Notes on No-Arbitrage Pricing Model April 18, 2016 Dr. Riyadh Al-Mosawi Department of Mathematics, College of Education for Pure Sciences, Thiqar University References: 1. Stochastic Calculus for

More information

Convergence. Any submartingale or supermartingale (Y, F) converges almost surely if it satisfies E Y n <. STAT2004 Martingale Convergence

Convergence. Any submartingale or supermartingale (Y, F) converges almost surely if it satisfies E Y n <. STAT2004 Martingale Convergence Convergence Martingale convergence theorem Let (Y, F) be a submartingale and suppose that for all n there exist a real value M such that E(Y + n ) M. Then there exist a random variable Y such that Y n

More information

18.440: Lecture 32 Strong law of large numbers and Jensen s inequality

18.440: Lecture 32 Strong law of large numbers and Jensen s inequality 18.440: Lecture 32 Strong law of large numbers and Jensen s inequality Scott Sheffield MIT 1 Outline A story about Pedro Strong law of large numbers Jensen s inequality 2 Outline A story about Pedro Strong

More information

Lecture Notes 6. Assume F belongs to a family of distributions, (e.g. F is Normal), indexed by some parameter θ.

Lecture Notes 6. Assume F belongs to a family of distributions, (e.g. F is Normal), indexed by some parameter θ. Sufficient Statistics Lecture Notes 6 Sufficiency Data reduction in terms of a particular statistic can be thought of as a partition of the sample space X. Definition T is sufficient for θ if the conditional

More information

Math489/889 Stochastic Processes and Advanced Mathematical Finance Homework 4

Math489/889 Stochastic Processes and Advanced Mathematical Finance Homework 4 Math489/889 Stochastic Processes and Advanced Mathematical Finance Homework 4 Steve Dunbar Due Mon, October 5, 2009 1. (a) For T 0 = 10 and a = 20, draw a graph of the probability of ruin as a function

More information

The ruin probabilities of a multidimensional perturbed risk model

The ruin probabilities of a multidimensional perturbed risk model MATHEMATICAL COMMUNICATIONS 231 Math. Commun. 18(2013, 231 239 The ruin probabilities of a multidimensional perturbed risk model Tatjana Slijepčević-Manger 1, 1 Faculty of Civil Engineering, University

More information

CS134: Networks Spring Random Variables and Independence. 1.2 Probability Distribution Function (PDF) Number of heads Probability 2 0.

CS134: Networks Spring Random Variables and Independence. 1.2 Probability Distribution Function (PDF) Number of heads Probability 2 0. CS134: Networks Spring 2017 Prof. Yaron Singer Section 0 1 Probability 1.1 Random Variables and Independence A real-valued random variable is a variable that can take each of a set of possible values in

More information

An Introduction to Stochastic Calculus

An Introduction to Stochastic Calculus An Introduction to Stochastic Calculus Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 5 Haijun Li An Introduction to Stochastic Calculus Week 5 1 / 20 Outline 1 Martingales

More information

Chapter 6. Importance sampling. 6.1 The basics

Chapter 6. Importance sampling. 6.1 The basics Chapter 6 Importance sampling 6.1 The basics To movtivate our discussion consider the following situation. We want to use Monte Carlo to compute µ E[X]. There is an event E such that P(E) is small but

More information

Risk, Return, and Ross Recovery

Risk, Return, and Ross Recovery Risk, Return, and Ross Recovery Peter Carr and Jiming Yu Courant Institute, New York University September 13, 2012 Carr/Yu (NYU Courant) Risk, Return, and Ross Recovery September 13, 2012 1 / 30 P, Q,

More information

Introduction to Stochastic Calculus With Applications

Introduction to Stochastic Calculus With Applications Introduction to Stochastic Calculus With Applications Fima C Klebaner University of Melbourne \ Imperial College Press Contents Preliminaries From Calculus 1 1.1 Continuous and Differentiable Functions.

More information

CHOICE THEORY, UTILITY FUNCTIONS AND RISK AVERSION

CHOICE THEORY, UTILITY FUNCTIONS AND RISK AVERSION CHOICE THEORY, UTILITY FUNCTIONS AND RISK AVERSION Szabolcs Sebestyén szabolcs.sebestyen@iscte.pt Master in Finance INVESTMENTS Sebestyén (ISCTE-IUL) Choice Theory Investments 1 / 65 Outline 1 An Introduction

More information

Martingales, Part II, with Exercise Due 9/21

Martingales, Part II, with Exercise Due 9/21 Econ. 487a Fall 1998 C.Sims Martingales, Part II, with Exercise Due 9/21 1. Brownian Motion A process {X t } is a Brownian Motion if and only if i. it is a martingale, ii. t is a continuous time parameter

More information

AMH4 - ADVANCED OPTION PRICING. Contents

AMH4 - ADVANCED OPTION PRICING. Contents AMH4 - ADVANCED OPTION PRICING ANDREW TULLOCH Contents 1. Theory of Option Pricing 2 2. Black-Scholes PDE Method 4 3. Martingale method 4 4. Monte Carlo methods 5 4.1. Method of antithetic variances 5

More information

From Discrete Time to Continuous Time Modeling

From Discrete Time to Continuous Time Modeling From Discrete Time to Continuous Time Modeling Prof. S. Jaimungal, Department of Statistics, University of Toronto 2004 Arrow-Debreu Securities 2004 Prof. S. Jaimungal 2 Consider a simple one-period economy

More information

3 Arbitrage pricing theory in discrete time.

3 Arbitrage pricing theory in discrete time. 3 Arbitrage pricing theory in discrete time. Orientation. In the examples studied in Chapter 1, we worked with a single period model and Gaussian returns; in this Chapter, we shall drop these assumptions

More information

14.30 Introduction to Statistical Methods in Economics Spring 2009

14.30 Introduction to Statistical Methods in Economics Spring 2009 MIT OpenCourseWare http://ocw.mit.edu 14.30 Introduction to Statistical Methods in Economics Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.

More information

IEOR 3106: Introduction to OR: Stochastic Models. Fall 2013, Professor Whitt. Class Lecture Notes: Tuesday, September 10.

IEOR 3106: Introduction to OR: Stochastic Models. Fall 2013, Professor Whitt. Class Lecture Notes: Tuesday, September 10. IEOR 3106: Introduction to OR: Stochastic Models Fall 2013, Professor Whitt Class Lecture Notes: Tuesday, September 10. The Central Limit Theorem and Stock Prices 1. The Central Limit Theorem (CLT See

More information

Bayesian Linear Model: Gory Details

Bayesian Linear Model: Gory Details Bayesian Linear Model: Gory Details Pubh7440 Notes By Sudipto Banerjee Let y y i ] n i be an n vector of independent observations on a dependent variable (or response) from n experimental units. Associated

More information

Estimating the Greeks

Estimating the Greeks IEOR E4703: Monte-Carlo Simulation Columbia University Estimating the Greeks c 207 by Martin Haugh In these lecture notes we discuss the use of Monte-Carlo simulation for the estimation of sensitivities

More information

The value of foresight

The value of foresight Philip Ernst Department of Statistics, Rice University Support from NSF-DMS-1811936 (co-pi F. Viens) and ONR-N00014-18-1-2192 gratefully acknowledged. IMA Financial and Economic Applications June 11, 2018

More information

Introduction to Probability Theory and Stochastic Processes for Finance Lecture Notes

Introduction to Probability Theory and Stochastic Processes for Finance Lecture Notes Introduction to Probability Theory and Stochastic Processes for Finance Lecture Notes Fabio Trojani Department of Economics, University of St. Gallen, Switzerland Correspondence address: Fabio Trojani,

More information

4: SINGLE-PERIOD MARKET MODELS

4: SINGLE-PERIOD MARKET MODELS 4: SINGLE-PERIOD MARKET MODELS Marek Rutkowski School of Mathematics and Statistics University of Sydney Semester 2, 2016 M. Rutkowski (USydney) Slides 4: Single-Period Market Models 1 / 87 General Single-Period

More information

Strategies and Nash Equilibrium. A Whirlwind Tour of Game Theory

Strategies and Nash Equilibrium. A Whirlwind Tour of Game Theory Strategies and Nash Equilibrium A Whirlwind Tour of Game Theory (Mostly from Fudenberg & Tirole) Players choose actions, receive rewards based on their own actions and those of the other players. Example,

More information

Additional questions for chapter 3

Additional questions for chapter 3 Additional questions for chapter 3 1. Let ξ 1, ξ 2,... be independent and identically distributed with φθ) = IEexp{θξ 1 })

More information

Dynamic Portfolio Execution Detailed Proofs

Dynamic Portfolio Execution Detailed Proofs Dynamic Portfolio Execution Detailed Proofs Gerry Tsoukalas, Jiang Wang, Kay Giesecke March 16, 2014 1 Proofs Lemma 1 (Temporary Price Impact) A buy order of size x being executed against i s ask-side

More information

Lecture 17. The model is parametrized by the time period, δt, and three fixed constant parameters, v, σ and the riskless rate r.

Lecture 17. The model is parametrized by the time period, δt, and three fixed constant parameters, v, σ and the riskless rate r. Lecture 7 Overture to continuous models Before rigorously deriving the acclaimed Black-Scholes pricing formula for the value of a European option, we developed a substantial body of material, in continuous

More information

Practice Exercises for Midterm Exam ST Statistical Theory - II The ACTUAL exam will consists of less number of problems.

Practice Exercises for Midterm Exam ST Statistical Theory - II The ACTUAL exam will consists of less number of problems. Practice Exercises for Midterm Exam ST 522 - Statistical Theory - II The ACTUAL exam will consists of less number of problems. 1. Suppose X i F ( ) for i = 1,..., n, where F ( ) is a strictly increasing

More information

Optimal stopping problems for a Brownian motion with a disorder on a finite interval

Optimal stopping problems for a Brownian motion with a disorder on a finite interval Optimal stopping problems for a Brownian motion with a disorder on a finite interval A. N. Shiryaev M. V. Zhitlukhin arxiv:1212.379v1 [math.st] 15 Dec 212 December 18, 212 Abstract We consider optimal

More information

3 Stock under the risk-neutral measure

3 Stock under the risk-neutral measure 3 Stock under the risk-neutral measure 3 Adapted processes We have seen that the sampling space Ω = {H, T } N underlies the N-period binomial model for the stock-price process Elementary event ω = ω ω

More information

Outline Brownian Process Continuity of Sample Paths Differentiability of Sample Paths Simulating Sample Paths Hitting times and Maximum

Outline Brownian Process Continuity of Sample Paths Differentiability of Sample Paths Simulating Sample Paths Hitting times and Maximum Normal Distribution and Brownian Process Page 1 Outline Brownian Process Continuity of Sample Paths Differentiability of Sample Paths Simulating Sample Paths Hitting times and Maximum Searching for a Continuous-time

More information

based on two joint papers with Sara Biagini Scuola Normale Superiore di Pisa, Università degli Studi di Perugia

based on two joint papers with Sara Biagini Scuola Normale Superiore di Pisa, Università degli Studi di Perugia Marco Frittelli Università degli Studi di Firenze Winter School on Mathematical Finance January 24, 2005 Lunteren. On Utility Maximization in Incomplete Markets. based on two joint papers with Sara Biagini

More information

Module 10:Application of stochastic processes in areas like finance Lecture 36:Black-Scholes Model. Stochastic Differential Equation.

Module 10:Application of stochastic processes in areas like finance Lecture 36:Black-Scholes Model. Stochastic Differential Equation. Stochastic Differential Equation Consider. Moreover partition the interval into and define, where. Now by Rieman Integral we know that, where. Moreover. Using the fundamentals mentioned above we can easily

More information

Lecture 7: Bayesian approach to MAB - Gittins index

Lecture 7: Bayesian approach to MAB - Gittins index Advanced Topics in Machine Learning and Algorithmic Game Theory Lecture 7: Bayesian approach to MAB - Gittins index Lecturer: Yishay Mansour Scribe: Mariano Schain 7.1 Introduction In the Bayesian approach

More information

Non-semimartingales in finance

Non-semimartingales in finance Non-semimartingales in finance Pricing and Hedging Options with Quadratic Variation Tommi Sottinen University of Vaasa 1st Northern Triangular Seminar 9-11 March 2009, Helsinki University of Technology

More information

Reading: You should read Hull chapter 12 and perhaps the very first part of chapter 13.

Reading: You should read Hull chapter 12 and perhaps the very first part of chapter 13. FIN-40008 FINANCIAL INSTRUMENTS SPRING 2008 Asset Price Dynamics Introduction These notes give assumptions of asset price returns that are derived from the efficient markets hypothesis. Although a hypothesis,

More information

Dynamic Admission and Service Rate Control of a Queue

Dynamic Admission and Service Rate Control of a Queue Dynamic Admission and Service Rate Control of a Queue Kranthi Mitra Adusumilli and John J. Hasenbein 1 Graduate Program in Operations Research and Industrial Engineering Department of Mechanical Engineering

More information

How do Variance Swaps Shape the Smile?

How do Variance Swaps Shape the Smile? How do Variance Swaps Shape the Smile? A Summary of Arbitrage Restrictions and Smile Asymptotics Vimal Raval Imperial College London & UBS Investment Bank www2.imperial.ac.uk/ vr402 Joint Work with Mark

More information

IEOR E4703: Monte-Carlo Simulation

IEOR E4703: Monte-Carlo Simulation IEOR E4703: Monte-Carlo Simulation Simulating Stochastic Differential Equations Martin Haugh Department of Industrial Engineering and Operations Research Columbia University Email: martin.b.haugh@gmail.com

More information

Martingale Pricing Theory in Discrete-Time and Discrete-Space Models

Martingale Pricing Theory in Discrete-Time and Discrete-Space Models IEOR E4707: Foundations of Financial Engineering c 206 by Martin Haugh Martingale Pricing Theory in Discrete-Time and Discrete-Space Models These notes develop the theory of martingale pricing in a discrete-time,

More information

MTH6154 Financial Mathematics I Stochastic Interest Rates

MTH6154 Financial Mathematics I Stochastic Interest Rates MTH6154 Financial Mathematics I Stochastic Interest Rates Contents 4 Stochastic Interest Rates 45 4.1 Fixed Interest Rate Model............................ 45 4.2 Varying Interest Rate Model...........................

More information

Point Estimators. STATISTICS Lecture no. 10. Department of Econometrics FEM UO Brno office 69a, tel

Point Estimators. STATISTICS Lecture no. 10. Department of Econometrics FEM UO Brno office 69a, tel STATISTICS Lecture no. 10 Department of Econometrics FEM UO Brno office 69a, tel. 973 442029 email:jiri.neubauer@unob.cz 8. 12. 2009 Introduction Suppose that we manufacture lightbulbs and we want to state

More information

Chapter 4: Asymptotic Properties of MLE (Part 3)

Chapter 4: Asymptotic Properties of MLE (Part 3) Chapter 4: Asymptotic Properties of MLE (Part 3) Daniel O. Scharfstein 09/30/13 1 / 1 Breakdown of Assumptions Non-Existence of the MLE Multiple Solutions to Maximization Problem Multiple Solutions to

More information

STAT/MATH 395 PROBABILITY II

STAT/MATH 395 PROBABILITY II STAT/MATH 395 PROBABILITY II Distribution of Random Samples & Limit Theorems Néhémy Lim University of Washington Winter 2017 Outline Distribution of i.i.d. Samples Convergence of random variables The Laws

More information

RECURSIVE VALUATION AND SENTIMENTS

RECURSIVE VALUATION AND SENTIMENTS 1 / 32 RECURSIVE VALUATION AND SENTIMENTS Lars Peter Hansen Bendheim Lectures, Princeton University 2 / 32 RECURSIVE VALUATION AND SENTIMENTS ABSTRACT Expectations and uncertainty about growth rates that

More information

Characterization of the Optimum

Characterization of the Optimum ECO 317 Economics of Uncertainty Fall Term 2009 Notes for lectures 5. Portfolio Allocation with One Riskless, One Risky Asset Characterization of the Optimum Consider a risk-averse, expected-utility-maximizing

More information

Game Theory: Normal Form Games

Game Theory: Normal Form Games Game Theory: Normal Form Games Michael Levet June 23, 2016 1 Introduction Game Theory is a mathematical field that studies how rational agents make decisions in both competitive and cooperative situations.

More information

FURTHER ASPECTS OF GAMBLING WITH THE KELLY CRITERION. We consider two aspects of gambling with the Kelly criterion. First, we show that for

FURTHER ASPECTS OF GAMBLING WITH THE KELLY CRITERION. We consider two aspects of gambling with the Kelly criterion. First, we show that for FURTHER ASPECTS OF GAMBLING WITH THE KELLY CRITERION RAVI PHATARFOD *, Monash University Abstract We consider two aspects of gambling with the Kelly criterion. First, we show that for a wide range of final

More information

Arbitrage of the first kind and filtration enlargements in semimartingale financial models. Beatrice Acciaio

Arbitrage of the first kind and filtration enlargements in semimartingale financial models. Beatrice Acciaio Arbitrage of the first kind and filtration enlargements in semimartingale financial models Beatrice Acciaio the London School of Economics and Political Science (based on a joint work with C. Fontana and

More information

Steven Heston: Recovering the Variance Premium. Discussion by Jaroslav Borovička November 2017

Steven Heston: Recovering the Variance Premium. Discussion by Jaroslav Borovička November 2017 Steven Heston: Recovering the Variance Premium Discussion by Jaroslav Borovička November 2017 WHAT IS THE RECOVERY PROBLEM? Using observed cross-section(s) of prices (of Arrow Debreu securities), infer

More information

The rth moment of a real-valued random variable X with density f(x) is. x r f(x) dx

The rth moment of a real-valued random variable X with density f(x) is. x r f(x) dx 1 Cumulants 1.1 Definition The rth moment of a real-valued random variable X with density f(x) is µ r = E(X r ) = x r f(x) dx for integer r = 0, 1,.... The value is assumed to be finite. Provided that

More information

3.2 No-arbitrage theory and risk neutral probability measure

3.2 No-arbitrage theory and risk neutral probability measure Mathematical Models in Economics and Finance Topic 3 Fundamental theorem of asset pricing 3.1 Law of one price and Arrow securities 3.2 No-arbitrage theory and risk neutral probability measure 3.3 Valuation

More information

Equivalence between Semimartingales and Itô Processes

Equivalence between Semimartingales and Itô Processes International Journal of Mathematical Analysis Vol. 9, 215, no. 16, 787-791 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/1.12988/ijma.215.411358 Equivalence between Semimartingales and Itô Processes

More information

Pricing Dynamic Solvency Insurance and Investment Fund Protection

Pricing Dynamic Solvency Insurance and Investment Fund Protection Pricing Dynamic Solvency Insurance and Investment Fund Protection Hans U. Gerber and Gérard Pafumi Switzerland Abstract In the first part of the paper the surplus of a company is modelled by a Wiener process.

More information

GPD-POT and GEV block maxima

GPD-POT and GEV block maxima Chapter 3 GPD-POT and GEV block maxima This chapter is devoted to the relation between POT models and Block Maxima (BM). We only consider the classical frameworks where POT excesses are assumed to be GPD,

More information

Two hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER

Two hours. To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER Two hours MATH20802 To be supplied by the Examinations Office: Mathematical Formula Tables and Statistical Tables THE UNIVERSITY OF MANCHESTER STATISTICAL METHODS Answer any FOUR of the SIX questions.

More information

1 Consumption and saving under uncertainty

1 Consumption and saving under uncertainty 1 Consumption and saving under uncertainty 1.1 Modelling uncertainty As in the deterministic case, we keep assuming that agents live for two periods. The novelty here is that their earnings in the second

More information

MATH3075/3975 FINANCIAL MATHEMATICS TUTORIAL PROBLEMS

MATH3075/3975 FINANCIAL MATHEMATICS TUTORIAL PROBLEMS MATH307/37 FINANCIAL MATHEMATICS TUTORIAL PROBLEMS School of Mathematics and Statistics Semester, 04 Tutorial problems should be used to test your mathematical skills and understanding of the lecture material.

More information

In Discrete Time a Local Martingale is a Martingale under an Equivalent Probability Measure

In Discrete Time a Local Martingale is a Martingale under an Equivalent Probability Measure In Discrete Time a Local Martingale is a Martingale under an Equivalent Probability Measure Yuri Kabanov 1,2 1 Laboratoire de Mathématiques, Université de Franche-Comté, 16 Route de Gray, 253 Besançon,

More information

Prediction Market Prices as Martingales: Theory and Analysis. David Klein Statistics 157

Prediction Market Prices as Martingales: Theory and Analysis. David Klein Statistics 157 Prediction Market Prices as Martingales: Theory and Analysis David Klein Statistics 157 Introduction With prediction markets growing in number and in prominence in various domains, the construction of

More information

1 Mathematics in a Pill 1.1 PROBABILITY SPACE AND RANDOM VARIABLES. A probability triple P consists of the following components:

1 Mathematics in a Pill 1.1 PROBABILITY SPACE AND RANDOM VARIABLES. A probability triple P consists of the following components: 1 Mathematics in a Pill The purpose of this chapter is to give a brief outline of the probability theory underlying the mathematics inside the book, and to introduce necessary notation and conventions

More information