FE 5204 Stochastic Differential Equations

Instructor: Jim Zhu e-mail:zhu@wmich.edu http://homepages.wmich.edu/ zhu/ January 13, 2009

Stochastic differential equations deal with continuous random processes, which are idealizations of discrete stochastic processes. Discrete random processes often arise in gaming problems, so we use examples from gaming to understand the discrete case first. Our main reference for this lecture is Chapter 2 of Steele's book.

Disclaimer. This lecture contains examples related to gambling and their analysis. This analysis is based on past experience and data and is not necessarily applicable to future examples. Thus, practicing the methods discussed in this lecture may result in huge financial gains or losses. The instructor will share in your profit but will absolutely take no responsibility for any of your financial losses and related emotional distress.

The game. Bet on the flip of a fair coin. Heads: the house doubles your bet. Tails: you lose your bet to the house.

A discrete stochastic process. Play the game repeatedly, always betting 1, and denote the outcome of the ith game by X_i. Then X_i is a random variable with P(X_i = 1) = P(X_i = -1) = 1/2. If we start with an initial endowment w_0, then our total wealth after the ith game is

w_i = w_0 + X_1 + ... + X_i.   (1)

Now (w_i)_{i=1}^n is an example of a discrete stochastic process.
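
A minimal simulation sketch of this wealth process (Python; not part of the original notes). The horizon n, the initial endowment w_0, and the use of random.random() are illustrative choices.

```python
import random

def wealth_path(n, w0=0):
    """One sample path of w_i = w_0 + X_1 + ... + X_i for n fair-coin bets of size 1."""
    w = [w0]
    for _ in range(n):
        x = 1 if random.random() < 0.5 else -1   # X_i = +1 (heads) or -1 (tails)
        w.append(w[-1] + x)
    return w

print(wealth_path(10))   # one random path w_0, w_1, ..., w_10
```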

Information. Suppose we know X_1, ..., X_i. Does this help us play the (i+1)th game? In this case we have no reason to believe so. How do we describe this conclusion precisely? Let us look at the game with n = 3 to get some intuition. We use H to represent a head and T a tail. The information available at each stage can be illustrated with the following binary tree.

[Binary information tree for three tosses: F_0 corresponds to the trivial partition {Ω}; F_1 distinguishes H from T; F_2 distinguishes HH, HT, TH, TT; F_3 distinguishes the eight outcomes HHH, HHT, HTH, HTT, THH, THT, TTH, TTT.]

Filtration for 3 coin tosses. All the information is represented by F_3 = 2^Ω, where Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}. Similarly, after 2 tosses, F_2 = 2^{HH, HT, TH, TT}, where {HH, HT, TH, TT} = {{HHH, HHT}, {HTH, HTT}, {THH, THT}, {TTH, TTT}}. F_2 contains less information than F_3. Similarly, F_1 = 2^{H, T}, where {H, T} = {{HHH, HHT, HTH, HTT}, {THH, THT, TTH, TTT}}. Finally, F_0 = {∅, Ω}.

The sequence

F: F_0 ⊂ F_1 ⊂ F_2 ⊂ F_3

is a filtration for (w_i)_{i=0}^3. For each i, F_i is a set algebra, i.e., its elements, as sets, are closed under union, intersection and complement.
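
A small sketch (Python, illustrative only, not part of the original notes) that builds these algebras from their atoms by grouping the eight outcomes according to the first i tosses, and checks that each F_i is coarser than F_{i+1}.

```python
from itertools import product

outcomes = [''.join(p) for p in product('HT', repeat=3)]   # Omega for 3 tosses

def atoms(i):
    """Atoms of F_i: outcomes grouped by their first i tosses."""
    groups = {}
    for w in outcomes:
        groups.setdefault(w[:i], set()).add(w)
    return list(groups.values())

for i in range(4):
    print(f"F_{i} atoms:", atoms(i))

# Refinement: every atom of F_{i+1} sits inside some atom of F_i, so F_i ⊂ F_{i+1}
for i in range(3):
    coarse = atoms(i)
    assert all(any(fine <= big for big in coarse) for fine in atoms(i + 1))
```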

General filtration. Let Ω be a sample space (representing the possible states of a chance event). A sequence of algebras (σ-algebras when Ω is infinite) F: F_i, i = 0, 1, ..., n satisfying

F_0 ⊂ F_1 ⊂ F_2 ⊂ ... ⊂ F_n   (2)

is called a filtration. If F_0 = {∅, Ω} and F_n = 2^Ω, then F is called an information structure.

Adapted random process. If a random variable such as w_i relies only on information up to time i then, for any a, (w_i < a) ∈ F_i. In other words, w_i is F_i-measurable. We say a stochastic process X = (X_i) is F-adapted if, for each i, X_i is F_i-measurable. The random process (w_i) in the coin toss example is F-adapted.

Fair game and martingale. Tossing a fair coin is a fair game in the sense that no player has an advantage. In other words, given the information available after the (i-1)th game, the expectation of w_i equals w_{i-1}. Mathematically,

E[w_i | F_{i-1}] = w_{i-1}.   (3)

A stochastic process satisfying (3) is called a martingale.

Sub- and super-martingales. Change to an unfair coin with probability p ≠ 1/2 for heads and 1 - p for tails; now we have an unfair game. When p > 1/2 (respectively p < 1/2) we have

E[w_i | F_{i-1}] > w_{i-1}   (respectively E[w_i | F_{i-1}] < w_{i-1}).   (4)

We call such a stochastic process a submartingale (respectively a supermartingale). It represents a game that favors the player (respectively the house).

Examples of martingales.

1. Let X_i be independent with E[X_i] = 0 for all i. Then S_0 = 0, S_i = X_1 + ... + X_i defines a martingale.

2. Let X_i be independent with E[X_i] = 0 and Var[X_i] = σ^2 for all i. Then M_0 = 0, M_i = S_i^2 - iσ^2 gives a martingale.

3. Let X_i be independent random variables with E[X_i] = 1 for all i. Then M_0 = 1, M_i = X_1 ··· X_i gives a martingale with respect to F_i.
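
A quick numerical check of Example 2 for the fair ±1 coin (σ² = 1), computing the one-step conditional expectation exactly by averaging over the two equally likely values of X_i; this is an added illustration, not part of the original notes.

```python
def cond_exp_next(s_prev, i, sigma2=1.0):
    """E[S_i^2 - i*sigma^2 | S_{i-1} = s_prev] for the fair +-1 coin,
    computed by averaging over X_i = +1 and X_i = -1."""
    return sum((s_prev + x) ** 2 - i * sigma2 for x in (1, -1)) / 2

# The conditional expectation equals M_{i-1} = S_{i-1}^2 - (i-1)*sigma^2
for s_prev, i in [(0, 1), (2, 5), (-3, 7)]:
    assert abs(cond_exp_next(s_prev, i) - (s_prev ** 2 - (i - 1))) < 1e-12
```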

Generating martingales. Let Y_i be iid (independent, identically distributed) with φ(λ) = E[exp(λ Y_i)] < ∞. Then X_i = exp(λ Y_i)/φ(λ) are independent and E[X_i] = 1 for all i. Therefore

M_0 = 1,  M_i = X_1 ··· X_i = exp(λ sum_{k=1}^i Y_k) / φ(λ)^i

is a martingale. In particular, if there is λ_0 ≠ 0 such that φ(λ_0) = 1 then, for S_i = sum_{k=1}^i Y_k,

M_i = exp(λ_0 S_i)

is a martingale.
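
As a concrete instance (an added illustration, not in the original notes), take Y_i ~ N(μ, 1) with μ ≠ 0. Then φ(λ) = exp(λμ + λ²/2), and λ_0 = -2μ solves φ(λ_0) = 1, so exp(-2μ S_i) is a martingale. A minimal check in Python, with μ = 0.3 as an arbitrary illustrative value:

```python
import math

mu = 0.3                                   # illustrative drift; any mu != 0 works

def phi(lam):
    """phi(lambda) = E[exp(lambda*Y)] for Y ~ N(mu, 1) (Gaussian moment generating function)."""
    return math.exp(lam * mu + lam ** 2 / 2)

lam0 = -2 * mu                             # the nonzero root of phi(lambda) = 1
assert abs(phi(lam0) - 1.0) < 1e-12        # so exp(lam0 * S_i) is a martingale
```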

The motivating question. Can we take advantage of a fair game by changing the betting size?

Martingale transform. Let us formulate the problem mathematically. Let (M_i)_{i=1}^n be an F_i-adapted martingale representing this fair game, and set M_0 = 0. Let A_i be the bet for the ith game. A_i has to be F_{i-1}-measurable (determined after the (i-1)th game); such a process is called predictable. At the end of the ith game the player has

w_i = sum_{k=1}^i A_k (M_k - M_{k-1}).   (5)

The new stochastic process (w_i)_{i=1}^n is a martingale transform of (M_i)_{i=0}^n.

Transform Theorem. A martingale transform of a martingale is again a martingale. More precisely: let M_i be a martingale and let A_i be a predictable process with respect to F_i. Then the martingale transform w_i = sum_{k=1}^i A_k(M_k - M_{k-1}) is also a martingale.

Proof. Since A_i is F_{i-1}-measurable,

E[w_i - w_{i-1} | F_{i-1}] = E[A_i(M_i - M_{i-1}) | F_{i-1}] = A_i E[M_i - M_{i-1} | F_{i-1}] = 0.
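
An empirical illustration of the theorem (Python, not in the original notes) with the classic "double the bet after every loss" strategy on the fair ±1 walk: the bet A_k depends only on the first k-1 outcomes, so it is predictable, and the average transformed wealth stays near 0. The horizon and sample size are arbitrary.

```python
import random

def transformed_wealth(n):
    """One path of w_n = sum_k A_k (M_k - M_{k-1}) for the fair +-1 walk M,
    with the predictable strategy 'bet 1, double after each loss'."""
    w, bet = 0.0, 1.0
    for _ in range(n):
        dM = 1 if random.random() < 0.5 else -1
        w += bet * dM
        bet = 2 * bet if dM == -1 else 1.0   # A_{k+1} is chosen from F_k
    return w

paths = 200_000
print(sum(transformed_wealth(10) for _ in range(paths)) / paths)  # ~0 up to Monte Carlo error
```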

The motivating question. Can we take advantage of a fair game by choosing when to stop playing?

Stopping time. Let F = (F_i)_{i=0}^∞ be a filtration. A random variable τ taking values in {0, 1, ...} ∪ {+∞} is an F-stopping time if (τ ≤ i) ∈ F_i for every i.

Stopped process. Let (X_i)_{i=0}^∞ be a random process. If τ < ∞ with probability 1, then we define the process stopped by τ as

X_τ = sum_{k=0}^∞ 1(τ = k) X_k.

Stopping Time Theorem. A stopped martingale is again a martingale. More precisely: let (M_i)_{i=0}^∞ be an F-martingale and let τ be an F-stopping time. Then (M_{i∧τ})_{i=0}^∞ is an F-martingale.

Proof. We may assume M_0 = 0. Since 1(τ ≥ k) - 1(τ ≥ k+1) = 1(τ = k), summation by parts gives

M_{n∧τ} = sum_{k=0}^n 1(τ ∧ n = k) M_k = sum_{k=1}^n 1(τ ≥ k)(M_k - M_{k-1}).

The process A_k = 1(τ ≥ k) = 1 - 1(τ ≤ k-1) is F_{k-1}-measurable, so (M_{n∧τ}) is a martingale transform and, therefore, a martingale.
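
A Monte Carlo illustration (Python, not in the original notes): stop the fair ±1 walk the first time it hits ±5 (an F-stopping time) and check that the stopped value at time n still averages to M_0 = 0. The level 5, horizon 50, and sample size are arbitrary choices.

```python
import random

def stopped_value(n, level=5):
    """M_{n ∧ tau} for the fair +-1 walk, where tau = first time |M_i| reaches `level`."""
    m = 0
    for _ in range(n):
        if abs(m) >= level:      # already stopped: freeze the path
            break
        m += 1 if random.random() < 0.5 else -1
    return m

paths = 100_000
print(sum(stopped_value(50) for _ in range(paths)) / paths)   # ~0, as the theorem predicts
```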

Jensen's inequality. Let X be a random variable on a probability space (Ω, F, P) and let G be a σ-algebra contained in F. Suppose that φ is a convex function. Then

φ(E[X | G]) ≤ E[φ(X) | G].

Jensen's inequality follows directly from the definition of a convex function.

Generating submartingales. If M_i is a martingale with respect to F_i and φ is a convex function, then

φ(M_{i-1}) = φ(E[M_i | F_{i-1}]) ≤ E[φ(M_i) | F_{i-1}].

That is to say, φ(M_i) is a submartingale. In particular, |M_i|^p, p ≥ 1, are submartingales.
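
A tiny sanity check (Python, illustrative only) for the convex function φ(x) = |x| and the fair ±1 walk: the one-step conditional expectation of |S_i| never falls below |S_{i-1}|.

```python
def cond_exp_abs(s_prev):
    """E[|S_i| | S_{i-1} = s_prev] for the fair +-1 walk, computed exactly."""
    return (abs(s_prev + 1) + abs(s_prev - 1)) / 2

for s in range(-4, 5):
    assert cond_exp_abs(s) >= abs(s)   # |S_i| is a submartingale
```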

Doob's maximal inequality. Let M_i be a nonnegative submartingale and λ > 0. Then

λ P(M_i^* ≥ λ) ≤ E[M_i 1(M_i^* ≥ λ)] ≤ E[M_i],

where M_i^* = sup_{0 ≤ j ≤ i} M_j.

Proof of Doob's inequality. First, repeated use of the submartingale property of M_i and the tower property of conditional expectation gives

E[M_j 1_A] ≤ E[M_i 1_A] for all j ≤ i and A ∈ F_j.   (6)

Second, τ := min{j : M_j ≥ λ} is a stopping time and

P(M_i^* ≥ λ) = P(τ ≤ i).

Note that on the set (τ ≤ i) we have M_τ ≥ λ. Thus,

λ 1(τ ≤ i) ≤ M_τ 1(τ ≤ i) = sum_{j=0}^i M_j 1(τ = j).

Finally, taking expectations and using (6), we have

λ P(M_i^* ≥ λ) ≤ sum_{j=0}^i E[M_j 1(τ = j)] ≤ sum_{j=0}^i E[M_i 1(τ = j)] = E[M_i 1(M_i^* ≥ λ)] ≤ E[M_i].
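
A numerical illustration of the inequality (Python, not in the original notes) using the nonnegative submartingale M_i = S_i² built from the fair ±1 walk, for which E[M_n] = n. The horizon n = 20, threshold λ = 16, and sample size are arbitrary choices.

```python
import random

def run_path(n):
    """Return (max_{j<=n} S_j^2, S_n^2) for one fair +-1 walk started at 0."""
    s, best = 0, 0
    for _ in range(n):
        s += 1 if random.random() < 0.5 else -1
        best = max(best, s * s)
    return best, s * s

n, lam, paths = 20, 16.0, 100_000
samples = [run_path(n) for _ in range(paths)]
lhs = lam * sum(best >= lam for best, _ in samples) / paths   # lambda * P(M_n^* >= lambda)
rhs = sum(last for _, last in samples) / paths                # estimate of E[M_n] = n
print(lhs, "<=", rhs)
```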

Martingale convergence theorem. Let M_i be a martingale with E[M_i^2] ≤ B < ∞ for all i. Then there exists a random variable M_∞ with E[M_∞^2] ≤ B such that

P(lim_{i→∞} M_i = M_∞) = 1 and lim_{i→∞} E[(M_i - M_∞)^2] = 0.

Proof. Set M_0 = 0 and write d_k = M_k - M_{k-1}, so that M_i = sum_{k=1}^i d_k and

E[M_i^2] = E[(sum_{k=1}^i d_k)^2] = sum_{k=1}^i E[d_k^2].

So the hypothesis implies that sum_{k=1}^∞ E[d_k^2] ≤ B.

Let D be the set where (M_i) diverges. Then

D = ∪_{m=1}^∞ ∩_{i=1}^∞ {ω : sup_{k ≥ i} |M_k - M_i| ≥ 1/m}.

Using Doob's maximal inequality we have, for any i,

P(sup_{k ≥ i} |M_k - M_i| ≥ 1/m) = P(sup_{k ≥ i} (M_k - M_i)^2 ≥ 1/m^2) ≤ m^2 sum_{k=i}^∞ E[d_k^2],

and the right-hand side tends to 0 as i → ∞. Thus, D has measure 0.

Let M_∞ be the limit of M_i. We have

E[(M_∞ - M_i)^2] = sum_{k=i+1}^∞ E[d_k^2].

Thus, lim_{i→∞} E[(M_i - M_∞)^2] = 0.
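
A simple illustration of the theorem (Python, not in the original notes) with the L²-bounded martingale M_i = sum_{k≤i} X_k/2^k, for which E[M_i²] = sum_k 4^{-k} ≤ 1/3: a simulated path visibly settles down to a limit.

```python
import random

def path(n):
    """One path of the L^2-bounded martingale M_i = sum_{k<=i} X_k / 2^k."""
    m, out = 0.0, []
    for k in range(1, n + 1):
        x = 1 if random.random() < 0.5 else -1
        m += x / 2 ** k
        out.append(m)
    return out

p = path(30)
print(p[9], p[19], p[29])   # successive values agree to ever more digits: the path converges
```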

Instruction. Homework is an important part of learning SDE. Homework problems are given as exercises following each lecture; those marked with * are optional. The homework for this lecture is due on Jan 27 at the beginning of the lecture. Discussions with me or classmates are encouraged, but the final work should be completed independently. I expect you to submit clear and neatly written work with careful justifications for your conclusions.

Exercise 1.1. Consider a game of betting 1 dollar on a fair coin. Using the random variable X_i to represent the outcome of the ith game, we have P(X_i = 1) = P(X_i = -1) = 1/2. Suppose that we use an exit strategy of stopping the game when we have either won A dollars or lost B dollars. Calculate the probabilities of winning and losing.

Exercise 1.2. Consider a game of betting 1 dollar on an unfair coin with probability p ≠ 1/2 for heads. Using the random variable X_i to represent the outcome of the ith game, we have P(X_i = 1) = p and P(X_i = -1) = q = 1 - p. Let S_0 = 0 and S_i = X_1 + ... + X_i. Show that M_i = (q/p)^{S_i} is a martingale.

Exercise 1.3. Repeat Exercise 1.1 for an unfair coin with probability p ≠ 1/2 for heads.

Exercise 1.4. Let τ be a stopping time for the filtration F_i, i = 1, 2, .... Show that the random process A_i = 1(τ ≥ i) is predictable. Hint: 1(τ ≥ i) = 1 - 1(τ < i).

Exercise 1.5. Let X_i be iid random variables with E[X_i] = 0 and Var[X_i] = σ^2 for all i, and let S_i = X_1 + ... + X_i. Show that M_0 = 0, M_i = S_i^2 - iσ^2 is a martingale.

Exercise 1.6. Let M_i, i = 0, 1, ... be a martingale with M_0 = 0 and let d_i = M_i - M_{i-1}. Show that

E[M_i^2] = sum_{k=1}^i E[d_k^2].

Exercise 1.7* (Doob's decomposition). Let M_i, i = 0, 1, ... be an F_i-martingale with E[M_i^2] < ∞. Show that we can write M_i^2 = N_i + A_i, where (1) N_i is an F_i-martingale; (2) A_i ≥ A_{i-1}, i.e., A_i is monotone; and (3) A_i is predictable with respect to F_i. Hint: set A_0 = 0 and define A_i recursively by A_{i+1} = A_i + E[(M_{i+1} - M_i)^2 | F_i], i ≥ 0.

Exercise 1.8*. Let M_i, i = 0, 1, ... be a submartingale and let τ and ν be bounded stopping times such that ν ≤ τ. Show that E[M_ν] ≤ E[M_τ].