Random Time Change with Some Applications. Amy Peterson


Random Time Change with Some Applications by Amy Peterson. A thesis submitted to the Graduate Faculty of Auburn University in partial fulfillment of the requirements for the Degree of Master of Science. Auburn, Alabama, May 4, 2014. Approved by: Olav Kallenberg, Chair, Professor of Mathematics; Ming Liao, Professor of Mathematics; Erkan Nane, Professor of Mathematics; Jerzy Szulga, Professor of Mathematics.

Abstract This thesis is a survey of known results concerning random time change and its applications. It will cover basic probabilistic concepts and then follow with a detailed look at major results in several branches of probability, all concerning random time change. The first of these major results is a theorem on how an increasing process adapted to a filtration can be used to transform the time scale and filtration. Next we show how an arbitrary continuous local martingale can be changed into a Brownian motion. We then show that a simple point process can be changed into a Poisson process using a random time change. Lastly, we look at an application of random time change to create solutions of stochastic differential equations.

Acknowledgments I would like to thank my advisor, Dr. Olav Kallenberg, for his advice and encouragement. I would also like to thank my committee members for their support. Furthermore, I am grateful to everyone at the Auburn University Mathematics Department, my family, and friends.

Table of Contents
Abstract ... ii
Acknowledgments ... iii
1 Introduction ... 1
1.1 Summary ... 1
1.2 Definitions and Primary Concepts ... 2
1.3 Martingales and Brownian Motion ... 5
2 Time Change of Filtrations ... 8
2.1 Time Change of Filtrations ... 8
3 Time Change of Continuous Martingales ... 11
3.1 Quadratic Variation ... 11
3.2 Stochastic Integration ... 13
3.3 Brownian Motion as a Martingale ... 15
3.4 Time Change of Continuous Martingales ... 17
3.5 Time Change of Continuous Martingales in Higher Dimensions ... 21
4 Time Change of Point Processes ... 23
4.1 Random Measures and Point Processes ... 23
4.2 Doob-Meyer Decomposition ... 24
4.3 Time Change of Point Processes ... 24
5 Application of Time Change to Stochastic Differential Equations ... 30
5.1 Stochastic Differential Equations ... 30
5.2 Brownian Local Time ... 31
5.3 Application of Time Change to SDEs ... 33

Bibliography ... 41

Chapter 1 Introduction

1.1 Summary

This thesis discusses the subject of random time change by looking at several known results in various areas of probability theory. In the first chapter, we give several basic definitions and theorems of probability theory, including a section discussing martingales and Brownian motion. These definitions and theorems will be used throughout the thesis and can be found in most basic probability texts. In the second chapter we begin with our results on random time change. The main result of the second chapter shows how an increasing process adapted to a filtration can be used to create a process of optional times that transforms the time scale and filtration. This theorem will reappear in the following chapters, particularly in regard to the creation of a process of optional times. In the third chapter we begin with a discussion of the quadratic variation process and stochastic integration. These topics are also fundamental in probability theory and will be important for all further results in the thesis. We omit some of the proofs of the more detailed results but include references. We then use these new concepts to prove Lévy's characterization of Brownian motion. This theorem shows that a Brownian motion is a martingale and gives conditions for a continuous local martingale to be a Brownian motion. We then use Lévy's characterization of Brownian motion to prove the main result of chapter three: using a process of optional times, we can change an arbitrary continuous local martingale into a

Brownian motion. Our process of optional times depends on the quadratic variation of the local martingale, and so we will break the proof of the main result into two cases, depending on whether the limit of the quadratic variation is finite or infinite. Lastly in chapter three, we discuss two different ways to extend our main result to higher dimensions. To start our fourth chapter we introduce random measures, point processes, and Poisson processes. Following that we introduce the Doob-Meyer decomposition of a submartingale and explain its relation to random measures. The Doob-Meyer decomposition is an in-depth topic in probability theory; we will only mention it and give references for further study. Lastly, we prove the main result of the chapter, that is, that an arbitrary simple point process can be changed into a Poisson process by a random time change. In our last chapter, we look at an application of some of our previous results to stochastic differential equations (SDEs). We begin the chapter by discussing what stochastic differential equations are and the type of stochastic differential equations we are interested in. We then discuss the challenging topic of Brownian local time and continuous additive functionals. Lastly we prove necessary and sufficient conditions for solutions of certain stochastic differential equations by constructing solutions to the SDEs using random time change.

1.2 Definitions and Primary Concepts

Let (Ω, A, P) be a probability space and T a subset of R̄ = [0, ∞]. A non-decreasing family F = (F_t) of σ-fields with F_t ⊂ A for t ∈ T is called a filtration on T. A process X is said to be adapted to a filtration F = (F_t) if X_t is F_t-measurable for every t ∈ T. Given a process X, the smallest filtration F such that X is adapted to F is the filtration generated by X, that is, F_t = σ{X_s; s ≤ t}. Also define F_∞ = σ(∪_t F_t). If F is a filtration on T = R_+ we can define another

filtration F_{t+} = ∩_{h>0} F_{t+h}. We call a filtration F on R_+ right-continuous if F = F_+. Note that F_+ = (F_+)_+, so F_+ itself is right-continuous. Unless stated otherwise, filtrations on R_+ are assumed to be right-continuous. Let F_t = σ(X_s; s ≤ t) for some process X, and let N_t = {F ⊂ Ω; F ⊂ G for some G ∈ F_t with P(G) = 0}, the collection of all F_t-null sets. Then the filtration H defined by H_t = σ(F_t ∪ N_t) for all t is called the completion of the filtration. Any filtration with the above properties is called a complete filtration. A random time τ is a measurable mapping τ: Ω → T̄, where T̄ is the closure of T. Given a filtration F on T, a random time τ is called an optional time if {ω; τ(ω) ≤ t} ∈ F_t for every t ∈ T. Further, we call a random time τ weakly optional if {ω; τ(ω) < t} ∈ F_t for every t ∈ T. We define the σ-field F_τ associated with an optional time τ by F_τ = {A ∈ A; A ∩ {τ ≤ t} ∈ F_t, t ∈ T}. The first lemma shows that the weakly optional and optional times coincide when the filtration is right-continuous.

Lemma 1.2.1 If F is any filtration and τ is an F-optional time, then τ is F-weakly optional. If F is a right-continuous filtration and τ is an F-weakly optional time, then τ is F-optional.

Proof. Let τ be an F-optional time. Now {τ < t} = ∪_n {τ ≤ t − 1/n} ∈ F_t, and so τ is an F-weakly optional time.

Let F be right-continuous and τ an F-weakly optional time. Then

{τ ≤ t} = ∩_{h>0} {τ < t + h} ∈ ∩_{h>0} F_{t+h} = F_{t+}.

Thus τ is F_+-optional. But F is right-continuous, so F = F_+, which means that τ is F-optional.

The following lemma expresses a closure property of optional times and their associated σ-fields.

Lemma 1.2.2 If F is a right-continuous filtration and the τ_n are F-optional times, then τ = inf_n τ_n is an optional time and F_τ = ∩_n F_{τ_n}.

Proof. Since {τ < t} = ∪_n {τ_n < t} ∈ F_t for all t, τ is weakly optional and thus optional, by the right-continuity of F. To prove the second part we note, again, that since F is right-continuous, (F_+)_τ = F_τ. Let A ∈ ∩_n F_{τ_n}. Then

A ∩ {τ < t} = A ∩ ∪_n {τ_n < t} = ∪_n (A ∩ {τ_n < t}) ∈ F_t,

and so ∩_n F_{τ_n} ⊂ F_τ. To get the reverse inclusion, let A ∈ F_τ. Then for any n, A ∩ {τ_n ≤ t} = A ∩ {τ ≤ t} ∩ {τ_n ≤ t} ∈ F_t, since τ ≤ τ_n. Since this is true for all n, we have F_τ ⊂ ∩_n F_{τ_n}. Thus ∩_n F_{τ_n} = F_τ.

For any random variable ξ with distribution µ, we define the characteristic function φ of ξ by

φ(t) = E e^{itξ} = ∫ e^{itx} µ(dx), t ∈ R.

Characteristic functions uniquely determine the distribution of a random variable. We will need to know that the characteristic function of a normally distributed random variable ξ with mean µ and variance σ² is

φ(t) = e^{iµt − σ²t²/2}, t ∈ R.

1.3 Martingales and Brownian Motion

A process M in R^d is called a martingale with respect to a filtration F, or an F-martingale, if M_t is integrable for each t, M is adapted to F, and E[M_t | F_s] = M_s a.s. for all s ≤ t. A martingale is called square-integrable if E M_t² < ∞ for all t. A process X is said to be uniformly integrable if

lim_{r→∞} sup_{t∈T} E[|X_t|; |X_t| > r] = 0.

First we prove a general result about uniformly integrable processes.

Lemma 1.3.1 For p > 1, every L^p-bounded process is uniformly integrable.

Proof. Assume X is bounded in L^p, so that sup_t E|X_t|^p < ∞. Let q be such that 1/p + 1/q = 1. Then, by Hölder's inequality, we get for u > 0

E[|X_t|; |X_t| > u] ≤ (E|X_t|^p)^{1/p} (E 1{|X_t| > u})^{1/q} = (E|X_t|^p)^{1/p} P(|X_t| > u)^{1/q}.

By Chebyshev's inequality, P(|X_t| > u) ≤ u^{−p} E|X_t|^p, and the bound tends to 0 uniformly in t as u → ∞. Thus X is uniformly integrable.

A process M is called a local martingale if it is adapted to a filtration F and there exist optional times τ_n ↑ ∞ such that each stopped and centered process M^n_t = M_{τ_n∧t} − M_0 is a martingale. A Brownian motion is a continuous process B in R with independent increments, B_0 = 0 and, for all t, E B_t = 0 and Var(B_t) = t. This definition implies that B_t is normally distributed with mean 0 and variance t. A process B in R^d is called a Brownian motion if its components are independent Brownian motions in R. A Brownian motion B adapted to a general filtration F on R_+ such that the process B_{s+t} − B_s is independent of F_s for all s is said to be an F-Brownian motion. A process X on R_+ is said to be right-continuous if X_t = X_{t+} for all t, and X has left limits if the left limits X_{t−} exist and are finite for all t > 0. The regularization theorem for martingales allows us to assume all martingales to be right-continuous with left limits, here abbreviated as rcll. We state, without proof, the more general version of this theorem for submartingales. We follow with a result relating uniform integrability to the convergence of a martingale to a random variable. These are classical results in the study of martingales; we refer to [3] for the proofs and a more detailed discussion.

Theorem 1.3.1 Let X be an F-submartingale. Then X has an rcll version if and only if the function t ↦ E X_t is right-continuous.

Theorem 1.3.2 Let M be a right-continuous F-martingale on an unbounded index set T, and define u = sup T. Then the following conditions are equivalent: i) M is uniformly integrable; ii) M_t converges in L¹ to some M_u as t → u; iii) M can be extended to a martingale on T ∪ {u}.
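The normal characteristic function φ(t) = e^{iµt − σ²t²/2} from Section 1.2, which reappears in the proof of Lévy's characterization in chapter 3, can be spot-checked by Monte Carlo. The sketch below uses only the Python standard library; the sample size, seed, and parameter choices are arbitrary illustrative assumptions.

```python
import cmath
import random

def normal_cf_empirical(t, mu, sigma, n=100_000, seed=1):
    """Monte Carlo estimate of E e^{i t xi} for xi ~ N(mu, sigma^2)."""
    rng = random.Random(seed)
    total = 0j
    for _ in range(n):
        total += cmath.exp(1j * t * rng.gauss(mu, sigma))
    return total / n

def normal_cf_exact(t, mu, sigma):
    """phi(t) = exp(i mu t - sigma^2 t^2 / 2)."""
    return cmath.exp(1j * mu * t - sigma ** 2 * t ** 2 / 2)

if __name__ == "__main__":
    for t in (0.5, 1.0, 2.0):
        err = abs(normal_cf_empirical(t, 1.0, 2.0) - normal_cf_exact(t, 1.0, 2.0))
        print(f"t = {t}: |empirical - exact| = {err:.4f}")
```

Since each sample e^{itξ} has modulus 1, the Monte Carlo error is of order n^{−1/2}, so the printed discrepancies are a few thousandths.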

The next result, the optional sampling theorem, shows that, under certain conditions, the martingale property is preserved under a random time change.

Theorem 1.3.3 Let M be a right-continuous F-martingale on R_+, and consider two optional times σ and τ, where τ is bounded. Then M_τ is integrable, and M_{σ∧τ} = E[M_τ | F_σ] a.s. The statement extends to unbounded times τ if and only if M is uniformly integrable.
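For a simple symmetric random walk, a discrete-time martingale, the conclusion E M_τ = E M_0 = 0 of Theorem 1.3.3 can be verified exactly by enumerating all equally likely paths up to a bounded horizon. The walk, barriers, and horizon below are hypothetical illustrative choices, not taken from the thesis.

```python
from itertools import product

def expected_stopped_value(a=2, b=3, horizon=10):
    """Exact E[M_tau] for a simple symmetric random walk M with M_0 = 0,
    where tau = min(first hitting time of -a or b, horizon).
    Enumerates all 2**horizon equally likely paths."""
    weight = 0.5 ** horizon
    total = 0.0
    for steps in product((-1, 1), repeat=horizon):
        m = 0
        for step in steps:
            m += step
            if m == -a or m == b:
                break              # the walk is stopped at tau
        total += m * weight        # m now equals M_tau on this path
    return total

if __name__ == "__main__":
    print(expected_stopped_value())    # optional sampling predicts 0
```

Since τ is bounded by the horizon, the theorem applies without any uniform integrability assumption, and the enumeration returns 0 up to floating-point arithmetic.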

Chapter 2 Time Change of Filtrations

In this chapter, we begin by showing how we can use an increasing process X adapted to a filtration F to transform the time scale and the filtration. We will then apply this result in chapter 3 to the case where X, the increasing process, is the quadratic variation process of a continuous local martingale, and in chapter 4 to the case where the increasing process is the compensator of an increasing process related to a point process.

2.1 Time Change of Filtrations

We now state our main result, in which an increasing process X adapted to a filtration F is used to transform the time scale and the filtration.

Theorem 2.1.1 Let X be a non-decreasing right-continuous process adapted to some right-continuous filtration F, and define τ_s = inf{t > 0; X_t > s}, s ≥ 0. Then i) (τ_s) is a right-continuous process of optional times, generating a right-continuous filtration G defined by G_s = F_{τ_s} for s ≥ 0; ii) if X is also continuous and σ is F-optional, then X_σ is G-optional and F_σ ⊂ G_{X_σ}.

Note that, when composing the process X with an optional time σ, we get a random variable X_σ. Thus it makes sense to ask whether X_σ is an optional time.

Proof. (i) Since X is right-continuous, the process (τ_s) is right-continuous as well. We want to show that τ_s is an optional time for every s. By the definition of τ_s,

{τ_s < t} ⊃ ∪_{r ∈ Q∩(0,t)} {X_r > s}, t > 0.

To prove the inclusion in the opposite direction, fix an ω ∈ {τ_s < t}. Then there is some t₀ < t with t₀ ≥ τ_s(ω) and X_{t₀}(ω) > s. Since (s, ∞) is an open set containing X_{t₀}(ω), there exists a neighborhood around X_{t₀}(ω) that remains in the set (s, ∞). If t₀ is rational we have proved the inclusion. If not, then since X is right-continuous and Q is dense in R, there exists an r ∈ Q such that t₀ < r < t and X_r(ω) ∈ (s, ∞). So ω ∈ {X_r > s}, and therefore

{τ_s < t} ⊂ ∪_{r ∈ Q∩(0,t)} {X_r > s}, t > 0.

Hence

{τ_s < t} = ∪_{r ∈ Q∩(0,t)} {X_r > s} ∈ F_t, t > 0,

which means that τ_s is weakly optional, hence optional. Since (τ_s) is a process of optional times, G_s = F_{τ_s} defines a filtration, and we need to show that it is right-continuous. Now

G_{s+} = ∩_{u>s} G_u = ∩_{u>s} F_{τ_u} = ∩_{u>s} (F_+)_{τ_u} = (F_+)_{τ_s} = F_{τ_s} = G_s,

where the second and last equalities come from the definition G_s = F_{τ_s}, the third and fifth hold because F is right-continuous, and the fourth holds since τ_u ↓ τ_s as u ↓ s.

(ii) Let X be continuous and let σ > 0 be an F-optional time. By the definition of τ_s and the fact that X is non-decreasing and continuous, we see that {X_σ ≤ s} = {σ ≤ τ_s}. Since σ and τ_s are both optional times, σ ∧ t and τ_s ∧ t are F_t-measurable, and hence

{σ ≤ τ_s} ∩ {τ_s ≤ t} = {σ ∧ t ≤ τ_s ∧ t} ∩ {τ_s ≤ t} ∩ {σ ≤ t} ∈ F_t.

So {σ ≤ τ_s} ∈ F_{τ_s} by the definition of F_{τ_s}, and thus X_σ is a G-optional time. We can extend this to an arbitrary optional time σ by Lemma 1.2.2. Since X_σ is an optional time, G_{X_σ} is a σ-field. If we let A ∈ F_σ be arbitrary, the above arguments give

A ∩ {X_σ ≤ s} = A ∩ {σ ≤ τ_s} ∈ F_{τ_s} = G_s.

This shows that, for any A ∈ F_σ, we also have A ∈ G_{X_σ}, and so F_σ ⊂ G_{X_σ}.
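The process τ_s of Theorem 2.1.1 is a right-continuous generalized inverse of X. A small numerical sketch in plain Python (the piecewise path and the grid are hypothetical choices) illustrates the characteristic effect: an interval on which X is constant becomes a jump of τ, so the time change skips over it.

```python
import math

def tau(s, X, grid):
    """tau_s = inf{t > 0 : X_t > s}, computed on a discrete time grid;
    X is a non-decreasing sequence sampled along grid."""
    for t, x in zip(grid, X):
        if x > s:
            return t
    return math.inf                  # the level s is never exceeded

# X_t = t on [0,1], constant equal to 1 on [1,2], then X_t = t - 1:
grid = [k / 100 for k in range(401)]                 # t in [0, 4]
X = [min(t, 1.0) + max(t - 2.0, 0.0) for t in grid]

print(tau(0.99, X, grid))   # 1.0
print(tau(1.0, X, grid))    # 2.01, tau jumps past the flat stretch [1, 2]
```

One sees directly that s ↦ τ_s is right-continuous with a jump at s = 1, mirroring the interval of constancy of X.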

Chapter 3 Time Change of Continuous Martingales

In order to change an arbitrary continuous local martingale into a Brownian motion, we will use a process of optional times as in Theorem 2.1.1, except that our non-decreasing process will be the quadratic variation process of the continuous local martingale. Before getting to this result, we define the quadratic variation process and state some lemmas pertaining to it. Then we will prove Lévy's theorem, which characterizes Brownian motion as a martingale. This will be used in our proof of the main result.

3.1 Quadratic Variation

For local martingales M and N, the process [M, N] is called the covariation of M and N, and the process [M, M] is called the quadratic variation; it is often denoted by [M]. The quadratic variation process can be constructed as a limit of sums of squared increments of the original process; however, we will define the process through a martingale characterization. We state, without proof, the existence theorem for the process [M, N] for continuous local martingales M and N.

Theorem 3.1.1 For continuous local martingales M and N there exists an a.s. unique continuous process [M, N] of locally finite variation, with [M, N]_0 = 0, such that MN − [M, N] is a local martingale.

The next lemma lists, without proof, several properties of the covariation process.

Theorem 3.1.2 Let M and N be continuous local martingales, and let [M, N] be the covariation process defined in Theorem 3.1.1. Then [M, N] is a.s. bilinear and symmetric, and satisfies [M, N] = [M − M_0, N − N_0] a.s. Further, [M] is a.s. non-decreasing and, for any optional time τ, [M^τ, N] = [M^τ, N^τ] = [M, N]^τ a.s.

The next result shows that a local martingale has the same intervals of constancy as its quadratic variation process.

Lemma 3.1.1 Let M be a continuous local F-martingale, and fix any s < t. Then [M]_s = [M]_t a.s. if and only if a.s. M_u = M_s for all u ∈ [s, t].

Proof. First assume that [M]_s = [M]_t. Then [M]_s = [M]_u for all s < u ≤ t, since the quadratic variation is non-decreasing, and σ = inf{w > s; [M]_w > [M]_s} is an F-optional time with σ ≥ t. Also, N_r = M_{σ∧(s+r)} − M_s, r ≥ 0, is a continuous local martingale with respect to the filtration F̂_r = F_{σ∧(s+r)}, r ≥ 0. By the definition of σ,

[N]_r = [M]_{σ∧(s+r)} − [M]_s = 0, r ≥ 0.

Since N is a local martingale, there exists a sequence of optional times ρ_n ↑ ∞ a.s. such that each stopped process N_{ρ_n∧t} is a true martingale. Now [N]_{ρ_n∧r} = 0 a.s., and so E(N²_{ρ_n∧r} − [N]_{ρ_n∧r}) = E(N²_{ρ_n∧r}). Since N²_{ρ_n∧r} − [N]_{ρ_n∧r} is a martingale starting at 0, E(N²_{ρ_n∧r} − [N]_{ρ_n∧r}) = 0. So E(N²_{ρ_n∧r}) = 0, and thus N_{ρ_n∧r} = 0 a.s. Letting ρ_n → ∞, we get N = 0 a.s. Thus M_{σ∧(s+r)} = M_s and, since σ ≥ t, M_u = M_s for all u ∈ [s, t].

To prove the converse, assume M_u = M_s for all s ≤ u ≤ t. Then τ = inf{w > s; M_w ≠ M_s} is an optional time with τ ≥ t, and N_r = M_{τ∧(s+r)} − M_s is a continuous local martingale with respect to the filtration F̂ defined by F̂_r = F_{τ∧(s+r)}. By the definition of τ, N_r = M_{τ∧(s+r)} − M_s = 0. Let ρ_n be a sequence of optional times such that ρ_n ↑ ∞ and each N_{r∧ρ_n} is a martingale. Then N²_{r∧ρ_n} − [N]_{r∧ρ_n} is a martingale and E(N²_{r∧ρ_n} − [N]_{r∧ρ_n}) = 0. Since N_r = 0, we have E[N]_{r∧ρ_n} = 0. Letting ρ_n → ∞, we get E[N]_r = 0, which gives [M]_{τ∧(s+r)} − [M]_s = 0 a.s. Since τ ≥ t, we conclude that [M]_s = [M]_t.

3.2 Stochastic Integration

We now introduce the concept of stochastic integration. We start by defining an elementary stochastic integral as a sum of random variables. Let τ_k, k ≤ n, be optional times and ξ_k bounded F_{τ_k}-measurable random variables, and define

V_t = Σ_{k≤n} ξ_k 1{t > τ_k}, t ≥ 0.

times and ξ k be bounded F τk -measurable random variables, and define V t = k n ξ k 1{t > τ k }, t. Then for any process X, we may define the integral process V X by (V X) t = V s dx s = k n ξ k (X t X t τk ). We call the process V X an elementary stochastic integral. A process V on R is said to be progressively measurable, or simply progressive, if its restriction to Ω [, t] is F t B[, t] - measurable for every t. Originally stochastic integrals were extended to progressive processes using an approximation of the elementary stochastic integrals defined above. However in the following theorem we extend the notion of stochastic integrals by a martingale characterization. Theorem 3.2.1 Let M be a continuous local martingales and V a progressive process such that (V 2 [M]) t < a.s. for every t >. Then there exists an a.s. unique continuous local martingale V M with (V M) = and such that [V M, N] = V [M, N] a.s. for every continuous local martingale N. Since covariation has locally finite variation, the integral V [M, N] is a Lebesgue- Steljes integral. This allows us to uniquely characterize the stochastic integral in terms of a Lebesgue-Steljes integral. We omit the proof of this theorem but refer to [3] for the proof and a more detailed discussion of stochastic integrals. 14

A continuous process X is said to be a semimartingale if it can be represented as a sum M + A, where M is a continuous local martingale and A is a continuous adapted process of locally finite variation with A_0 = 0. If X is a semimartingale and f is a sufficiently smooth function, then f(X) is also a semimartingale. The following result gives a useful representation of semimartingales that are images of smooth functions. We state, without proof, Itô's formula for continuous semimartingales. Here f_i and f_ij denote the first and second partial derivatives of f.

Theorem 3.2.2 If X = (X¹, ..., X^d) is a continuous semimartingale in R^d and f is twice continuously differentiable on R^d, then

f(X) = f(X_0) + Σ_i f_i(X)·X^i + ½ Σ_{i,j} f_ij(X)·[X^i, X^j] a.s.

We can extend Itô's formula to analytic functions.

Theorem 3.2.3 If f is an analytic function on a domain D ⊂ C, then

f(Z) = f(Z_0) + f′(Z)·Z + ½ f″(Z)·[Z, Z] a.s.

for any D-valued continuous semimartingale Z.

3.3 Brownian Motion as a Martingale

In this section we prove the following result, due to Lévy, which characterizes Brownian motion as a martingale.

Theorem 3.3.1 Let B be a continuous process in R with B_0 = 0. Then B is a local F-martingale with [B]_t = t a.s. if and only if B is an F-Brownian motion.

Before we begin the proof of the theorem, we prove a needed result.

Lemma 3.3.1 Let M be a continuous local martingale starting at 0 with [M]_t = t a.s. Then M is a square-integrable martingale.

Proof. Let ρ_n be optional times such that ρ_n ↑ ∞ and each M_{ρ_n∧t} is a true martingale. Then N_t = M²_{ρ_n∧t} − [M]_{ρ_n∧t} is a martingale for every n, and E M²_{ρ_n∧t} = E[M]_{ρ_n∧t} = E(ρ_n ∧ t). Using dominated and monotone convergence, we can let ρ_n → ∞ to get E M²_t = t. Thus M²_t − [M]_t is a true martingale and M is a square-integrable martingale.

Now we move on to the proof of Theorem 3.3.1.

Proof. First assume that B is a continuous local F-martingale with [B]_t = t a.s. and B_0 = 0. Recalling the definition of Brownian motion and the characteristic function of the normal distribution, it is enough to prove that, for a fixed set A ∈ F_s,

E[1_A e^{iv(B_t − B_s)}] = P(A) e^{−v²(t−s)/2}

for v ∈ R and t > s. Let f(x) = e^{ivx}; then, applying Theorem 3.2.3, we get

e^{ivB_t} − e^{ivB_s} = ∫_s^t iv e^{ivB_u} dB_u − ½ ∫_s^t v² e^{ivB_u} du. (3.1)

Now [B]_t = t implies that B is a true martingale, by Lemma 3.3.1, and so

E[∫_s^t e^{ivB_u} dB_u | F_s] = 0 a.s. (3.2)

Let A ∈ F_s and multiply equation (3.1) by e^{−ivB_s} 1_A on both sides to obtain

1_A e^{iv(B_t − B_s)} − 1_A = ∫_s^t iv 1_A e^{iv(B_u − B_s)} dB_u − ½ ∫_s^t v² 1_A e^{iv(B_u − B_s)} du.

Taking the expectation of both sides and recalling (3.2), we have

E(1_A e^{iv(B_t − B_s)}) − P(A) = −½ v² E ∫_s^t 1_A e^{iv(B_u − B_s)} du.

This is a Volterra integral equation of the second kind for the deterministic function t ↦ E(1_A e^{iv(B_t − B_s)}). Solving this integral equation, we obtain

E(1_A e^{iv(B_t − B_s)}) = P(A) e^{−v²(t−s)/2}.

To prove the converse, we assume that B is an F-Brownian motion. To show that B is a martingale, let s ≤ t; then

E[B_t | F_s] = E[B_s + (B_t − B_s) | F_s] = B_s + E[B_t − B_s] = B_s,

since B_t − B_s is independent of F_s with mean 0. Finally, [B]_t = t a.s., since B²_t − t is a martingale and [B] is, by Theorem 3.1.1, the a.s. unique process of locally finite variation for which B² − [B] is a local martingale.

3.4 Time Change of Continuous Martingales

We now show how we can use a process of optional times to change an arbitrary continuous local martingale into a Brownian motion. To do this in the general case, we consider extensions of probability spaces and extensions of filtrations. Let X be a process adapted to the filtration F on a probability space (Ω, A, P), and suppose we wish to find a Brownian motion B independent of X. In order to guarantee

that the processes in question are independent while still retaining any original adaptedness properties, we extend the probability space. Let Ω̂ = Ω × [0, 1], Â = A ⊗ B[0, 1], and P̂ = P ⊗ λ, where λ is Lebesgue measure on [0, 1]; then (Ω̂, Â, P̂) is an extension of the probability space. Given a Brownian motion B constructed on the auxiliary space [0, 1], we can define X̂((ω, u), t) = X(ω, t) and B̂((ω, u), t) = B(u, t); then X̂ and B̂ are trivially independent. A subtler way to achieve the same goal is to take a standard extension of a filtration. We call a filtration G a standard extension of F if F_t ⊂ G_t for all t and if G_t and F_∞ are conditionally independent given F_t for all t. Now we state the main theorem.

Theorem 3.4.1 Let M be a continuous local F-martingale in R with M_0 = 0, and define τ_s = inf{t ≥ 0; [M]_t > s}, G_s = F_{τ_s}, s ≥ 0. Then there exists in R a Brownian motion B with respect to a standard extension of G, such that a.s. B = M∘τ on [0, [M]_∞) and M = B∘[M].

We will break the proof into two cases: first the case when [M]_∞ = ∞, and second when [M]_∞ is finite. If [M]_∞ = ∞, we do not require a standard extension of the filtration for M∘τ to be a Brownian motion.

Proof. First assume that [M]_∞ = ∞. By Theorem 2.1.1, (τ_s) is a right-continuous process of optional times and G_s = F_{τ_s} is a right-continuous filtration. To prove that B = M∘τ is a Brownian motion, we will use Lévy's characterization of Brownian motion, Theorem 3.3.1. Thus we need to show that B is a continuous square-integrable martingale with [B]_s = s a.s.

First we prove that B is a continuous square-integrable martingale. For fixed s, the process M̂_t = M_{τ_s∧t} is a true martingale, and [M̂]_t ≤ [M]_{τ_s} = s for all t, by the definition of τ_s. Because E M̂²_t = E[M̂]_t ≤ s, we can apply Lemma 1.3.1 to conclude that M̂ and M̂² − [M̂] are uniformly integrable. This allows us to use the optional sampling theorem, Theorem 1.3.3. Fix r ≤ s. Then

E(M_{τ_s} − M_{τ_r} | F_{τ_r}) = E(M̂_{τ_s} − M̂_{τ_r} | F_{τ_r}) = M̂_{τ_r} − M̂_{τ_r} = 0.

Recall that M̂ is a true martingale starting at zero, so that M̂² − [M̂] is a uniformly integrable martingale starting at zero and may also be sampled optionally. Now

E((M_{τ_s} − M_{τ_r})² | F_{τ_r}) = E(M̂²_{τ_s} − M̂²_{τ_r} | F_{τ_r}) = E([M̂]_{τ_s} − [M̂]_{τ_r} | F_{τ_r}) = s − r.

Thus B is a square-integrable martingale with [B]_s = s. Next we want to prove that B is continuous. Referring to Lemma 3.1.1, we see that, for any s < t, [M]_s = [M]_t implies M_u = M_s for all u ∈ [s, t]. This property, along with the fact that B is right-continuous, proves that B is continuous. We have now shown that B is a continuous square-integrable martingale with [B]_t = t a.s., and so, by Lévy's characterization of Brownian motion, B is a Brownian motion.

To prove the second assertion, M = B∘[M], we use the fact that B = M∘τ and τ_{[M]_t} = t, by the definition of τ_s. Therefore, we can conclude that M_t = M_{τ_{[M]_t}} = B_{[M]_t}.

Now we allow [M]_∞ to be finite. Define [M]_∞ = Q < ∞. Letting s be fixed and M̂_t = M_{τ_s∧t}, we have [M̂]_t ≤ [M]_{τ_s} = s ∧ Q. This allows us to use Lemma 1.3.1 and the optional sampling theorem, just as before, to conclude that M∘τ is a continuous martingale. Let X be a Brownian motion independent of F_∞, with induced filtration (X_s). Now let H_s = σ(G_s ∪ X_s); then H is a standard extension of both (X_s) and G. And so M∘τ is an H-martingale, X is an H-Brownian motion, and they are independent. Define

B_s = M_{τ_s} + ∫_0^s 1{τ_r = ∞} dX_r, s ≥ 0,

and let N_s = ∫_0^s 1{τ_r = ∞} dX_r, s ≥ 0. Since [M] is non-decreasing, τ_s is non-decreasing. Now τ_r < ∞ for r < Q, and so 1{τ_r = ∞} = 0 there. This means that [N]_r = 0 for all r < Q. For s > Q we have 1{τ_r = ∞} = 1 for all r ∈ [Q, s], and so for every s > Q,

[N]_s = [X]_s − [X]_Q = s − Q = s − [M]_∞.

So if s < Q, we have [B]_s = [M]_{τ_s} = s,

and if $s \ge Q$, we have
$$[B]_s = [M]_{\tau_s} + [N]_s = Q + [X]_s - [X]_Q = Q + s - Q = s.$$
Therefore $[B]_s = s$ for all $s \ge 0$. We conclude again that $B$ is a Brownian motion, with $B_s = M_{\tau_s}$ for all $s < Q = [M]_\infty$. Now we show that $M = B \circ [M]$. If $[M]_t = [M]_\infty$ for some $t$, then by Lemma 3.1.1 $M_s = M_t$ for all $s \ge t$. Thus we can use the same argument as before to obtain $M_t = M_{\tau_{[M]_t}} = B_{[M]_t}$.

3.5 Time Change of Continuous Martingales in Higher Dimensions

To extend our result to higher dimensions we discuss two approaches. First we define a continuous local martingale $M = (M^1, \ldots, M^d)$ to be isotropic if $[M^i] = [M^j]$ a.s. for all $i, j \in \{1, \ldots, d\}$ and if $[M^i, M^j] = 0$ a.s. for all $i, j \in \{1, \ldots, d\}$ with $i \ne j$. We then have a result for isotropic local martingales similar to the one-dimensional case.

Theorem 3.5.1 Let $M$ be an isotropic, continuous local $F$-martingale starting at $0$. Define $\tau_s = \inf\{t \ge 0;\ [M^1]_t > s\}$ and $G_s = F_{\tau_s}$, $s \ge 0$. Then there exists a Brownian motion $B$ with respect to a standard extension of $G$ such that $B = M \circ \tau$ a.s. on $[0, [M^1]_\infty)$ and $M = B \circ [M^1]$ a.s.
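The one-dimensional time-change theorem is easy to probe numerically. The following sketch is only an illustration (the integrand $f$, the grid sizes, and the level $s = 1$ are arbitrary choices, not part of the theorem): it builds a martingale $M_t = \int_0^t f\,dW$ with a deterministic step integrand, so that the clock $[M]_t = \int_0^t f^2\,du$ is deterministic and invertible, and checks that $B_s = M_{\tau_s}$ has variance $s$, as a Brownian motion must.

```python
import math
import random

def time_changed_variance(n_paths=2000, n_steps=1000, T=1.0, s_level=1.0, seed=1):
    # Martingale M_t = int_0^t f(u) dW_u with a deterministic integrand f,
    # so the clock [M]_t = int_0^t f(u)^2 du is deterministic and invertible.
    f = lambda u: 2.0 if u < 0.5 else 1.0   # arbitrary illustrative choice
    dt = T / n_steps
    qv = [0.0]
    for k in range(n_steps):
        qv.append(qv[-1] + f(k * dt) ** 2 * dt)
    # tau_s = inf{t >= 0 : [M]_t > s}; here [M]_T = 2.5 > s_level.
    k_tau = next(k for k in range(len(qv)) if qv[k] > s_level)
    rng = random.Random(seed)
    samples = []
    for _ in range(n_paths):
        M = 0.0
        for k in range(k_tau):               # run M up to time tau_s
            M += f(k * dt) * rng.gauss(0.0, math.sqrt(dt))
        samples.append(M)                    # B_s = M_{tau_s}
    mean = sum(samples) / n_paths
    return sum((x - mean) ** 2 for x in samples) / (n_paths - 1)

print(time_changed_variance())   # sample variance of B_1; should be close to 1
```

Note that before the time change the increments of $M$ have variance $4\,dt$ on $[0, 0.5)$ and $dt$ afterwards; the inverted clock $\tau$ exactly compensates for this, which is the content of the theorem.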

We omit the proof; the isotropy condition makes the argument very similar to that of the one-dimensional case. It is important to note that in this case a single time-change process suffices to transform the local martingale. Our next result uses a weaker assumption but also yields a weaker assertion; it gives another way to extend Theorem 3.4.1 to higher dimensions. We define continuous local martingales $M^1, \ldots, M^d$ to be strongly orthogonal if $[M^i, M^j] = 0$ a.s. for all $i, j \in \{1, \ldots, d\}$ with $i \ne j$. Under the weaker assumption of strong orthogonality, we must use an individual process of optional times for each component of the local martingale to transform it into a Brownian motion.

Theorem 3.5.2 Let $M^1, \ldots, M^d$ be strongly orthogonal continuous local martingales starting at zero, and define $\tau^i_s = \inf\{t \ge 0;\ [M^i]_t > s\}$, $s \ge 0$, $1 \le i \le d$, where each $\tau^i_s$ is an optional time. Then the processes $B^i_s = M^i_{\tau^i_s}$, $s \ge 0$, $1 \le i \le d$, are independent one-dimensional Brownian motions.

The individual components are transformed into Brownian motions by our proof of the one-dimensional case. However, we need these one-dimensional Brownian motions to be independent in order to combine them into a Brownian motion in $R^d$. This is achieved by looking at the filtrations induced by the Brownian motions rather than the filtrations $F_{\tau^i_s}$. We omit the proof of Theorem 3.5.2; a full proof can be found in [4].
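To see what strong orthogonality buys, consider the following numerical illustration (our own construction, not from the text above; the path-dependent angle $\theta_t = W^1_t$ is an arbitrary choice). From a single planar Brownian motion $(W^1, W^2)$, a path-dependent rotation produces $M^1, M^2$ with $[M^1]_t = [M^2]_t = t$ and $[M^1, M^2] = 0$; the time changes in Theorem 3.5.2 are then trivial, and the theorem predicts that $(M^1, M^2)$ is a planar Brownian motion with independent components, even though both are driven by the same noise. The sketch checks that the components and their squares are empirically uncorrelated.

```python
import math
import random

def rotated_pair(n_paths=3000, n_steps=400, seed=7):
    # M1, M2 built from ONE planar Brownian motion (W1, W2) by a
    # path-dependent rotation with angle theta_t = W1_t.  Then
    # [M1]_t = [M2]_t = t and [M1, M2] = 0 (strong orthogonality),
    # although both components use the same driving noise.
    rng = random.Random(seed)
    dt = 1.0 / n_steps
    out = []
    for _ in range(n_paths):
        w1 = m1 = m2 = 0.0
        for _ in range(n_steps):
            dW1 = rng.gauss(0.0, math.sqrt(dt))
            dW2 = rng.gauss(0.0, math.sqrt(dt))
            c, s = math.cos(w1), math.sin(w1)
            m1 += c * dW1 + s * dW2
            m2 += -s * dW1 + c * dW2
            w1 += dW1
        out.append((m1, m2))
    return out

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / math.sqrt(vx * vy)

pairs = rotated_pair()
a = [p[0] for p in pairs]
b = [p[1] for p in pairs]
# Independence: both linear and squared correlations should be near 0.
print(corr(a, b), corr([x * x for x in a], [y * y for y in b]))
```

A mere lack of linear correlation would follow from $[M^1, M^2] = 0$ alone; the vanishing correlation of the squares is evidence of the genuine independence the theorem asserts.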

Chapter 4 Time Change of Point Processes

The main result of this chapter, similar in spirit to Theorem 3.4.1, shows that a random time change can be used to transform a point process into a Poisson process. To do this, we introduce some more notation and definitions.

4.1 Random Measures and Point Processes

Let $(\Omega, \mathcal A)$ be a probability space and $(S, \mathcal S)$ a measurable space. A random measure $\xi$ on $S$ is defined as a mapping $\xi : \Omega \times \mathcal S \to \bar R_+$ such that $\xi(\cdot, B)$ is an $\mathcal A$-measurable random variable for fixed $B \in \mathcal S$ and $\xi(\omega, \cdot)$ is a locally finite measure for fixed $\omega \in \Omega$. We define a point process as a random measure $\xi$ on $R^d$ such that $\xi B$ is integer-valued for every bounded Borel set $B$. For a stationary random measure $\xi$ on $R$, the measure $E\xi = c\lambda$, where $c \ge 0$ and $\lambda$ is Lebesgue measure, is called the intensity measure of $\xi$, and $c$ the rate. Define $\mathcal M(S)$ to be the space of all $\sigma$-finite measures on a measurable space $S$. A Poisson process $\xi$ with intensity $\mu \in \mathcal M(R^d)$ is defined to be a point process with independent increments such that $\xi B$ is a Poisson random variable with mean $\mu B$ whenever $\mu B < \infty$. A point process $\xi$ with $\xi\{s\} \le 1$ for all $s \in R^d$, outside a fixed $P$-null set, is called simple. A Poisson process has unit rate if its rate equals one. We now assume that the underlying probability space carries a filtration that is not only right-continuous but also complete, and let $(S, \mathcal S)$ be a Borel space. The predictable $\sigma$-field $\mathcal P$ in the product space $\Omega \times R_+$ is defined as the $\sigma$-field generated by all continuous, adapted processes on $R_+$. A process $V$ on $R_+ \times S$ is predictable if it

is $\mathcal P \otimes \mathcal S$-measurable, where $\mathcal P$ denotes the predictable $\sigma$-field in $\Omega \times R_+$. We mention, without proof, that the predictable $\sigma$-field is also generated by all left-continuous adapted processes and that every predictable process is progressive.

4.2 Doob-Meyer Decomposition

Another new concept of this section needed for our main result is the compensator process. First we define compensators in relation to the Doob-Meyer decomposition of submartingales and then extend the notion to random measures.

Theorem 4.2.1 (Doob-Meyer Decomposition) Any local submartingale $X$ has an a.s. unique decomposition $X = M + A$, where $M$ is a local martingale and $A$ is a locally integrable, nondecreasing, predictable process starting at $0$.

The proof is omitted, since it is very involved and would distract from the main topic of time change; we refer to [3] for a detailed proof. The process $A$ in the above theorem is called the compensator of the submartingale $X$. We want to extend compensators to random measures. Let $\xi$ be a random measure on $R_+$ and introduce the associated cumulative process $N_t(\omega) = \xi(\omega, (0, t])$. The process $N$ has right-continuous, a.s. nondecreasing paths, and so $N$ is a submartingale. Now we can apply the Doob-Meyer decomposition to $N$ to obtain its compensator $A$, which is also the cumulative process of a random measure. We will use compensators much as the quadratic variation process was used in Theorem 3.4.1, to define our process of optional times.

4.3 Time Change of Point Processes

We now move on to prove our main result: a process of optional times can be used to transform a point process into a Poisson process. Before stating the main

result, we need several important theorems. This approach is from [1]. The first of these uses only some basic analysis; however, we will soon relate it to probability.

Theorem 4.3.1 Let $f$ be an increasing, right-continuous function on $R_+$ with $f(0) = 0$, and let $u$ be a measurable function with $\int_0^t |u(x)|\,df(x) < \infty$ for each $t > 0$. Let $\Delta f(t) = f(t) - f(t-)$ and $f_c(t) = f(t) - \sum_{s \le t} \Delta f(s)$. Then the integral equation
$$h(t) = h(0) + \int_0^t h(s-)u(s)\,df(s)$$
has the unique solution
$$h(t) = h(0) \prod_{0 < s \le t} \big(1 + u(s)\Delta f(s)\big)\,\exp\Big(\int_0^t u(s)\,df_c(s)\Big), \quad t \ge 0,$$
satisfying $\sup_{s \le t} |h(s)| < \infty$ for each $t \ge 0$.

Proof. Let
$$g_1(t) = h(0) \prod_{0 < s \le t} \big(1 + u(s)\Delta f(s)\big)$$
and
$$g_2(t) = \exp\Big(\int_0^t u(s)\,df_c(s)\Big).$$

Now $g_1$ and $g_2$ are right-continuous and of bounded variation, so we can use integration by parts to get
$$h(t) = g_1(t)g_2(t) = g_1(0)g_2(0) + \int_0^t g_1(s-)\,dg_2(s) + \int_0^t g_2(s)\,dg_1(s).$$
By the definition of $g_2$ we have
$$\int_0^t g_1(s-)\,dg_2(s) = \int_0^t g_1(s-)\,d\Big[\exp\Big(\int_0^s u(r)\,df_c(r)\Big)\Big] = \int_0^t g_1(s-)\exp\Big(\int_0^s u(r)\,df_c(r)\Big)u(s)\,df_c(s) = \int_0^t g_1(s-)g_2(s)u(s)\,df_c(s),$$
where we note that $g_2$ is continuous, so $g_2(s-) = g_2(s)$. If there is no jump in $f$ at the point $s$, then $\Delta g_1(s) = 0$. If there is a jump in $f$ at $s$, then
$$\Delta g_1(s) = g_1(s) - g_1(s-) = \big(1 + u(s)\Delta f(s)\big)g_1(s-) - g_1(s-) = u(s)\Delta f(s)\,g_1(s-).$$
And so we have
$$\int_0^t g_2(s)\,dg_1(s) = \sum_{0 < s \le t} g_2(s)u(s)g_1(s-)\Delta f(s).$$
Putting this together,
$$h(t) = g_1(0)g_2(0) + \int_0^t g_1(s-)\,dg_2(s) + \int_0^t g_2(s)\,dg_1(s) = h(0) + \int_0^t h(s-)u(s)\,df(s).$$

So $h$ is a solution to the given integral equation. Now we apply this theorem to our next result, giving conditions for a simple point process to be Poisson. Recall that by the cumulative process $N$ of a random measure $\xi$ on $R_+$ we mean $N_t = \xi(0, t]$.

Theorem 4.3.2 Let $N$ be the cumulative process of a simple point process with compensator $A_t = \mu(0, t]$, where $\mu$ is a $\sigma$-finite measure. Then $N$ is the cumulative process of a Poisson process with intensity $\mu$.

Proof. Let $\theta$ be fixed. Define
$$M_t = \exp\{i\theta N_t + (1 - e^{i\theta})A_t\}.$$
Referring to Theorem 4.3.1, we see that this is the solution of the integral equation
$$M_t = 1 + \int_0^t M_{s-}(e^{i\theta} - 1)\,d(N_s - A_s).$$
The integrand on the right is left-continuous and adapted, hence predictable. Since $N - A$ is a martingale, Lemma 3.2.1 shows that the integral is a martingale starting at zero. Taking expectations on both sides,
$$E[M_t] = E\Big[1 + \int_0^t M_{s-}(e^{i\theta} - 1)\,d(N_s - A_s)\Big] = 1.$$
Replacing $M_t$ by its definition and using the fact that $A$ is assumed to be deterministic, we have
$$E[\exp\{i\theta N_t + (1 - e^{i\theta})A_t\}] = \exp\{(1 - e^{i\theta})A_t\}\,E[e^{i\theta N_t}] = 1.$$

Dividing by the exponential factor in $A$, we get
$$E[e^{i\theta N_t}] = \exp\{(e^{i\theta} - 1)A_t\},$$
which is the characteristic function of a Poisson distribution with mean $A_t$. Repeating the argument with conditional expectations, using the martingale property $E[M_t \mid F_r] = M_r$, we obtain, for $0 \le r < t$,
$$E[e^{i\theta(N_t - N_r)} \mid F_r] = \exp\{(e^{i\theta} - 1)(A_t - A_r)\},$$
which shows that $N$ has independent, Poisson-distributed increments and is therefore the cumulative process associated with a Poisson process.

We now state the main result of the section, showing that a time-changed cumulative process of a point process is the cumulative process of a Poisson process.

Theorem 4.3.3 Let $\xi$ be an $F$-adapted simple point process and $N_t = \xi(0, t]$. Let $A$ be the compensator of $N$. Assume $A$ is continuous and a.s. unbounded. Define $\tau_s = \inf\{t \ge 0;\ A_t > s\}$. Then $N_{\tau_s} = \eta(0, s]$, where $\eta$ is a unit-rate Poisson process.

Proof. Referring back to Theorem 2.1.1, we see that $\tau_s$ is right-continuous and $N_{\tau_s}$ is $F_{\tau_s}$-adapted. Further, by the continuity of $A$, the process $\tau$ can have jumps at only countably many $s$. By the definitions of $N$ and $A$, $N \circ \tau$ can increase only by integer-valued jumps. Since $\tau$ is right-continuous with left limits, the jumps of $\tau$ occur exactly where $A$ is constant. Assume $A$ is constant over the interval $(a, b]$; by the martingale property of compensators,
$$E[N_b - N_a \mid F_a] = E[A_b - A_a \mid F_a] = 0.$$

So, given $F_a$, $N_b - N_a = 0$ a.s. Thus there are no jumps in $N \circ \tau$ where $\tau$ is discontinuous. Since $N$ is simple, where $\tau$ is continuous $N_{\tau_s}$ can increase only by unit jumps. Therefore $N \circ \tau$ is simple. Referring to Theorem 4.3.2, we only need to show that $N_{\tau_s}$ has compensator $s$. By definition $A_{\tau_s} = s$ for all $s \ge 0$. Recalling that $\tau_s$ is an optional time for each $s$, we can apply the optional sampling theorem: for $s \le t$,
$$E[N_{\tau_t} - t \mid F_{\tau_s}] = E[N_{\tau_t} - A_{\tau_t} \mid F_{\tau_s}] = N_{\tau_s} - A_{\tau_s} = N_{\tau_s} - s.$$
So $N_{\tau_s} - s$ is an $F_{\tau_s}$-martingale, and by the uniqueness of the compensator, $s$ is the compensator of $N \circ \tau$.
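Theorem 4.3.3 is easy to check by simulation. In the sketch below (an illustration only; the intensity $\lambda(t) = 2t$, the level $s = 4$, and all tolerances are arbitrary choices), an inhomogeneous Poisson process with continuous compensator $A_t = t^2$ is generated by thinning; counting the points whose transformed times $A(t_i) = t_i^2$ fall in $[0, s]$ should give a Poisson$(s)$ count, so the sample mean and variance of the count should both be close to $s$.

```python
import random

def time_changed_counts(s_level=4.0, T=3.0, n_runs=4000, seed=3):
    # Simple point process on [0, T] with intensity lambda(t) = 2t,
    # hence continuous compensator A_t = t^2.  Generated by thinning
    # a rate lam_max Poisson process.  The time change s = A_t turns
    # the counting process into a unit-rate Poisson process, so the
    # number of points with A(t_i) <= s_level is Poisson(s_level).
    lam_max = 2.0 * T                  # bound for lambda(t) on [0, T]
    rng = random.Random(seed)
    counts = []
    for _ in range(n_runs):
        t, n = 0.0, 0
        while True:
            t += rng.expovariate(lam_max)          # candidate point
            if t > T:
                break
            if rng.random() < 2.0 * t / lam_max:   # accept w.p. lambda(t)/lam_max
                if t * t <= s_level:               # transformed time A(t) = t^2
                    n += 1
        counts.append(n)
    mean = sum(counts) / n_runs
    var = sum((c - mean) ** 2 for c in counts) / n_runs
    return mean, var

print(time_changed_counts())   # mean and variance should both be close to 4
```

Equality of mean and variance is the elementary fingerprint of the Poisson distribution, which is what the time change is supposed to produce.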

Chapter 5 Application of Time Change to Stochastic Differential Equations

In this last chapter we discuss an application of the previous ideas on random time change to the area of stochastic differential equations (SDEs). First we define stochastic differential equations and some basic related concepts. Then we discuss the concept of Brownian local time. Lastly we construct solutions to certain SDEs using optional times, proving Engelbert and Schmidt's necessary and sufficient conditions for solutions to such SDEs.

5.1 Stochastic Differential Equations

Our theorems involving stochastic differential equations, abbreviated SDEs, concern equations of the basic form
$$dX_t = \sigma(X_t)\,dB_t + b(X_t)\,dt \tag{5.1}$$
where $B$ is a one-dimensional Brownian motion, and $\sigma$ and $b$ are measurable functions on $R$. For our purposes we only define stochastic differential equations in the one-dimensional case, but the concept extends to higher dimensions. We refer to [2] for more information on general SDEs. We define a weak solution of the stochastic differential equation with initial distribution $\mu$ to be a process $X$, together with a probability space $(\Omega, F, P)$, a Brownian motion $B$, and a random variable $\xi$ with $L(\xi) = \mu$, such that $X$ satisfies (5.1) on $(\Omega, F, P)$ with this $B$ and with $X_0 = \xi$. Further, weak existence holds for a stochastic differential equation provided there is a weak solution to the SDE. Uniqueness in

law means that any two weak solutions with initial distribution $\mu$ have the same distribution. It is also often possible to remove the drift term from the above SDE by either a change of the underlying probability measure or a change of the state space. In this way we can reduce our SDE to
$$dX_t = \sigma(X_t)\,dB_t.$$
Using this SDE without a drift term, it is possible to construct weak solutions by random time change. We will discuss this after we introduce Brownian local time. For further discussion and proofs on removing the drift term we refer to [2].

5.2 Brownian Local Time

Let $B$ be a Brownian motion and $x \in R$. To gain information about the time a path of $B$ spends near $x$, we would like to look at the set $\{t \ge 0;\ B_t(\omega) = x\}$; however, this set has Lebesgue measure zero. So, in order to measure the time a Brownian path spends around the point $x$, we introduce the process $L$.

Theorem 5.2.1 Let $B$ be a Brownian motion. Then there exists an a.s. jointly continuous process $L^x_t$ on $R_+ \times R$ such that, for every Borel set $A \subseteq R$ and $t \ge 0$,
$$\int_0^t 1\{B_s \in A\}\,ds = \int_A L^x_t\,dx.$$

The process $L$ defined in the theorem above is called the local time of the Brownian motion $B$. We can also represent the local time at a point $x \in R$ of any

semimartingale $X$ by the following formula, due to Tanaka:
$$L^x_t = |X_t - x| - |X_0 - x| - \int_0^t \mathrm{sgn}(X_s - x)\,dX_s, \quad t \ge 0,$$
where
$$\mathrm{sgn}(x) = \begin{cases} 1 & x > 0 \\ -1 & x \le 0. \end{cases}$$
Next we define a nondecreasing, measurable, adapted process $A$ to be a continuous additive functional if
$$A_{t+s} = A_s + A_t \circ \theta_s \quad \text{a.s.}, \quad s, t \ge 0,$$
where $\theta_s$ is the shift operator by $s \ge 0$. Now we state, without proof, the relationship between continuous additive functionals of Brownian motion and local time of Brownian motion.

Theorem 5.2.2 For a Brownian motion $X$ in $R$ with local time $L$, a process $A$ is a continuous additive functional of $X$ iff it has the a.s. representation
$$A_t = \int_R L^x_t\,\nu(dx), \quad t \ge 0,$$
for some locally finite measure $\nu$ on $R$.

Refer to [3] for more information about continuous additive functionals and local time of Brownian motion.
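The occupation-density property in Theorem 5.2.1 suggests a direct numerical estimate of local time: $L^0_1 \approx (2\epsilon)^{-1}\,\mathrm{Leb}\{s \le 1;\ |B_s| < \epsilon\}$. The sketch below (our illustration; the window $\epsilon$ and the grids are arbitrary choices) averages this estimate over many paths and compares it with the exact mean $E L^0_1 = E|B_1| = \sqrt{2/\pi} \approx 0.798$, which follows from Tanaka's formula since the stochastic integral there has mean zero.

```python
import math
import random

def mean_local_time_estimate(n_paths=2000, n_steps=2000, eps=0.05, seed=11):
    # Occupation-density estimate of the local time of B at 0 up to t = 1:
    #   L^0_1 ~ (1 / (2 * eps)) * Leb{ s <= 1 : |B_s| < eps }.
    rng = random.Random(seed)
    dt = 1.0 / n_steps
    total = 0.0
    for _ in range(n_paths):
        b, occ = 0.0, 0.0
        for _ in range(n_steps):
            if abs(b) < eps:
                occ += dt
            b += rng.gauss(0.0, math.sqrt(dt))
        total += occ / (2.0 * eps)
    return total / n_paths

print(mean_local_time_estimate())   # should be near sqrt(2/pi) ~ 0.798
```

The finite window $\epsilon$ and the time discretization introduce a small bias, so the agreement is approximate; shrinking $\epsilon$ together with the step size tightens it.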

5.3 Application of Time Change to SDEs

In this section we use random time change to construct weak solutions to SDEs in the one-dimensional case,
$$dX_t = \sigma(X_t)\,dB_t, \tag{5.2}$$
with initial distribution $\mu$. We first give an informal construction of weak solutions to (5.2). To do this, let $Y$ be a Brownian motion with respect to some filtration $F$ and let $X_0$ be an $F_0$-measurable random variable with distribution $\mu$. Now we look at the continuous process $Z_t = X_0 + Y_t$, $t \ge 0$. Using $Z$ we create the increasing process
$$\rho_t = \int_0^t \sigma^{-2}(Z_s)\,ds, \quad t \ge 0.$$
Now we create the inverse process,
$$\tau_s = \inf\{t \ge 0;\ \rho_t > s\}, \quad s \ge 0.$$
Referring to Theorem 2.1.1, we see that $\tau_s$ is also a process of optional times. Now, for $X_s = Z_{\tau_s}$ with filtration $G_s = F_{\tau_s}$, we can find a Brownian motion $B$ with respect to $G$ such that together they form a weak solution to $dX_t = \sigma(X_t)\,dB_t$ with initial distribution $\mu$. Problems with this construction can occur, depending on the measurable function $\sigma$. Also, we have not described how the Brownian motion $B$ is found. To answer these questions we will use this construction formally to prove Engelbert and
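The informal construction can be carried out numerically. In this sketch (illustrative only; the function $\sigma$, the grids, and the tolerance are our arbitrary choices, and $\sigma$ is bounded away from zero so the clock $\rho$ is strictly increasing), we time-change a Brownian motion $Z$ by the inverse $\tau$ of $\rho_t = \int_0^t \sigma^{-2}(Z_u)\,du$ and check that $X_s = Z_{\tau_s}$ has quadratic variation close to $\int_0^s \sigma^2(X_u)\,du$, as a solution of $dX = \sigma(X)\,dB$ must.

```python
import math
import random

def time_changed_weak_solution(n_steps=200000, T=20.0, s_max=1.0, n_s=200, seed=5):
    sigma = lambda x: 1.0 + 0.5 * math.cos(x)   # bounded in [0.5, 1.5]
    rng = random.Random(seed)
    dt = T / n_steps
    # Simulate Z (Brownian motion) and the clock rho_t = int sigma^{-2}(Z_u) du.
    Z, rho = [0.0], [0.0]
    for _ in range(n_steps):
        rho.append(rho[-1] + dt / sigma(Z[-1]) ** 2)
        Z.append(Z[-1] + rng.gauss(0.0, math.sqrt(dt)))
    assert rho[-1] > s_max, "increase T so that the clock reaches s_max"
    # Invert the clock: tau_s = inf{t : rho_t > s}, then X_s = Z_{tau_s}.
    X, j = [], 0
    ds = s_max / n_s
    for i in range(n_s + 1):
        s = i * ds
        while rho[j] <= s:
            j += 1
        X.append(Z[j])
    # Compare [X]_{s_max} with int_0^{s_max} sigma^2(X_u) du.
    qv = sum((X[i + 1] - X[i]) ** 2 for i in range(n_s))
    integ = sum(sigma(X[i]) ** 2 * ds for i in range(n_s))
    return qv, integ

qv, integ = time_changed_weak_solution()
print(qv, integ)   # the two numbers should be of comparable size
```

The quadratic-variation estimate from a coarse grid is itself random, so only rough agreement is expected; refining the grid in $s$ tightens the match.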

Schmidt's theorem, which gives the exact conditions $\sigma$ must satisfy for a weak solution to exist. In the following proofs we remove the condition that a Brownian motion $B$ must have $B_0 = 0$. This allows our Brownian motion to have initial distribution $\mu$ and removes the need for the random variable $X_0$ in the above construction.

Theorem 5.3.1 The SDE $dX_t = \sigma(X_t)\,dB_t$ has a weak solution for every initial distribution $\mu$ if and only if $I(\sigma) \subseteq Z(\sigma)$, where
$$I(\sigma) = \Big\{x \in R;\ \lim_{\epsilon \downarrow 0} \int_{x-\epsilon}^{x+\epsilon} \frac{dy}{\sigma^2(y)} = \infty\Big\}, \tag{5.3}$$
and
$$Z(\sigma) = \{x \in R;\ \sigma(x) = 0\}. \tag{5.4}$$

First we prove a lemma relating additive functionals built from Brownian local time to the behavior of a measure $\nu$ near the points visited by the Brownian path.

Lemma 5.3.1 Let $L$ be the local time of a Brownian motion $B$ with arbitrary initial distribution, and let $\nu$ be a measure on $R$. Define
$$A_t = \int_R L^x_t\,\nu(dx), \quad t \ge 0,$$
and
$$S_\nu = \Big\{x \in R;\ \lim_{\epsilon \downarrow 0} \nu(x - \epsilon, x + \epsilon) = \infty\Big\}.$$
Then a.s.
$$\inf\{s \ge 0;\ A_s = \infty\} = \inf\{s \ge 0;\ B_s \in S_\nu\}.$$

Proof. Let $t > 0$, and let $R$ be the event that $B_s \notin S_\nu$ on $[0, t]$. Now $L^x_t = 0$ a.s. for $x$ outside the range of $B$ on $[0, t]$. Then we get, a.s. on $R$,
$$A_t = \int L^x_t\,\nu(dx) \le \nu(B[0, t]) \sup_x L^x_t < \infty,$$
since the range of $B$ on $[0, t]$ is compact, being the continuous image of a compact set, and hence has finite $\nu$-measure when it avoids $S_\nu$, while $L^x_t$ is a.s. continuous in $x$ and therefore bounded on compact sets.

Conversely, assume that $B_s \in S_\nu$ for some $s < t$. If $\tau = \inf\{s \ge 0;\ B_s \in S_\nu\}$, then $B_\tau \in S_\nu$. By the strong Markov property, the shifted process $\tilde B_t = B_{\tau + t}$, $t \ge 0$, with $\tilde B_0 = B_\tau$, is a Brownian motion started in $S_\nu$. We can therefore reduce to the case where $B_0 = a \in S_\nu$. Then $L^a_t > 0$ a.s. by Tanaka's formula, so by the continuity of $L$ in $x$ we get, for some $\epsilon > 0$,
$$A_t = \int L^x_t\,\nu(dx) \ge \nu(a - \epsilon, a + \epsilon) \inf_{|x - a| < \epsilon} L^x_t = \infty.$$

We also need the following lemma, which shows that a continuous local martingale $M$ with absolutely continuous quadratic variation can be represented as a stochastic integral with respect to a Brownian motion $B$.

Lemma 5.3.2 Let $M$ be a continuous local $F$-martingale with $M_0 = 0$ and $[M] = V^2 \cdot \lambda$ a.s. for some $F$-progressive process $V$. Then there exists a Brownian motion $B$ with respect to a standard extension of $F$ such that $M = V \cdot B$ a.s.

Proof. Define $B = V^{-1} \cdot M$, where $V^{-1} = 1/V$ and $V^{-1} = 0$ when $V = 0$. As a stochastic integral with respect to a continuous local martingale, $B$ is a continuous

local martingale, and
$$[B]_t = [V^{-1} \cdot M]_t = \int_0^t (V^{-1}_s)^2\,d[M]_s = \int_0^t (V^{-1}_s)^2 V_s^2\,ds = t,$$
provided $V$ never vanishes. So $B$ is a Brownian motion by Theorem 3.3.1, and $M = V \cdot B$ a.s. However, this only works if $V$ does not become zero. If $V$ may vanish, let $Z$ be a Brownian motion independent of $F_\infty$ with induced filtration $\mathcal Z$; then $G = \sigma\{F, \mathcal Z\}$ is a standard extension of both $F$ and $\mathcal Z$. Therefore $V$ is $G$-progressive, $M$ is a continuous local $G$-martingale, and $Z$ is a $G$-Brownian motion. Let $B = V^{-1} \cdot M + U \cdot Z$, where $U = 1\{V = 0\}$. Then $[B]_t = \int_0^t \big((V^{-1}_s)^2 V_s^2 + U_s^2\big)\,ds = t$, so $B$ is a Brownian motion. To see that $M = V \cdot B$, we note that $V U = 0$ and
$$(V \cdot B)_t = \int_0^t V_s V^{-1}_s\,dM_s + \int_0^t V_s U_s\,dZ_s = \int_0^t 1\{V_s \ne 0\}\,dM_s = M_t,$$
where the last equality holds because $\int_0^t 1\{V_s = 0\}\,dM_s$ has quadratic variation $\int_0^t 1\{V_s = 0\} V_s^2\,ds = 0$.

We proceed to prove Theorem 5.3.1.

Proof. Assume $I(\sigma) \subseteq Z(\sigma)$. Let $Y$ be a Brownian motion with respect to a filtration $G$ and with initial distribution $\mu$. Define
$$A_s = \int_0^s \sigma^{-2}(Y_u)\,du, \quad s \ge 0.$$
Also define
$$\tau_t = \inf\{s \ge 0;\ A_s > t\}, \quad t \ge 0, \qquad \text{and} \qquad \tau_\infty = \inf\{s \ge 0;\ A_s = \infty\}.$$

Now let $R = \inf\{s \ge 0;\ Y_s \in I(\sigma)\}$. By Lemma 5.3.1, $R = \tau_\infty$. The process $A$ is continuous and strictly increasing for $s < R$. Then, by Theorem 2.1.1, $\tau$ is a continuous process of optional times, strictly increasing for $t < A_R$. Further we have $A_{\tau_t} = t$ for $t < A_R$, and $\tau_{A_s} = s$ for $s < R$. Therefore we conclude $A_s = \inf\{t \ge 0;\ \tau_t > s\}$ a.s. for $s \ge 0$. By the optional sampling theorem we have, for $t_1 \le t_2 < \infty$,
$$E[Y_{\tau_{t_2} \wedge \tau_{A_n}} \mid G_{t_1}] = E[Y_{\tau_{t_2} \wedge n} \mid G_{t_1}] = Y_{\tau_{t_1} \wedge n} = Y_{\tau_{t_1} \wedge \tau_{A_n}}.$$
Since $A_n \to \infty$ as $n \to \infty$, $Y \circ \tau$ is a continuous local martingale. Also $Y_{\tau_t}^2 - \tau_t$ is a continuous local martingale, and by the uniqueness of quadratic variation we have $[Y \circ \tau]_t = \tau_t$ for $t \ge 0$. Define $X_t = Y_{\tau_t}$; then $[X]_t = \tau_t$. For $t \le A_R$,
$$\tau_t = \int_0^{\tau_t} \sigma^2(Y_u)\,d\Big(\int_0^u \sigma^{-2}(Y_r)\,dr\Big) = \int_0^{\tau_t} \sigma^2(Y_u)\,dA_u.$$
Then, by a change of variables,
$$\int_0^{\tau_t} \sigma^2(Y_u)\,dA_u = \int_0^t \sigma^2(Y_{\tau_u})\,dA_{\tau_u} = \int_0^t \sigma^2(X_u)\,du.$$
Thus we get
$$\tau_t = \int_0^t \sigma^2(X_u)\,du, \quad t \le A_R.$$

To show that this equality holds for all $t \ge 0$, first note that $A_t = \infty$ for all $t > R$ by Lemma 5.3.1. Hence
$$\tau_t = \tau_{A_R} = R, \quad t \ge A_R.$$
To see that $\int_0^t \sigma^2(X_u)\,du$ is also equal to $R$ for $t \ge A_R$, we first note that $X_t = X_{A_R} = Y_{\tau_{A_R}} = Y_R$ for $t \ge A_R$. Recalling that $R = \inf\{s \ge 0;\ Y_s \in I(\sigma)\}$ and the original assumption $I(\sigma) \subseteq Z(\sigma)$, we see that $\sigma(X_t) = \sigma(Y_R) = 0$ for $t \ge A_R$. Thus $\int_0^t \sigma^2(X_u)\,du = \tau_t = R$ for $t \ge A_R$, which means that
$$\tau_t = [X]_t = \int_0^t \sigma^2(X_u)\,du \quad \text{for all } t \ge 0.$$
By Lemma 5.3.2, there exists a Brownian motion $B$ such that $X_t = \int_0^t \sigma(X_u)\,dB_u$. So $X$ is a weak solution to the stochastic differential equation $dX_t = \sigma(X_t)\,dB_t$ with initial distribution $\mu$.

To prove the converse, let $x \in I(\sigma)$ and let $X$ be a solution to the stochastic differential equation $dX_t = \sigma(X_t)\,dB_t$ with $X_0 = x$. By the definition of stochastic integrals, $X$ is a continuous local martingale, and by Theorem 3.4.1 we have $X_t = Y_{[X]_t}$ for some Brownian motion $Y$. Also,
$$[X]_t = [\sigma(X) \cdot B]_t = \int_0^t \sigma^2(X_u)\,d[B]_u = \int_0^t \sigma^2(X_u)\,du.$$

Let τ t = [X] t. For s, define A s = s σ 2 (Y r )dr. Then, for t, A τt = = = τt σ 2 (Y r )dr = σ 2 (X s )d( s σ 2 (X s )dτ s (5.5) σ 2 (X u )du) (5.6) 1{σ 2 (X s ) > }ds t. (5.7) Since X = x I(σ), Lemma 5.3.1 gives A s = for s >, so τ t = a.s. which implies X t = x a.s. Further, τ t = σ2 (X s )ds = a.s. and so x Z(σ). In Theorem 5.3.1 we have just proved that a stochastic differential equation dx t = σ(x t )db t has a necessary and sufficient condition for weak existence. We now prove a necessary and sufficient condition for uniqueness in law. Theorem 5.3.2 For every initial distribution µ, the stochastic differential equation dx t = σ(x t )db t has a solution which is unique in law iff I(σ) = Z(σ), where I(σ) is given by (5.3) and Z(σ) by (5.4) in Theorem 5.3.1. Proof. By Theorem 5.3.1, I(σ) Z(σ) is the sufficient condition for a solution to exist. So we must assume I(σ) Z(σ) in order to have a solution. To show that I(σ) = Z(σ) is necessary for uniqueness in law, we will prove the contraposition which is that if I(σ) is a proper subset of Z(σ) we can create solutions that are not unique in law. To this end, let I(σ) Z(σ) and x Z(σ)\I(σ). We can create a solution, as we did in Theorem 5.3.1, X = Y τt where Y is a Brownian motion starting at x. And τ t = inf{s > ; A s > t} for t with A s = s σ 2 (Y r )dr for s. To create another solution to the SDE, we let ˆX t x, which is a solution since x Z(σ). Both solutions X and ˆX have the same initial distribution µ. However they are not equal in distribution. The solution ˆX is constant. For the solution X, since 39

$x \notin I(\sigma)$, we have $A_s < \infty$ a.s. for $s > 0$ by Lemma 5.3.1. So, by definition, $\tau_t > 0$ a.s. for $t > 0$, and $X$, as a time-changed Brownian motion, is a.s. not constant. So $X$ and $\hat X$ are not equal in law, and uniqueness in law fails.

Now we show that $I(\sigma) = Z(\sigma)$ is a sufficient condition for uniqueness in law. Once again, since Theorem 5.3.1 requires $I(\sigma) \subseteq Z(\sigma)$ for the existence of a solution, we only need to show that the additional inclusion $Z(\sigma) \subseteq I(\sigma)$ yields uniqueness in law. So let $I(\sigma) = Z(\sigma)$ and let $X$ be a solution to the SDE with initial distribution $\mu$. Again $X_t = Y_{\tau_t}$, where $Y$ is a Brownian motion with initial distribution $\mu$ and $\tau_t = \int_0^t \sigma^2(X_s)\,ds$ for $t \ge 0$. Define again $A_s = \int_0^s \sigma^{-2}(Y_r)\,dr$ for $s \ge 0$, and $S = \inf\{t \ge 0;\ X_t \in I(\sigma)\}$. Now $\tau_S = R = \inf\{r \ge 0;\ Y_r \in I(\sigma)\}$. Since $S$ is the first time $X$ is in $I(\sigma)$, and since $I(\sigma) = Z(\sigma)$, before time $S$ the process $X$ lies in neither set; so, referring back to the computation (5.5), we have, for $t \le S$,
$$A_{\tau_t} = \int_0^{\tau_t} \sigma^{-2}(Y_r)\,dr = \int_0^t 1\{\sigma^2(X_s) > 0\}\,ds = t.$$
We also know that $A_s = \infty$ for $s > R$ by Lemma 5.3.1, and so the computation (5.5) implies $\tau_t \le R$ a.s. for all $t$. So $\tau$ is constant after time $S$. Now we can once again write $\tau_t = \inf\{s \ge 0;\ A_s > t\}$ for $t \ge 0$. This shows that $\tau$ is a measurable function of $Y$. Furthermore, since $X_t = Y_{\tau_t}$, $X$ is a measurable function of $Y$. Since $Y$ is a Brownian motion with initial distribution $\mu$, its distribution is determined by $\mu$. Since the same argument applies to any solution $X$, all solutions have distributions determined by $\mu$. This proves uniqueness in law.
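A standard illustration of Theorems 5.3.1 and 5.3.2 (our example, not from the text above) is $\sigma(x) = |x|^\alpha$ with $\alpha > 0$. Here $Z(\sigma) = \{0\}$, and $0 \in I(\sigma)$ iff $\int_{-\epsilon}^{\epsilon} |y|^{-2\alpha}\,dy = \infty$, i.e. iff $\alpha \ge 1/2$. So $I(\sigma) \subseteq Z(\sigma)$ always holds and weak solutions exist for every $\alpha > 0$, while uniqueness in law holds iff $\alpha \ge 1/2$; for $0 < \alpha < 1/2$ both the time-changed Brownian motion and $\hat X \equiv 0$ solve the SDE started at $0$. The divergence test behind $I(\sigma)$ is easy to check numerically:

```python
import math

def near_zero_integral(alpha, delta, eps=0.1):
    # int_{delta <= |y| <= eps} |y|^{-2 alpha} dy in closed form;
    # letting delta -> 0 probes whether 0 belongs to I(sigma).
    p = 1.0 - 2.0 * alpha
    if p == 0.0:
        return 2.0 * (math.log(eps) - math.log(delta))
    return 2.0 * (eps ** p - delta ** p) / p

# alpha = 0.25: the integral stays bounded as delta -> 0, so 0 is not in I(sigma).
# alpha = 0.75: the integral blows up as delta -> 0, so 0 is in I(sigma).
for alpha in (0.25, 0.75):
    print(alpha, [near_zero_integral(alpha, d) for d in (1e-2, 1e-4, 1e-6)])
```

For $\alpha = 0.25$ the values converge to $4\sqrt{0.1} \approx 1.26$, while for $\alpha = 0.75$ they grow without bound as $\delta \downarrow 0$, matching the $\alpha \ge 1/2$ dichotomy.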

Bibliography

[1] Daley, D.J. and Vere-Jones, D. (2008). An Introduction to the Theory of Point Processes, Vol. I & II. Springer, NY.

[2] Ikeda, N. and Watanabe, S. (1989). Stochastic Differential Equations and Diffusion Processes, 2nd ed. North-Holland, Amsterdam.

[3] Kallenberg, O. (2002). Foundations of Modern Probability, 2nd ed. Springer, NY.

[4] Karatzas, I. and Shreve, S.E. (1991). Brownian Motion and Stochastic Calculus, 2nd ed. Springer, NY.