Random Time Change with Some Applications. Amy Peterson
Random Time Change with Some Applications, by Amy Peterson. A thesis submitted to the Graduate Faculty of Auburn University in partial fulfillment of the requirements for the Degree of Master of Science. Auburn, Alabama, May 4, 2014. Approved by Olav Kallenberg, Chair, Professor of Mathematics; Ming Liao, Professor of Mathematics; Erkan Nane, Professor of Mathematics; Jerzy Szulga, Professor of Mathematics.
Abstract

This thesis is a survey of known results concerning random time change and its applications. It will cover basic probabilistic concepts and then follow with a detailed look at major results in several branches of probability, all concerning random time change. The first of these major results is a theorem on how an increasing process adapted to a filtration can be used to transform the time scale and filtration. Next we show how an arbitrary continuous local martingale can be changed into a Brownian motion. We then show that a simple point process can be changed into a Poisson process using a random time change. Lastly, we look at an application of random time change to create solutions of stochastic differential equations.
Acknowledgments

I would like to thank my advisor, Dr. Olav Kallenberg, for his advice and encouragement. I would also like to thank my committee members for their support. Furthermore I am grateful to everyone at the Auburn University Mathematics Department, my family, and friends.
Table of Contents

Abstract
Acknowledgments
1 Introduction
  1.1 Summary
  1.2 Definitions and Primary Concepts
  1.3 Martingales and Brownian Motion
2 Time Change of Filtrations
  2.1 Time Change of Filtrations
3 Time Change of Continuous Martingales
  3.1 Quadratic Variation
  3.2 Stochastic Integration
  3.3 Brownian Motion as a Martingale
  3.4 Time Change of Continuous Martingales
  3.5 Time Change of Continuous Martingales in Higher Dimensions
4 Time Change of Point Processes
  4.1 Random Measures and Point Processes
  4.2 Doob-Meyer Decomposition
  4.3 Time Change of Point Processes
5 Application of Time Change to Stochastic Differential Equations
  5.1 Stochastic Differential Equations
  5.2 Brownian Local Time
  5.3 Application of Time Change to SDEs
Bibliography
Chapter 1
Introduction

1.1 Summary

This thesis discusses the subject of random time change by looking at several known results in various areas of probability theory. In the first chapter, we give several basic definitions and theorems of probability theory, including a section discussing martingales and Brownian motion. These definitions and theorems will be used throughout the thesis and can be found in most basic probability texts. In the second chapter we begin with our results on random time change. The main result of the second chapter shows how an increasing process adapted to a filtration can be used to create a process of optional times that transforms the time scale and filtration. This theorem will reappear in the following chapters, particularly in regard to the creation of a process of optional times. In the third chapter we begin with a discussion of the quadratic variation process and stochastic integration. These topics are also fundamental in probability theory and will be important for all further results in the thesis. We will omit some of the proofs of the more detailed results but will include references. We then use these new concepts to prove Lévy's characterization of Brownian motion. This theorem shows that a Brownian motion is a martingale and gives conditions for a continuous local martingale to be a Brownian motion. We then use Lévy's characterization of Brownian motion to prove the main result of chapter three, namely that, using a process of optional times, we can change an arbitrary continuous local martingale into a
Brownian motion. Our process of optional times depends on the quadratic variation of the local martingale, and so we will break the proof of the main result into two cases depending on whether the limit of the quadratic variation is finite or infinite. Lastly in chapter three, we discuss two different ways to extend our main result to higher dimensions. To start our fourth chapter we introduce random measures, point processes, and Poisson processes. Following that we introduce the Doob-Meyer decomposition of a submartingale and explain its relation to random measures. The Doob-Meyer decomposition is an in-depth topic in probability theory; we will only mention it and give references for further study. Lastly, we prove the main result of the chapter, namely that an arbitrary simple point process can be changed into a Poisson process. In our last chapter, we look at an application of some of our previous results to stochastic differential equations (SDEs). We begin the chapter by discussing what stochastic differential equations are and the type of stochastic differential equations we are interested in. We then discuss the challenging topic of Brownian local time and continuous additive functionals. Lastly we prove a necessary and sufficient condition for solutions to certain stochastic differential equations by constructing solutions to the SDEs using random time change.

1.2 Definitions and Primary Concepts

Let (Ω, A, P) be a probability space and T a subset of R̄ = [−∞, ∞]. A non-decreasing family F = (F_t) of σ-fields such that F_t ⊂ A for t ∈ T is called a filtration on T. A process X is said to be adapted to a filtration F = (F_t) if X_t is F_t-measurable for every t ∈ T. Given a process X, the smallest filtration F such that X is adapted to F is the filtration generated by X, that is, F_t = σ{X_s; s ≤ t}. Also define F_∞ = σ(∪_t F_t). If F is a filtration on T = R_+ we can define another
filtration F_+ by F_{t+} = ∩_{h>0} F_{t+h}. We call a filtration F on R_+ right-continuous if F = F_+. Note that F_+ = (F_+)_+, so F_+ is itself right-continuous. Unless stated otherwise, filtrations on R_+ are assumed to be right-continuous.

Let F_t = σ{X_s; s ≤ t} for some process X, and let N_t = {F ⊂ Ω; F ⊂ G for some G ∈ F_t with P(G) = 0}, the collection of all null sets. Then the filtration H defined by H_t = σ(F_t ∪ N_t) for all t is called the completion of the filtration, and any filtration with the above properties is called a complete filtration.

A random time τ is a measurable mapping τ: Ω → T̄, where T̄ is the closure of T. Given a filtration F on T, a random time τ is called an optional time if {ω; τ(ω) ≤ t} ∈ F_t for every t ∈ T. Further, we call a random time τ weakly optional if {ω; τ(ω) < t} ∈ F_t for every t ∈ T. We define the σ-field F_τ associated with an optional time τ by
F_τ = {A ∈ A; A ∩ {τ ≤ t} ∈ F_t, t ∈ T}.
The first lemma shows that weakly optional and optional times coincide when the filtration is right-continuous.

Lemma. If F is any filtration and τ is an F-optional time, then τ is F-weakly optional. If F is a right-continuous filtration and τ is an F-weakly optional time, then τ is F-optional.

Proof. Let τ be an F-optional time. Now
{τ < t} = ∪_n {τ ≤ t − 1/n} ∈ F_t,
and so τ is an F-weakly optional time.
Let F be right-continuous and τ an F-weakly optional time. Then
{τ ≤ t} = ∩_{h>0} {τ < t + h} ∈ ∩_{h>0} F_{t+h} = F_{t+}.
Thus τ is F_+-optional. But F is right-continuous, so F = F_+, which means that τ is F-optional.

The following lemma expresses a closure property of optional times and their associated σ-fields.

Lemma. If F is a right-continuous filtration and the τ_n are F-optional times, then τ = inf_n τ_n is an optional time and F_τ = ∩_n F_{τ_n}.

Proof. Since {τ < t} = ∪_n {τ_n < t} ∈ F_t for all t, τ is weakly optional and thus optional, by the right-continuity of F. To prove the second part we note that, again since F is right-continuous, (F_+)_τ = F_τ. Let A ∈ ∩_n F_{τ_n}. Then
A ∩ {τ < t} = A ∩ ∪_n {τ_n < t} = ∪_n (A ∩ {τ_n < t}) ∈ F_t,
and so ∩_n F_{τ_n} ⊂ F_τ. To get the reverse inclusion, note that τ ≤ τ_n for every n, so for any A ∈ F_τ we have
A ∩ {τ_n ≤ t} = A ∩ {τ ≤ t} ∩ {τ_n ≤ t} ∈ F_t,
that is, A ∈ F_{τ_n}. Since this is true for all n, we have F_τ ⊂ ∩_n F_{τ_n}. Thus ∩_n F_{τ_n} = F_τ.
For any random variable ξ with distribution μ we define the characteristic function φ of ξ to be
φ(t) = E e^{itξ} = ∫ e^{itx} μ(dx), t ∈ R.
Characteristic functions uniquely determine the distribution of a random variable. We will need to know that the characteristic function of a normally distributed random variable ξ with mean μ and variance σ² is
φ(t) = e^{iμt − σ²t²/2}, t ∈ R.

1.3 Martingales and Brownian Motion

A process M in R^d is called a martingale with respect to a filtration F, or an F-martingale, if M_t is integrable for each t, M is adapted to F, and E[M_t | F_s] = M_s a.s. for s ≤ t. A martingale is called square-integrable if E M_t² < ∞ for all t. A process X is said to be uniformly integrable if
lim_{r→∞} sup_{t∈T} E[|X_t|; |X_t| > r] = 0.
First we prove a general result about uniformly integrable processes.

Lemma. For p > 1, every L^p-bounded process is uniformly integrable.

Proof. Assume X is bounded in L^p, so that sup_t E(|X_t|^p) < ∞. Let p and q be such that 1/p + 1/q = 1. Then, from Hölder's inequality, we get for u > 0
E(|X_t| 1{|X_t| > u}) ≤ (E|X_t|^p)^{1/p} (P(|X_t| > u))^{1/q} ≤ (E|X_t|^p)^{1/p} (E|X_t|^p / u^p)^{1/q},
which tends to 0 as u → ∞, uniformly in t, by the L^p-bound.
Thus X is uniformly integrable.

A process M is called a local martingale if it is adapted to a filtration F and there exist optional times τ_n ↑ ∞ such that each stopped process M_{τ_n ∧ t} − M_0 is a martingale.

A Brownian motion is a continuous process B in R with independent increments, B_0 = 0, and, for all t, E B_t = 0 and Var(B_t) = t. This definition implies that B_t is normally distributed with mean 0 and variance t. A process B in R^d is called a Brownian motion if its components are independent Brownian motions in R. A Brownian motion B adapted to a general filtration F on R_+ such that the process B_{s+t} − B_s is independent of F_s for all s is said to be an F-Brownian motion.

A process X on R_+ is said to be right-continuous if X_t = X_{t+} for all t, and X has left limits if the left limits X_{t−} exist and are finite for all t. The regularization theorem of martingales allows us to assume all martingales to be right-continuous with left limits, here abbreviated as rcll. We state, without proof, the more general version of this theorem for submartingales. We follow with a result relating uniform integrability to the convergence of a martingale to a random variable. These are classical results in the study of martingales; we refer to [3] for the proofs and more detailed discussion.

Theorem. Let X be an F-submartingale. Then X has a rcll version if and only if the function t ↦ E X_t is right-continuous.

Theorem. Let M be a right-continuous F-martingale on an unbounded index set T, and define u = sup T. Then the following conditions are equivalent:
i) M is uniformly integrable,
ii) M_t converges in L¹ to some M_u as t → u,
iii) M can be extended to a martingale on T ∪ {u}.
The next result, the optional sampling theorem, shows that, under certain conditions, the martingale property is preserved under a random time change.

Theorem. Let M be a right-continuous F-martingale on R_+, and consider two optional times σ and τ, where τ is bounded. Then M_τ is integrable, and
M_{σ∧τ} = E[M_τ | F_σ] a.s.
The statement extends to unbounded times τ if and only if M is uniformly integrable.
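The conclusion of the optional sampling theorem can be checked numerically in a simple discrete setting. The sketch below (an illustration, not part of the thesis) takes a symmetric random walk, a martingale, stops it on first exit from a made-up interval (−3, 5), and verifies that the expectation of the stopped process stays at the starting value 0. The computation propagates the exact distribution, so no randomness is involved.

```python
# Exact check of optional sampling for a stopped symmetric random walk:
# the walk is a martingale, so E[M_{tau ∧ n}] = M_0 = 0 for the exit
# time tau of (-3, 5).  Interval and horizon are arbitrary choices.

def stopped_walk_expectation(a=-3, b=5, steps=200):
    # probability distribution of the stopped walk over states a..b
    probs = {x: 0.0 for x in range(a, b + 1)}
    probs[0] = 1.0
    for _ in range(steps):
        nxt = {x: 0.0 for x in range(a, b + 1)}
        for x, p in probs.items():
            if x in (a, b):        # absorbed: the walk is stopped here
                nxt[x] += p
            else:                  # fair step preserves the expectation
                nxt[x - 1] += p / 2
                nxt[x + 1] += p / 2
        probs = nxt
    return sum(x * p for x, p in probs.items())

print(abs(stopped_walk_expectation()) < 1e-9)  # True: E[M_{tau∧n}] = 0
```

Note that the boundedness of the stopping time (here enforced by the finite horizon) is exactly the hypothesis the theorem needs; without it the conclusion can fail.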
Chapter 2
Time Change of Filtrations

In this chapter, we begin by showing how we can use an increasing process X adapted to a filtration F to transform the time scale and the filtration. We will then apply this result in chapter 3 to the case where X, the increasing process, is the quadratic variation process of a continuous local martingale, and in chapter 4 to the case where the increasing process is the compensator of an increasing process related to a point process.

2.1 Time Change of Filtrations

We now state our main result, using an increasing process X adapted to a filtration F to transform the time scale and the filtration.

Theorem 2.1.1. Let X be a non-decreasing right-continuous process adapted to some right-continuous filtration F, and define
τ_s = inf{t > 0; X_t > s}, s ≥ 0.
Then
i) (τ_s) is a right-continuous process of optional times, generating a right-continuous filtration G defined by G_s = F_{τ_s} for s ≥ 0,
ii) if X is also continuous and σ is F-optional, then X_σ is G-optional and F_σ ⊂ G_{X_σ}.
Note that, when composing the process X with an optional time σ, we get a random variable X_σ. Thus it makes sense to ask whether X_σ is an optional time.

Proof. (i) Since X is right-continuous, the process (τ_s) is right-continuous as well. We want to show that τ_s is an optional time for every s. By the definition of τ_s,
{τ_s < t} ⊃ ∪_{r ∈ Q∩(0,t)} {X_r > s}, t > 0.
To prove the inclusion in the opposite direction, fix an ω ∈ {τ_s < t}. Then there is some t′ < t with X_{t′}(ω) > s. Since (s, ∞) is an open set containing X_{t′}(ω), there exists a neighborhood of X_{t′}(ω) that remains in (s, ∞). If t′ is rational we are done. If not, since X is right-continuous and Q is dense in R, there exists an r ∈ Q with t′ < r < t and X_r(ω) ∈ (s, ∞). So ω ∈ {X_r > s}, and therefore
{τ_s < t} ⊂ ∪_{r ∈ Q∩(0,t)} {X_r > s}, t > 0.
Hence
{τ_s < t} = ∪_{r ∈ Q∩(0,t)} {X_r > s} ∈ F_t, t > 0,
which means that τ_s is weakly optional, hence optional. Since (τ_s) is a process of optional times, G_s = F_{τ_s} is a filtration, and we need to show that it is right-continuous. Now
G_{s+} = ∩_{u>s} G_u = ∩_{u>s} F_{τ_u} = F_{τ_s} = G_s,
where the second and last equalities come from the fact that G_u = F_{τ_u}, and the third holds by the lemma on infima of optional times, since τ_u ↓ τ_s as u ↓ s by the right-continuity of (τ_s) and F is right-continuous.
(ii) Let X be continuous and let σ be an F-optional time. By the definition of τ_s and the fact that X is continuous and non-decreasing, we see that {X_σ ≤ s} = {σ ≤ τ_s}. Since σ and τ_s are both optional times, σ ∧ t and τ_s ∧ t are F_t-measurable for each t, and hence
{σ ≤ τ_s} ∩ {τ_s ≤ t} = {σ ≤ t} ∩ {τ_s ≤ t} ∩ {σ ∧ t ≤ τ_s ∧ t} ∈ F_t.
So {σ ≤ τ_s} ∈ F_{τ_s} by the definition of F_{τ_s}, that is, {X_σ ≤ s} ∈ G_s. Thus X_σ is a G-optional time.

Since X_σ is an optional time, G_{X_σ} is a σ-field. If we let A ∈ F_σ be arbitrary, the above argument gives
A ∩ {X_σ ≤ s} ∩ {τ_s ≤ t} = (A ∩ {σ ≤ t}) ∩ {τ_s ≤ t} ∩ {σ ∧ t ≤ τ_s ∧ t} ∈ F_t,
since A ∩ {σ ≤ t} ∈ F_t. Hence A ∩ {X_σ ≤ s} ∈ F_{τ_s} = G_s. This shows that, for any A ∈ F_σ, we also have A ∈ G_{X_σ}, and so F_σ ⊂ G_{X_σ}.
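As a concrete illustration (not part of the thesis), the time change τ_s = inf{t > 0; X_t > s} of Theorem 2.1.1 is the right-continuous generalized inverse of X, which is easy to compute for a non-decreasing path sampled on a grid; the grid and path below are made up.

```python
import bisect

# Right-continuous generalized inverse tau_s = inf{t > 0 : X_t > s}
# of a non-decreasing process sampled on a time grid.

def tau(s, times, values):
    # values is non-decreasing; find the first index with X_t > s
    i = bisect.bisect_right(values, s)
    return times[i] if i < len(times) else float("inf")

times = [0.0, 1.0, 2.0, 3.0, 4.0]
values = [0.0, 1.0, 1.0, 2.5, 4.0]   # flat stretch on [1, 2]

print(tau(0.5, times, values))  # 1.0
print(tau(1.0, times, values))  # 3.0  (skips the interval of constancy)
print(tau(5.0, times, values))  # inf  (the level is never exceeded)
```

Note how τ jumps across the interval where X is constant, and how τ_s = ∞ for levels s that X never exceeds; both features reappear in chapters 3 and 4, where X is a quadratic variation process or a compensator.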
Chapter 3
Time Change of Continuous Martingales

In order to change an arbitrary continuous local martingale into a Brownian motion, we will use a process of optional times as in Theorem 2.1.1, except that our non-decreasing process will be the quadratic variation process of the continuous local martingale. Before getting to this result, we define the quadratic variation process and state some lemmas pertaining to it. Then we will prove Lévy's theorem, which characterizes Brownian motion as a martingale. This will be used in our proof of the main result.

3.1 Quadratic Variation

For local martingales M and N, the process [M, N] is called the covariation of M and N, and the process [M, M] is called the quadratic variation; it is often denoted by [M]. The quadratic variation process can be constructed as a limit of sums of squared increments of the original process; however, we will define the process through a martingale characterization. We state, without proof, the existence theorem for the process [M, N] for continuous local martingales M and N.

Theorem. For continuous local martingales M and N there exists an a.s. unique continuous process [M, N] of locally finite variation, with [M, N]_0 = 0, such that MN − [M, N] is a local martingale.

The next result lists, without proof, several properties of the covariation process.
Theorem. Let M and N be continuous local martingales, and let [M, N] be the covariation process defined above. Then [M, N] is a.s. bilinear and symmetric, and satisfies [M, N] = [M − M_0, N − N_0]. Further, [M] is a.s. non-decreasing and, for any optional time τ,
[M^τ, N] = [M^τ, N^τ] = [M, N]^τ a.s.

The next result shows that a local martingale has the same intervals of constancy as its quadratic variation process.

Lemma 3.1.1. Let M be a continuous local F-martingale, and fix any s < t. Then [M]_s = [M]_t if and only if a.s. M_u = M_s for all u ∈ [s, t].

Proof. First assume that [M]_s = [M]_t. Then [M]_s = [M]_u for all s < u ≤ t, since the quadratic variation is non-decreasing. Now
σ = inf{w > s; [M]_w > [M]_s}
is an F-optional time, and N_r = M_{σ∧(s+r)} − M_s, r ≥ 0, is a continuous local martingale with respect to the filtration F̂_r = F_{s+r}, r ≥ 0. By the definition of σ,
[N]_r = [M]_{σ∧(s+r)} − [M]_s = 0, r ≥ 0.
Since N is a local martingale, there exists a sequence of optional times ρ_n ↑ ∞ a.s. such that each stopped process N_{ρ_n∧t} is a true martingale. Now [N]_{ρ_n∧r} = 0 a.s., and so
E(N²_{ρ_n∧r}) = E(N²_{ρ_n∧r} − [N]_{ρ_n∧r}).
Since N²_{ρ_n∧r} − [N]_{ρ_n∧r} is a martingale starting at 0, E(N²_{ρ_n∧r} − [N]_{ρ_n∧r}) = 0. So E(N²_{ρ_n∧r}) = 0, and thus N_{ρ_n∧r} = 0 a.s. Letting ρ_n → ∞, we get N = 0 a.s. Thus M_{σ∧(s+r)} = M_s for all r, and since σ ≥ t by the constancy of [M] on [s, t], we get M_u = M_s for all u ∈ [s, t].

To prove the converse, assume M_u = M_s for all u ∈ [s, t]. Then
τ = inf{w > s; M_w ≠ M_s}
is an optional time, and N_r = M_{τ∧(s+r)} − M_s is a continuous local martingale with respect to F̂ defined by F̂_r = F_{s+r}. By the definition of τ, N_r = M_{τ∧(s+r)} − M_s = 0. Let ρ_n be a sequence of optional times such that ρ_n ↑ ∞ and N_{ρ_n∧r} is a martingale. Then N²_{ρ_n∧r} − [N]_{ρ_n∧r} is a martingale and E[N²_{ρ_n∧r} − [N]_{ρ_n∧r}] = 0. Since N_r = 0, we have E[N]_{ρ_n∧r} = 0. Letting ρ_n → ∞, we have E[N]_r = 0, which gives us [M_{τ∧(s+r)} − M_s] = 0 a.s. And so we have [M]_s = [M]_t.

3.2 Stochastic Integration

We now introduce the concept of stochastic integration. We will start by defining an elementary stochastic integral as a sum of random variables. Let τ_k be optional
times and ξ_k be bounded F_{τ_k}-measurable random variables, and define
V_t = Σ_{k≤n} ξ_k 1{t > τ_k}, t ≥ 0.
Then for any process X, we may define the integral process V·X by
(V·X)_t = ∫_0^t V_s dX_s = Σ_{k≤n} ξ_k (X_t − X_{t∧τ_k}).
We call the process V·X an elementary stochastic integral. A process V is said to be progressively measurable, or simply progressive, if its restriction to Ω × [0, t] is F_t ⊗ B[0, t]-measurable for every t. Originally, stochastic integrals were extended to progressive processes using an approximation by the elementary stochastic integrals defined above. In the following theorem, however, we extend the notion of stochastic integral by a martingale characterization.

Theorem. Let M be a continuous local martingale and V a progressive process such that (V² · [M])_t < ∞ a.s. for every t > 0. Then there exists an a.s. unique continuous local martingale V·M with (V·M)_0 = 0 and such that [V·M, N] = V·[M, N] a.s. for every continuous local martingale N.

Since the covariation has locally finite variation, the integral V·[M, N] is a Lebesgue–Stieltjes integral. This allows us to characterize the stochastic integral uniquely in terms of a Lebesgue–Stieltjes integral. We omit the proof of this theorem but refer to [3] for the proof and a more detailed discussion of stochastic integrals.
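A quick numerical sketch (illustrative only; grid, times τ_k, and integrand are all made up) of the elementary stochastic integral above, together with the sum-of-squares construction of the quadratic variation mentioned in Section 3.1:

```python
import random

random.seed(1)

# Simulate a Brownian path on a fine grid, then
#  (1) form an elementary stochastic integral (V.B)_T for an integrand
#      V_t = sum_k xi_k 1{t > tau_k} with bounded xi_k, and
#  (2) approximate the quadratic variation [B]_T by the sum of squared
#      increments, which should be close to T.

n, T = 200_000, 1.0
dt = T / n
B = [0.0]
for _ in range(n):
    B.append(B[-1] + random.gauss(0.0, dt ** 0.5))

# (1) deterministic times tau_k = k/4, xi_k = B_{tau_k} clamped to [-1, 1]
#     (bounded and known at time tau_k, as the definition requires):
#     (V.B)_T = sum_k xi_k (B_T - B_{T ∧ tau_k}).
ks = [int(k * n / 4) for k in range(4)]
elem_integral = sum(max(-1.0, min(1.0, B[i])) * (B[-1] - B[i]) for i in ks)

# (2) realized quadratic variation over the grid.
qv = sum((B[i + 1] - B[i]) ** 2 for i in range(n))
print(abs(qv - T) < 0.05)  # True with overwhelming probability
```

The elementary integral is a martingale transform, so its expectation is 0; the realized quadratic variation concentrates around T, consistent with [B]_t = t for Brownian motion.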
A continuous process X is said to be a semimartingale if it can be represented as a sum M + A, where M is a continuous local martingale and A is a continuous, adapted process with locally finite variation and A_0 = 0. If X is a semimartingale and f is a sufficiently smooth function, then f(X) is also a semimartingale. The following result gives a useful representation of semimartingales that are images of smooth functions. We state, without proof, Itô's formula for continuous semimartingales. Here f_i and f_ij denote the first and second partial derivatives of f.

Theorem 3.2.3. Let X = (X¹, ..., X^d) be a continuous semimartingale in R^d and let f be twice continuously differentiable on R^d. Then, a.s.,
f(X_t) = f(X_0) + Σ_i ∫_0^t f_i(X_s) dX_s^i + ½ Σ_{i,j} ∫_0^t f_ij(X_s) d[X^i, X^j]_s.

We can extend Itô's formula to analytic functions.

Theorem. If f is an analytic function on a domain D ⊂ C, then
f(Z_t) = f(Z_0) + ∫_0^t f′(Z_s) dZ_s + ½ ∫_0^t f″(Z_s) d[Z]_s a.s.
holds for any D-valued semimartingale Z.

3.3 Brownian Motion as a Martingale

In this section we prove the following result, due to Lévy, which characterizes Brownian motion as a martingale.

Theorem. Let B be a continuous process in R with B_0 = 0. Then B is a local F-martingale with [B]_t = t a.s. if and only if B is an F-Brownian motion.

Before we begin the proof of the theorem, we prove a needed result.
Lemma 3.3.1. Let M be a continuous local martingale starting at 0 with [M]_t = t a.s. Then M is a square-integrable martingale.

Proof. Let ρ_n be optional times such that ρ_n ↑ ∞ and M_{ρ_n∧t} is a true martingale for every n. Then N_t = M²_{ρ_n∧t} − [M]_{ρ_n∧t} is a martingale for every n, and
E M²_{ρ_n∧t} = E[M]_{ρ_n∧t} = E(ρ_n ∧ t).
Using dominated and monotone convergence, we can let ρ_n → ∞ to get E M_t² = t. Thus M_t² − [M]_t is a true martingale and M is a square-integrable martingale.

Now we move on to the proof of Lévy's theorem.

Proof. First assume that B is a continuous local F-martingale with [B]_t = t a.s. and B_0 = 0. Recalling the definition of Brownian motion and the characteristic function of a normally distributed random variable, it is enough to prove that, for every fixed set A ∈ F_s,
E(1_A e^{iv(B_t − B_s)}) = P(A) e^{−v²(t−s)/2}
for v ∈ R and t > s. Let f(x) = e^{ivx}; then, applying Theorem 3.2.3, we get
e^{ivB_t} − e^{ivB_s} = ∫_s^t iv e^{ivB_u} dB_u − ½ ∫_s^t v² e^{ivB_u} du.   (3.1)
Now [B]_t = t implies that B is a true martingale, by Lemma 3.3.1, and so
E[∫_s^t e^{ivB_u} dB_u | F_s] = 0 a.s.   (3.2)
Let A ∈ F_s and multiply equation (3.1) by e^{−ivB_s} 1_A on both sides to obtain
1_A e^{iv(B_t − B_s)} − 1_A = ∫_s^t iv 1_A e^{iv(B_u − B_s)} dB_u − ½ ∫_s^t v² 1_A e^{iv(B_u − B_s)} du.
Taking the expectation of both sides and recalling (3.2), we have
E(1_A e^{iv(B_t − B_s)}) − P(A) = −½ v² E ∫_s^t 1_A e^{iv(B_u − B_s)} du.
This is a Volterra integral equation of the second kind for the deterministic function t ↦ E(1_A e^{iv(B_t − B_s)}). Solving this integral equation, we get
E(1_A e^{iv(B_t − B_s)}) = P(A) e^{−v²(t−s)/2}.
To prove the converse, we assume that B is an F-Brownian motion. To show that B is a martingale, let s ≤ t; then
E[B_t | F_s] = E[B_s + (B_t − B_s) | F_s] = B_s + E(B_t − B_s) = B_s,
since the increment B_t − B_s is independent of F_s with mean 0.

3.4 Time Change of Continuous Martingales

We now show how we can use a process of optional times to change an arbitrary continuous local martingale into a Brownian motion. To do this in the general case, we consider extensions of the probability space and of the filtration. Let X be a process adapted to the filtration F on a probability space (Ω, A, P), and suppose we wish to find a Brownian motion B independent of X. In order to guarantee
that the processes in question are independent and still retain any original adaptedness properties, we extend the probability space. Let Ω̂ = Ω × [0, 1], Â = A ⊗ B[0, 1], and P̂ = P ⊗ λ[0,1]; then (Ω̂, Â, P̂) is an extension of the probability space. If we let X depend only on the first coordinate and construct B from the auxiliary coordinate, then X and B are trivially independent. A subtler way to achieve the same goal is to take a standard extension of a filtration. We call a filtration G a standard extension of F if F_t ⊂ G_t for all t and if G_t and F_∞ are conditionally independent given F_t for all t. Now we state the main theorem.

Theorem 3.4.1. Let M be a continuous local F-martingale in R with M_0 = 0, and define
τ_s = inf{t ≥ 0; [M]_t > s}, G_s = F_{τ_s}, s ≥ 0.
Then there exists in R a Brownian motion B with respect to a standard extension of G, such that a.s. B = M ∘ τ on [0, [M]_∞) and M = B ∘ [M].

We will break the proof into two cases: first the case when [M]_∞ = ∞, and second when [M]_∞ is finite. If [M]_∞ = ∞ we do not require a standard extension of the filtration for M ∘ τ to be a Brownian motion.

Proof. First assume that [M]_∞ = ∞. By Theorem 2.1.1, (τ_s) is a right-continuous process of optional times and G_s = F_{τ_s} is a right-continuous filtration. To prove that B = M ∘ τ is a Brownian motion, we will use Lévy's characterization of Brownian motion. Thus we need to show that B is a continuous square-integrable martingale and [B]_t = t a.s.
First we prove that B is a continuous square-integrable martingale. For fixed s, the process M̂_t = M_{τ_s∧t} is a true martingale, and [M̂]_t ≤ [M]_{τ_s} = s for all t, by the definition of τ_s. Because E M̂_t² = E[M̂]_t ≤ s, the lemma on L^p-bounded processes shows that M̂ and M̂² − [M̂] are uniformly integrable. This allows us to use the optional sampling theorem. Fix r ≤ s. Then
E(M_{τ_s} − M_{τ_r} | F_{τ_r}) = E(M̂_{τ_s} − M̂_{τ_r} | F_{τ_r}) = M̂_{τ_r} − M̂_{τ_r} = 0.
Recall that M̂ is a true martingale starting at zero, so that M̂² − [M̂] is a uniformly integrable martingale starting at 0. Applying optional sampling to it, and using [M]_{τ_s} = s,
E((M_{τ_s} − M_{τ_r})² | F_{τ_r}) = E(M̂²_{τ_s} − M̂²_{τ_r} | F_{τ_r}) = E([M]_{τ_s} − [M]_{τ_r} | F_{τ_r}) = s − r.
Thus B is a square-integrable martingale with [B]_s = s. Next we want to prove that B is continuous. Referring to Lemma 3.1.1, we see that, for any s < t, [M]_s = [M]_t implies M_u = M_t for all u ∈ [s, t]. This property, along with the fact that B = M ∘ τ is right-continuous, proves that B is continuous. We have now shown that B is a square-integrable, continuous martingale with [B]_t = t a.s., and so, by Lévy's characterization of Brownian motion, B is a Brownian motion.
To prove the second assertion, M_t = B_{[M]_t}, we use the fact that B = M ∘ τ and τ_{[M]_t} = t, by the definition of τ_s. Therefore we can conclude that M_t = M_{τ_{[M]_t}} = B_{[M]_t}.

Now we allow [M]_∞ to be finite. Write Q = [M]_∞ < ∞. Letting s be fixed and M̂_t = M_{τ_s∧t}, we have [M̂]_t ≤ [M]_{τ_s} = s ∧ Q. This allows us to use Lemma 3.1.1 and the optional sampling theorem, just as before, to conclude that M ∘ τ is a continuous martingale. Let X be a Brownian motion independent of F, with induced filtration X. Now let H_t = σ{G_t, X_t}; then H is a standard extension of both X and G. And so M_{τ_s} is an H-martingale, X is an H-Brownian motion, and they are independent. Define
B_s = M_{τ_s} + ∫_0^s 1{τ_r = ∞} dX_r, s ≥ 0,
and let N_s = ∫_0^s 1{τ_r = ∞} dX_r, s ≥ 0. Since [M] is non-decreasing, τ_s is non-decreasing. Now τ_r < ∞ for r < Q, and so 1{τ_r = ∞} = 0 there. This means that [N]_r = 0 for all r < Q. For s > Q we have 1{τ_r = ∞} = 1 for all r ∈ [Q, s], and so for every s > Q,
[N]_s = [X]_s − [X]_Q = s − Q = s − [M]_∞.
So if s < Q, we have
[B]_s = [M]_{τ_s} = s,
and if s ≥ Q, we have
[B]_s = [M]_{τ_s} + [N]_s = Q + [X]_s − [X]_Q = Q + s − Q = s.
Therefore [B]_s = s. We conclude again that B is a Brownian motion, and B_s = M_{τ_s} for all s < Q = [M]_∞. Now we show that M = B ∘ [M]. If [M]_t = [M]_∞ for some t < ∞, then by Lemma 3.1.1, M_u = M_t for all u ≥ t. Thus we can use the same argument as before to obtain M_t = M_{τ_{[M]_t}} = B_{[M]_t}.

3.5 Time Change of Continuous Martingales in Higher Dimensions

To extend our result to higher dimensions we discuss two approaches. First we define a continuous local martingale M = (M¹, ..., M^d) to be isotropic if [M^i] = [M^j] a.s. for all i, j ∈ {1, ..., d} and if [M^i, M^j] = 0 a.s. for all i ≠ j. We then have a result similar to Theorem 3.4.1 for isotropic local martingales.

Theorem. Let M be an isotropic continuous local F-martingale starting at 0. Define
τ_s = inf{t ≥ 0; [M¹]_t > s}, G_s = F_{τ_s}, s ≥ 0.
Then there exists a Brownian motion B in R^d with respect to a standard extension of G, such that B = M ∘ τ a.s. on [0, [M¹]_∞) and M = B ∘ [M¹] a.s.
We omit the proof; given the isotropy condition, it is very similar to that of the one-dimensional case. It is important to note that here we only needed a single time change process to transform our local martingale. Our next result uses a weaker assumption but also has a weaker conclusion; it gives another way to extend Theorem 3.4.1 to higher dimensions. We define continuous local martingales M¹, ..., M^d to be strongly orthogonal if [M^i, M^j] = 0 a.s. for all i, j ∈ {1, ..., d} with i ≠ j. Under the weaker assumption of strong orthogonality, we must use an individual process of optional times to transform each component of the local martingale into a Brownian motion.

Theorem. Let M¹, ..., M^d be strongly orthogonal continuous local martingales starting at zero, and define
τ^i_s = inf{t ≥ 0; [M^i]_t > s}, s ≥ 0, 1 ≤ i ≤ d,
where each τ^i_s is an optional time. Then the processes B^i_s = M^i_{τ^i_s}, s ≥ 0, 1 ≤ i ≤ d, are independent one-dimensional Brownian motions.

By our proof of the one-dimensional case, the individual components are transformed into Brownian motions. However, we need these one-dimensional Brownian motions to be independent in order to combine them into a Brownian motion in R^d. This can be achieved by looking at the filtrations induced by the Brownian motions themselves rather than the filtrations F_{τ^i_s}. We omit the proof of this theorem; a full proof can be found in [4].
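The one-dimensional result can be checked by simulation. The sketch below (illustrative only; the integrand σ_u = 1 + u is an arbitrary made-up choice) simulates M_t = ∫_0^t σ_u dB_u, whose quadratic variation is [M]_t = ∫_0^t σ_u² du, evaluates M at the first time [M] crosses a fixed level s, and verifies by Monte Carlo that the time-changed value M_{τ_s} has the N(0, s) law a Brownian motion would give.

```python
import random

random.seed(7)

# Monte Carlo check of Theorem 3.4.1: for M_t = ∫ sigma_u dB_u with
# sigma_u = 1 + u, the time-changed value B'_s = M_{tau_s}, where
# tau_s = inf{t : [M]_t > s}, should be distributed N(0, s).

def sample_time_changed(s_target=1.0, dt=1e-3):
    M, qv, t = 0.0, 0.0, 0.0
    while qv <= s_target:          # run the path until [M]_t exceeds s
        sigma = 1.0 + t
        M += sigma * random.gauss(0.0, dt ** 0.5)
        qv += sigma * sigma * dt
        t += dt
    return M                       # ≈ B'_{s_target}

vals = [sample_time_changed() for _ in range(4000)]
mean = sum(vals) / len(vals)
var = sum((v - mean) ** 2 for v in vals) / len(vals)
print(abs(mean) < 0.1 and abs(var - 1.0) < 0.15)  # True: B'_1 ~ N(0,1)
```

Note that the random amount of clock time needed to reach quadratic variation s varies from path to path; it is precisely this random reparametrization that turns M into a process with Brownian increments.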
Chapter 4
Time Change of Point Processes

The main result of this chapter, similar to Theorem 3.4.1, shows that a random time change can be used to transform a point process into a Poisson process. To do this, we introduce some more notation and definitions.

4.1 Random Measures and Point Processes

Let (Ω, A) be a probability space and (S, S) a measurable space. A random measure ξ on S is defined as a mapping ξ: Ω × S → R̄_+ such that ξ(ω, B) is an A-measurable random variable for fixed B ∈ S and a locally finite measure for fixed ω ∈ Ω. We define a point process as a random measure ξ on R^d such that ξB is integer-valued for every bounded Borel set B. For a stationary random measure ξ on R we have Eξ = cλ, where c ≥ 0 and λ is Lebesgue measure; Eξ is called the intensity measure of ξ and c the rate. Define M(S) to be the space of all σ-finite measures on a measurable space S. A Poisson process ξ with intensity μ ∈ M(R^d) is defined to be a point process with independent increments such that ξB is a Poisson random variable with mean μB whenever μB < ∞. A point process ξ with ξ{s} ≤ 1 for all s ∈ R^d, outside a fixed P-null set, is called simple, and a Poisson process has unit rate if its rate equals one. We now assume that the underlying filtration is not only right-continuous but also complete, and let (S, S) be a Borel space. The predictable σ-field P in the product space Ω × R_+ is defined as the σ-field generated by all continuous adapted processes on R_+. A process V on R_+ × S is predictable if it
is P ⊗ S-measurable, where P denotes the predictable σ-field in Ω × R_+. We mention, without proof, that the predictable σ-field is also generated by all left-continuous adapted processes, and that every predictable process is progressive.

4.2 Doob-Meyer Decomposition

Another concept needed for our main result is the compensator process. First we define compensators in relation to the Doob-Meyer decomposition of submartingales, and then extend the notion to random measures.

Theorem (Doob-Meyer decomposition). Any local submartingale X has an a.s. unique decomposition X = M + A, where M is a local martingale and A is a locally integrable, non-decreasing, predictable process starting at 0.

The proof is omitted, since it is very involved and would distract from the main topic of time change; we refer to [3] for a detailed proof. The process A in the above theorem is called the compensator of the submartingale X. We want to extend compensators to random measures. Let ξ be a random measure on R_+ and introduce the associated cumulative process N_t(ω) = ξ((0, t], ω). The process N has right-continuous, a.s. non-decreasing paths, and N is a submartingale. We can therefore apply the Doob-Meyer decomposition to N to get its compensator A, which is also the cumulative process of a random measure. We will use compensators much as the quadratic variation process was used in Theorem 3.4.1, to define our process of optional times.

4.3 Time Change of Point Processes

We now move on to prove our main result, that a process of optional times can be used to transform a point process into a Poisson process. Before stating the main
result, we need several important theorems. This approach is from [1]. The first of these uses only some basic analysis; however, we will soon relate it to probability.

Theorem 4.3.1. Let f be an increasing, right-continuous function on R_+ with f(0) = 0, and let u be a measurable function with ∫_0^t |u(s)| df(s) < ∞ for each t > 0. Let Δf(t) = f(t) − f(t−) and f^c(t) = f(t) − Σ_{s≤t} Δf(s). Then the integral equation
h(t) = h(0) + ∫_0^t h(s−) u(s) df(s)
has the unique solution
h(t) = h(0) ∏_{0<s≤t} (1 + u(s)Δf(s)) exp(∫_0^t u(s) df^c(s)),
satisfying sup_{s≤t} |h(s)| < ∞ for each t.

Proof. Let
g_1(t) = h(0) ∏_{0<s≤t} (1 + u(s)Δf(s))
and
g_2(t) = exp(∫_0^t u(s) df^c(s)).
Now g_1 and g_2 are right-continuous and have bounded variation, so we can use integration by parts to get
h(t) = g_1(t)g_2(t) = g_1(0)g_2(0) + ∫_0^t g_1(s−) dg_2(s) + ∫_0^t g_2(s) dg_1(s).
By the definition of g_2 we have
∫_0^t g_1(s−) dg_2(s) = ∫_0^t g_1(s−) exp(∫_0^s u(r) df^c(r)) u(s) df^c(s) = ∫_0^t g_1(s−) g_2(s) u(s) df^c(s).
If there is no jump in f at the point s, then Δg_1(s) = 0. If at time s there is a jump in f, then
Δg_1(s) = g_1(s) − g_1(s−) = (1 + u(s)Δf(s)) g_1(s−) − g_1(s−) = u(s)Δf(s) g_1(s−).
And so we have
∫_0^t g_2(s) dg_1(s) = Σ_{0<s≤t} g_2(s) u(s) g_1(s−) Δf(s).
Putting this together, and using that g_2 is continuous so that g_1(s−)g_2(s) = h(s−) off the jumps of f,
h(t) = g_1(0)g_2(0) + ∫_0^t g_1(s−) dg_2(s) + ∫_0^t g_2(s) dg_1(s) = h(0) + ∫_0^t h(s−) u(s) df(s).
So $h$ is a solution of the given integral equation.

Now we apply this theorem to our next result, giving conditions for a simple point process to be Poisson. Recall that by the cumulative process $N$ of a random measure $\xi$ on $\mathbb{R}_+$ we mean $N_t = \xi(0,t]$.

Theorem 4.3.2 Let $N$ be the cumulative process of a simple point process with compensator $A_t = \mu(0,t]$, where $\mu$ is a $\sigma$-finite measure. Then $N$ is the cumulative process of a Poisson process with intensity measure $\mu$.

Proof. Fix $\theta$ and define

$$M_t = \exp\{i\theta N_t + (1 - e^{i\theta})A_t\}.$$

Referring to Theorem 4.3.1, we see that $M$ is the solution of the integral equation

$$M_t = 1 + \int_0^t M_{s-}(e^{i\theta} - 1)\,d[N_s - A_s].$$

The integrand on the right is left-continuous and adapted, hence predictable. Since $N - A$ is a martingale, an earlier lemma shows that the integral term is a martingale starting at $0$. Taking expectations on both sides therefore gives $E[M_t] = 1$. Replacing $M_t$ by its definition and using the fact that $A$ is deterministic, we have

$$E[\exp\{i\theta N_t + (1 - e^{i\theta})A_t\}] = \exp\{(1 - e^{i\theta})A_t\}\,E[e^{i\theta N_t}] = 1.$$
Dividing by the exponential factor in $A$, we get

$$E[e^{i\theta N_t}] = \exp\{(e^{i\theta} - 1)A_t\},$$

which is the characteristic function of a Poisson distribution with mean $A_t$. Repeating the argument, we obtain, for $0 \le r < t$,

$$E[e^{i\theta(N_t - N_r)} \mid \mathcal{F}_r] = \exp\{(e^{i\theta} - 1)(A_t - A_r)\},$$

which shows that $N$ has independent, Poisson distributed increments and is therefore the cumulative process associated with a Poisson process.

We now state the main result of the section, showing that a time-changed cumulative process of a point process is the cumulative process of a Poisson process.

Theorem Let $\xi$ be an $\mathcal{F}$-adapted simple point process with $N_t = \xi(0,t]$, and let $A$ be the compensator of $N$. Assume $A$ is continuous and a.s. unbounded, and define

$$\tau_s = \inf\{t \ge 0;\ A_t > s\}, \qquad s \ge 0.$$

Then the rescaled process satisfies $N_{\tau_s} = \eta(0,s]$, where $\eta$ is a unit-rate Poisson process.

Proof. Referring back to Theorem 2.1.1, we see that $\tau_s$ is right-continuous and $N_{\tau_s}$ is $\mathcal{F}_{\tau_s}$-adapted. Being nondecreasing and right-continuous with left limits, $\tau$ can have at most countably many jumps, and these occur exactly where $A$ is constant. By the definitions of $N$ and $A$, the process $N_{\tau_s}$ can only increase by integer-valued jumps. Suppose now that $A$ is constant over an interval $(a, b]$. By the martingale property of compensators,

$$E[N_b - N_a \mid \mathcal{F}_a] = A_b - A_a = 0.$$
So, given $\mathcal{F}_a$, we have $N_b - N_a = 0$ a.s. Thus $N_\tau$ has no jumps at the times where $\tau$ is discontinuous. Since $N$ is simple, where $\tau$ is continuous the process $N_{\tau_s}$ can only increase by unit jumps. Therefore $N_\tau$ is simple.

Referring to Theorem 4.3.2, it remains only to show that $N_{\tau_s}$ has compensator $s$. By definition, $A_{\tau_s} = s$ for all $s$. Recalling that $\tau_s$ is an optional time for each $s$, we can apply the optional sampling theorem: for $s \le t$,

$$E[N_{\tau_t} - t \mid \mathcal{F}_{\tau_s}] = E[N_{\tau_t} - A_{\tau_t} \mid \mathcal{F}_{\tau_s}] = N_{\tau_s} - A_{\tau_s} = N_{\tau_s} - s.$$

So $N_{\tau_s} - s$ is an $\mathcal{F}_{\tau_s}$-martingale, and by the uniqueness of the compensator, $s$ is the compensator of $N_{\tau_s}$.
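As a numerical illustration of this theorem, the following sketch simulates an inhomogeneous Poisson process, whose compensator $A_t = \int_0^t \lambda(u)\,du$ is continuous and unbounded, and maps each arrival time through $A$. By the theorem, the transformed points form a unit-rate Poisson process, so their gaps should look like i.i.d. standard exponential variables. The rate function $\lambda(t) = 2 + \sin t$, the horizon, and the seed are illustrative choices, not taken from the thesis.

```python
import math
import random

random.seed(42)

# Illustrative rate function and its compensator A(t) = \int_0^t lam(u) du.
def lam(t):
    return 2.0 + math.sin(t)

def A(t):
    return 2.0 * t + 1.0 - math.cos(t)

T, lam_max = 2000.0, 3.0

# Simulate the inhomogeneous Poisson process by thinning (Lewis-Shedler):
# propose candidate points at rate lam_max and keep each candidate at
# time t with probability lam(t) / lam_max.
times, t = [], 0.0
while True:
    t += random.expovariate(lam_max)
    if t > T:
        break
    if random.random() < lam(t) / lam_max:
        times.append(t)

# Time-change every arrival through the compensator; the gaps of the
# transformed points should behave like i.i.d. Exp(1) variables.
s = [A(u) for u in times]
gaps = [b - a for a, b in zip(s, s[1:])]
mean_gap = sum(gaps) / len(gaps)
print(len(gaps), round(mean_gap, 3))  # mean gap close to 1
```

Here the compensator is deterministic, so Theorem 4.3.2 already applies directly; the point of the theorem is that the same transformation works path by path when the continuous compensator is random.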
Chapter 5

Application of Time Change to Stochastic Differential Equations

In this last chapter we discuss an application of the previous ideas on random time change to the area of stochastic differential equations (SDEs). First we define stochastic differential equations and some basic related concepts. Then we discuss the concept of Brownian local time. Lastly, we construct solutions of certain SDEs using optional times, proving Engelbert and Schmidt's necessary and sufficient conditions for such solutions to exist.

5.1 Stochastic Differential Equations

Our theorems involve stochastic differential equations, abbreviated SDEs, of the basic form

$$dX_t = \sigma(X_t)\,dB_t + b(X_t)\,dt, \qquad (5.1)$$

where $B$ is a one-dimensional Brownian motion and $\sigma$ and $b$ are measurable functions on $\mathbb{R}$. For our purposes we define stochastic differential equations only in the one-dimensional case, but the concept extends to higher dimensions; we refer to [2] for more information on general SDEs.

We define a weak solution of the stochastic differential equation with initial distribution $\mu$ to consist of a process $X$, a probability space $(\Omega, \mathcal{F}, P)$, a Brownian motion $B$, and a random variable $\xi$ with $\mathcal{L}(\xi) = \mu$, such that $X$ satisfies (5.1) on $(\Omega, \mathcal{F}, P)$ with this $B$ and with $X_0 = \xi$. Weak existence holds for a stochastic differential equation provided there is a weak solution to the SDE. Uniqueness in
law means that any two weak solutions with initial distribution $\mu$ have the same distribution.

It is often possible to remove the drift term from the above SDE, by either a change of the underlying probability measure or a change of the state space. In this way we can reduce our SDE to

$$dX_t = \sigma(X_t)\,dB_t.$$

Using this SDE without a drift term, it is possible to construct weak solutions by random time change; we will do so after introducing Brownian local time. For further discussion and proofs on removing the drift term, we refer to [2].

5.2 Brownian Local Time

Let $B$ be a Brownian motion and $x \in \mathbb{R}$. To gain information about the time a path of $B$ spends near $x$, we cannot simply look at the set $\{t \ge 0;\ B_t(\omega) = x\}$, since this set has Lebesgue measure zero. Instead we introduce the process $L$.

Theorem Let $B$ be a Brownian motion. Then there exists an a.s. jointly continuous process $L^x_t$ on $\mathbb{R}_+ \times \mathbb{R}$ such that, for every Borel set $A$ of $\mathbb{R}$ and every $t \ge 0$,

$$\int_0^t 1\{B_s \in A\}\,ds = \int_A L^x_t\,dx.$$

The process $L$ defined in the theorem above is called the local time of the Brownian motion $B$. We can also represent the local time at a point $x \in \mathbb{R}$ of any
semimartingale $X$ by the following formula, due to Tanaka:

$$L^x_t = |X_t - x| - |X_0 - x| - \int_0^t \mathrm{sgn}(X_s - x)\,dX_s, \qquad t \ge 0,$$

where

$$\mathrm{sgn}(x) = \begin{cases} 1, & x > 0, \\ -1, & x \le 0. \end{cases}$$

Next we define a nondecreasing, measurable, adapted process $A$ in $\mathbb{R}$ to be a continuous additive functional if, for every $x \in \mathbb{R}$,

$$A_{t+s} = A_s + A_t \circ \theta_s \quad \text{a.s.}, \qquad s, t \ge 0,$$

where $\theta_s$ denotes the shift operator by $s$. We now state, without proof, the relationship between continuous additive functionals of Brownian motion and the local time of Brownian motion.

Theorem For a Brownian motion $X$ in $\mathbb{R}$ with local time $L$, a process $A$ is a continuous additive functional of $X$ iff it has the a.s. representation

$$A_t = \int L^x_t\,\nu(dx), \qquad t \ge 0,$$

for some locally finite measure $\nu$ on $\mathbb{R}$. Refer to [3] for more information about continuous additive functionals and the local time of Brownian motion.
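The two descriptions of local time given above, the occupation-density formula and Tanaka's formula, can be compared numerically on a single discretized Brownian path. The sketch below is only an illustration: the step count, the window width `eps`, and the seed are arbitrary choices, and both computed quantities are discrete approximations of $L^0_1$.

```python
import random

random.seed(0)

# One discretized Brownian path on [0, 1].
n, t = 200_000, 1.0
dt = t / n
B = [0.0]
for _ in range(n):
    B.append(B[-1] + random.gauss(0.0, dt ** 0.5))

# Occupation-density estimate of the local time at 0:
#   L^0_t ~ (1 / (2 eps)) * \int_0^t 1{|B_s| < eps} ds.
eps = 0.01
occ = sum(dt for x in B[:-1] if abs(x) < eps) / (2 * eps)

# Tanaka's formula for the same quantity:
#   L^0_t = |B_t| - |B_0| - \int_0^t sgn(B_s) dB_s,
# with the stochastic integral discretized at left endpoints.
sgn = lambda x: 1.0 if x > 0 else -1.0
stoch_int = sum(sgn(B[i]) * (B[i + 1] - B[i]) for i in range(n))
tanaka = abs(B[-1]) - abs(B[0]) - stoch_int

print(round(occ, 3), round(tanaka, 3))  # two estimates of the same local time
```

Both numbers approximate the same pathwise quantity, so they should roughly agree; refining the grid and shrinking `eps` together tightens the match.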
5.3 Application of Time Change to SDEs

In this section we use random time change to construct weak solutions of SDEs in the one-dimensional case,

$$dX_t = \sigma(X_t)\,dB_t, \qquad (5.2)$$

with initial distribution $\mu$. We first give an informal construction of weak solutions of (5.2). Let $Y$ be a Brownian motion with respect to some filtration $\mathcal{F}$, and let $X_0$ be an $\mathcal{F}_0$-measurable random variable with distribution $\mu$. Consider the continuous process $Z_t = X_0 + Y_t$, $t \ge 0$. Using $Z$, we create the nondecreasing process

$$\rho_t = \int_0^t \sigma^{-2}(Z_s)\,ds, \qquad t \ge 0,$$

and its inverse process

$$\tau_s = \inf\{t \ge 0;\ \rho_t > s\}, \qquad s \ge 0.$$

Referring to Theorem 2.1.1, we see that $\tau$ is a process of optional times. Now, for $X_s = Z_{\tau_s}$ with filtration $\mathcal{G}_s = \mathcal{F}_{\tau_s}$, we can find a Brownian motion $B$ with respect to $\mathcal{G}$ such that they form a weak solution of $dX_t = \sigma(X_t)\,dB_t$ with initial distribution $\mu$.

Problems with this construction can occur, depending on the measurable function $\sigma$, and we have not yet described how the Brownian motion $B$ is found. To answer these questions, we will carry out this construction formally to prove Engelbert and
Schmidt's theorem, which gives the exact conditions $\sigma$ must satisfy for a weak solution to exist.

In the following proofs we drop the condition that a Brownian motion $B$ must satisfy $B_0 = 0$. This allows our Brownian motion to have initial distribution $\mu$ and removes the need for the random variable $X_0$ in the above construction.

Theorem 5.3.1 The SDE $dX_t = \sigma(X_t)\,dB_t$ has a weak solution for every initial distribution $\mu$ if and only if $I(\sigma) \subseteq Z(\sigma)$, where

$$I(\sigma) = \left\{x \in \mathbb{R};\ \lim_{\epsilon \to 0} \int_{x-\epsilon}^{x+\epsilon} \frac{dy}{\sigma^2(y)} = \infty\right\}, \qquad (5.3)$$

and

$$Z(\sigma) = \{x \in \mathbb{R};\ \sigma(x) = 0\}. \qquad (5.4)$$

First we prove a lemma relating additive functionals built from Brownian local time to the behavior of the underlying measure near the points visited by the Brownian path.

Lemma 5.3.1 Let $L$ be the local time of a Brownian motion $B$ with arbitrary initial distribution, and let $\nu$ be a measure on $\mathbb{R}$. Define

$$A_t = \int L^x_t\,\nu(dx), \qquad t \ge 0,$$

and

$$S_\nu = \left\{x \in \mathbb{R};\ \lim_{\epsilon \to 0} \nu(x - \epsilon,\, x + \epsilon) = \infty\right\}.$$

Then a.s.

$$\inf\{s \ge 0;\ A_s = \infty\} = \inf\{s \ge 0;\ B_s \in S_\nu\}.$$
Proof. Let $t > 0$, and let $R$ be the event that $B_s \notin S_\nu$ on $[0,t]$. Note that $L^x_t = 0$ a.s. for $x$ outside the range of $B$ on $[0,t]$. Since that range is compact, being the continuous image of a compact interval, and since $L^x_t$ is a.s. continuous, hence bounded, in $x$, we get a.s. on $R$

$$A_t = \int L^x_t\,\nu(dx) \le \nu(B[0,t])\,\sup_x L^x_t < \infty.$$

Conversely, assume that $B_s \in S_\nu$ for some $s < t$. If $\tau = \inf\{s \ge 0;\ B_s \in S_\nu\}$, then $B_\tau \in S_\nu$. By the strong Markov property, the shifted process $\tilde{B}_t = B_{\tau + t}$, $t \ge 0$, with $\tilde{B}_0 = B_\tau$, is a Brownian motion started in $S_\nu$. We may then reduce to the case $B_0 = a \in S_\nu$. Then $L^a_t > 0$ by Tanaka's formula, so by the continuity of $L$ in $x$ we get, for some $\epsilon > 0$,

$$A_t = \int L^x_t\,\nu(dx) \ge \nu(a - \epsilon,\, a + \epsilon)\,\inf_{|x - a| < \epsilon} L^x_t = \infty.$$

We also need the following lemma, which shows that a continuous local martingale $M$ with absolutely continuous quadratic variation can be represented as a stochastic integral with respect to a Brownian motion $B$.

Lemma 5.3.2 Let $M$ be a continuous local $\mathcal{F}$-martingale with $M_0 = 0$ and $[M] = V^2 \cdot \lambda$ a.s. for some $\mathcal{F}$-progressive process $V$. Then there exists a Brownian motion $B$ with respect to a standard extension of $\mathcal{F}$ such that $M = V \cdot B$ a.s.

Proof. Define $B = V^{-1} \cdot M$, where $V^{-1} = 1/V$ when $V \ne 0$ and $V^{-1} = 0$ when $V = 0$. As a stochastic integral with respect to a continuous local martingale, $B$ is a continuous
local martingale, and

$$[B]_t = [V^{-1} \cdot M]_t = \int_0^t (V_s^{-1})^2\,d[M]_s = \int_0^t (V_s^{-1})^2 V_s^2\,ds = t,$$

provided $V$ never vanishes. In that case $B$ is a Brownian motion by Lévy's characterization, and $M = V \cdot B$ a.s.

If $V$ may vanish, let $Z$ be a Brownian motion independent of $\mathcal{F}$ with induced filtration $\mathcal{Z}$; then $\mathcal{G}_t = \sigma\{\mathcal{F}_t, \mathcal{Z}_t\}$ is a standard extension of both $\mathcal{F}$ and $\mathcal{Z}$. Therefore $V$ is $\mathcal{G}$-progressive, $M$ is a local $\mathcal{G}$-martingale, and $Z$ is a $\mathcal{G}$-Brownian motion. Let

$$B = V^{-1} \cdot M + U \cdot Z, \qquad U = 1\{V = 0\}.$$

As before, $[B]_t = \int_0^t \bigl((V_s^{-1})^2 V_s^2 + U_s^2\bigr)\,ds = t$, so $B$ is a Brownian motion. To see that $M = V \cdot B$, note that $V U = 0$, so

$$(V \cdot B)_t = \int_0^t V_s V_s^{-1}\,dM_s + \int_0^t V_s U_s\,dZ_s = M_t + 0 = M_t.$$

We proceed to prove Theorem 5.3.1.

Proof. First assume $I(\sigma) \subseteq Z(\sigma)$. Let $Y$ be a Brownian motion with respect to a filtration $\mathcal{G}$, with initial distribution $\mu$. Define

$$A_s = \int_0^s \sigma^{-2}(Y_u)\,du, \qquad s \ge 0.$$

Also define

$$\tau_t = \inf\{s \ge 0;\ A_s > t\}, \qquad t \ge 0, \qquad \text{and} \qquad \tau_\infty = \inf\{s \ge 0;\ A_s = \infty\}.$$
Now let $R = \inf\{s \ge 0;\ Y_s \in I(\sigma)\}$. By Lemma 5.3.1, applied with $\nu(dy) = \sigma^{-2}(y)\,dy$ so that $S_\nu = I(\sigma)$, we have $R = \tau_\infty$. The process $A$ is continuous and strictly increasing for $s < R$. Then, by Theorem 2.1.1, $\tau$ is a continuous process of optional times, strictly increasing for $t < A_R$. Further we have

$$A_{\tau_t} = t \quad \text{for } t < A_R, \qquad \tau_{A_s} = s \quad \text{for } s < R,$$

and therefore $A_s = \inf\{t \ge 0;\ \tau_t > s\}$ a.s. for $s \ge 0$. By the optional sampling theorem, we have for $t_1 \le t_2 < \infty$,

$$E[Y_{\tau_{t_2} \wedge \tau_{A_n}} \mid \mathcal{G}_{\tau_{t_1}}] = E[Y_{\tau_{t_2} \wedge n} \mid \mathcal{G}_{\tau_{t_1}}] = Y_{\tau_{t_1} \wedge n} = Y_{\tau_{t_1} \wedge \tau_{A_n}}.$$

Since $A_n \to \infty$ as $n \to \infty$, it follows that $Y_{\tau_t}$ is a continuous local martingale. Similarly, $Y_{\tau_t}^2 - \tau_t$ is a continuous local martingale, and by the uniqueness of the quadratic variation we have $[Y_\tau]_t = \tau_t$ for $t \ge 0$. Define $X_t = Y_{\tau_t}$, so that $[X]_t = \tau_t$. For $t \le A_R$,

$$\tau_t = \int_0^{\tau_t} \sigma^2(Y_u)\,d\left(\int_0^u \sigma^{-2}(Y_r)\,dr\right) = \int_0^{\tau_t} \sigma^2(Y_u)\,dA_u.$$

Then, by a change of variables,

$$\int_0^t \sigma^2(Y_{\tau_u})\,dA_{\tau_u} = \int_0^t \sigma^2(X_u)\,du.$$

Thus we get

$$\tau_t = \int_0^t \sigma^2(X_u)\,du, \qquad t \le A_R.$$
To show that the equality holds for all $t$, first note that $A_t = \infty$ for all $t > R$ by Lemma 5.3.1. Hence

$$\tau_t = \tau_\infty = R, \qquad t \ge A_R.$$

To see that $\int_0^t \sigma^2(X_u)\,du$ also equals $R$ for $t \ge A_R$, first note that

$$X_t = X_{A_R} = Y_{\tau_{A_R}} = Y_R, \qquad t \ge A_R.$$

Recalling that $R = \inf\{s \ge 0;\ Y_s \in I(\sigma)\}$ and the original assumption $I(\sigma) \subseteq Z(\sigma)$, we see that $\sigma(X_t) = \sigma(Y_R) = 0$ for $t \ge A_R$. Thus $\int_0^t \sigma^2(X_u)\,du = \tau_{A_R} = R$ for $t \ge A_R$, which means that

$$\tau_t = [X]_t = \int_0^t \sigma^2(X_u)\,du \quad \text{for all } t \ge 0.$$

By Lemma 5.3.2, there exists a Brownian motion $B$ such that $X_t = X_0 + \int_0^t \sigma(X_u)\,dB_u$. So $X$ is a weak solution of the stochastic differential equation $dX_t = \sigma(X_t)\,dB_t$ with initial distribution $\mu$.

To prove the converse, let $x \in I(\sigma)$ and let $X$ be a solution of the stochastic differential equation $dX_t = \sigma(X_t)\,dB_t$ with $X_0 = x$. By the definition of stochastic integrals, $X$ is a continuous local martingale, and by the time-change theorem of Chapter 3 we have $X_t = Y_{[X]_t}$ for some Brownian motion $Y$. Also,

$$[X]_t = [\sigma(X) \cdot B]_t = \int_0^t \sigma^2(X_u)\,d[B]_u = \int_0^t \sigma^2(X_u)\,du.$$
Let $\tau_t = [X]_t$. For $s \ge 0$, define $A_s = \int_0^s \sigma^{-2}(Y_r)\,dr$. Then, for $t \ge 0$,

$$A_{\tau_t} = \int_0^{\tau_t} \sigma^{-2}(Y_r)\,dr = \int_0^t \sigma^{-2}(X_s)\,d\tau_s \qquad (5.5)$$
$$= \int_0^t \sigma^{-2}(X_s)\,d\left(\int_0^s \sigma^2(X_u)\,du\right) \qquad (5.6)$$
$$= \int_0^t 1\{\sigma^2(X_s) > 0\}\,ds \le t. \qquad (5.7)$$

Since $X_0 = x \in I(\sigma)$, Lemma 5.3.1 gives $A_s = \infty$ for $s > 0$, so $\tau_t = 0$ a.s., which implies $X_t = x$ a.s. Further, $\tau_t = \int_0^t \sigma^2(X_s)\,ds = t\,\sigma^2(x) = 0$ a.s., and so $x \in Z(\sigma)$.

In Theorem 5.3.1 we have thus established a necessary and sufficient condition for weak existence for the stochastic differential equation $dX_t = \sigma(X_t)\,dB_t$. We now prove a necessary and sufficient condition for uniqueness in law.

Theorem For every initial distribution $\mu$, the stochastic differential equation $dX_t = \sigma(X_t)\,dB_t$ has a solution which is unique in law iff $I(\sigma) = Z(\sigma)$, where $I(\sigma)$ is given by (5.3) and $Z(\sigma)$ by (5.4) in Theorem 5.3.1.

Proof. By Theorem 5.3.1, $I(\sigma) \subseteq Z(\sigma)$ is necessary and sufficient for a solution to exist, so we must assume $I(\sigma) \subseteq Z(\sigma)$ in order to have a solution at all. To show that $I(\sigma) = Z(\sigma)$ is necessary for uniqueness in law, we prove the contrapositive: if $I(\sigma)$ is a proper subset of $Z(\sigma)$, we can create solutions that are not unique in law. To this end, let $I(\sigma) \subsetneq Z(\sigma)$ and $x \in Z(\sigma) \setminus I(\sigma)$. As in Theorem 5.3.1, we can create a solution $X_t = Y_{\tau_t}$, where $Y$ is a Brownian motion starting at $x$, $\tau_t = \inf\{s \ge 0;\ A_s > t\}$ for $t \ge 0$, and $A_s = \int_0^s \sigma^{-2}(Y_r)\,dr$ for $s \ge 0$. For another solution of the SDE, let $\hat{X}_t \equiv x$, which is a solution since $x \in Z(\sigma)$. Both solutions $X$ and $\hat{X}$ have the same initial distribution. However, they are not equal in distribution: the solution $\hat{X}$ is constant, while for the solution $X$, since
$x \notin I(\sigma)$, we have $A_s < \infty$ a.s. for $s > 0$ by Lemma 5.3.1, so by definition $\tau_t > 0$ a.s. for $t > 0$. Hence $X$, as a time-changed Brownian motion, is a.s. not constant. So $X$ and $\hat{X}$ do not have the same law, and uniqueness in law fails.

Now we show that $I(\sigma) = Z(\sigma)$ is sufficient for uniqueness in law. Once again, since Theorem 5.3.1 requires $I(\sigma) \subseteq Z(\sigma)$ for the existence of a solution, we only need to show that the reverse inclusion $Z(\sigma) \subseteq I(\sigma)$ yields uniqueness in law. So let $Z(\sigma) \subseteq I(\sigma)$, and let $X$ be a solution of the SDE with initial distribution $\mu$. Again $X_t = Y_{\tau_t}$, where $Y$ is a Brownian motion with initial distribution $\mu$ and $\tau_t = \int_0^t \sigma^2(X_s)\,ds$ for $t \ge 0$. Define again $A_s = \int_0^s \sigma^{-2}(Y_r)\,dr$ for $s \ge 0$, and let $S = \inf\{t \ge 0;\ X_t \in I(\sigma)\}$. Then $\tau_S = R = \inf\{r \ge 0;\ Y_r \in I(\sigma)\}$. Since $S$ is the first time $X$ enters $I(\sigma)$, and since $Z(\sigma) \subseteq I(\sigma)$, the process $X$ stays outside both sets before time $S$, so $\sigma^2(X_s) > 0$ for $s < S$. Referring back to the computation (5.5), we have for $t \le S$

$$A_{\tau_t} = \int_0^{\tau_t} \sigma^{-2}(Y_r)\,dr = \int_0^t 1\{\sigma^2(X_s) > 0\}\,ds = t.$$

We also know that $A_s = \infty$ for $s > R$ by Lemma 5.3.1, so (5.5) implies $\tau_t \le R$ a.s. for all $t$. Hence $\tau$ is constant after time $S$, and we can once again write $\tau_t = \inf\{s \ge 0;\ A_s > t\}$ for $t \ge 0$. This shows that $\tau$ is a measurable function of $Y$; and since $X_t = Y_{\tau_t}$, $X$ is a measurable function of $Y$. The distribution of $Y$ is that of a Brownian motion with initial distribution $\mu$, hence determined by $\mu$. Since the same argument applies to any solution $X$, all solutions have distributions determined by $\mu$. This proves uniqueness in law.
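The time-change construction used throughout this chapter can also be carried out numerically. The sketch below is illustrative only: `sigma2` is a hypothetical diffusion coefficient, bounded between 0.5 and 1.5 so that $I(\sigma) = Z(\sigma) = \emptyset$ and a solution exists and is unique in law. It builds $X_t = Y_{\tau_t}$ from a discretized Brownian path $Y$ via $A_s = \int_0^s \sigma^{-2}(Y_u)\,du$ and its inverse $\tau$, and then checks the key identity $\tau_t = \int_0^t \sigma^2(X_u)\,du$ from the proof of Theorem 5.3.1.

```python
import math
import random

random.seed(1)

# Hypothetical diffusion coefficient sigma^2, bounded away from zero.
def sigma2(x):
    return 1.0 + 0.5 * math.sin(x)

# Fine discretization of the driving Brownian motion Y on [0, S].
n, S = 400_000, 4.0
ds = S / n
Y = [0.0]
for _ in range(n):
    Y.append(Y[-1] + random.gauss(0.0, ds ** 0.5))

# A_s = \int_0^s sigma^{-2}(Y_u) du, accumulated along the grid.
Acum = [0.0]
for i in range(n):
    Acum.append(Acum[-1] + ds / sigma2(Y[i]))

# tau_t = inf{s : A_s > t}; invert A on a grid of t-values and set
# X_t = Y_{tau_t}.
T, m = 1.0, 1000
dt = T / m
X, tau = [], []
j = 0
for k in range(m + 1):
    t = k * dt
    while j < n and Acum[j] <= t:
        j += 1
    tau.append(j * ds)
    X.append(Y[j])

# The construction predicts tau_T = \int_0^T sigma^2(X_u) du.
integral = sum(sigma2(x) * dt for x in X[:-1])
print(round(tau[-1], 3), round(integral, 3))  # the two numbers should be close
```

Since `sigma2` takes values in [0.5, 1.5], the accumulated $A_S$ exceeds 1 deterministically, so the inversion stays inside the simulated grid for $t \le 1$; a vanishing coefficient would make $\tau$ stick at a level, which is exactly the boundary behavior governed by $I(\sigma)$ and $Z(\sigma)$ in the theorems above.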
Bibliography

[1] Daley, D.J. and Vere-Jones, D. (2008). An Introduction to the Theory of Point Processes, Vol. I & II. Springer, NY.

[2] Ikeda, N. and Watanabe, S. (1989). Stochastic Differential Equations and Diffusion Processes, 2nd ed. North-Holland, Amsterdam.

[3] Kallenberg, O. (2002). Foundations of Modern Probability, 2nd ed. Springer, NY.

[4] Karatzas, I. and Shreve, S. (1991). Brownian Motion and Stochastic Calculus, 2nd ed. Springer, NY.
VOLATILITY TIME AND PROPERTIES OF OPTION PRICES SVANTE JANSON AND JOHAN TYSK Abstract. We use a notion of stochastic time, here called volatility time, to show convexity of option prices in the underlying
More informationBarrier Options Pricing in Uncertain Financial Market
Barrier Options Pricing in Uncertain Financial Market Jianqiang Xu, Jin Peng Institute of Uncertain Systems, Huanggang Normal University, Hubei 438, China College of Mathematics and Science, Shanghai Normal
More informationExponential martingales and the UI martingale property
u n i v e r s i t y o f c o p e n h a g e n d e p a r t m e n t o f m a t h e m a t i c a l s c i e n c e s Faculty of Science Exponential martingales and the UI martingale property Alexander Sokol Department
More informationLogarithmic derivatives of densities for jump processes
Logarithmic derivatives of densities for jump processes Atsushi AKEUCHI Osaka City University (JAPAN) June 3, 29 City University of Hong Kong Workshop on Stochastic Analysis and Finance (June 29 - July
More informationBandit Problems with Lévy Payoff Processes
Bandit Problems with Lévy Payoff Processes Eilon Solan Tel Aviv University Joint with Asaf Cohen Multi-Arm Bandits A single player sequential decision making problem. Time is continuous or discrete. The
More informationThe Azéma-Yor Embedding in Non-Singular Diffusions
The Azéma-Yor Embedding in Non-Singular Diffusions J.L. Pedersen and G. Peskir Let (X t ) t 0 be a non-singular (not necessarily recurrent) diffusion on R starting at zero, and let ν be a probability measure
More informationChapter 3: Black-Scholes Equation and Its Numerical Evaluation
Chapter 3: Black-Scholes Equation and Its Numerical Evaluation 3.1 Itô Integral 3.1.1 Convergence in the Mean and Stieltjes Integral Definition 3.1 (Convergence in the Mean) A sequence {X n } n ln of random
More informationPricing in markets modeled by general processes with independent increments
Pricing in markets modeled by general processes with independent increments Tom Hurd Financial Mathematics at McMaster www.phimac.org Thanks to Tahir Choulli and Shui Feng Financial Mathematics Seminar
More informationConstructing Markov models for barrier options
Constructing Markov models for barrier options Gerard Brunick joint work with Steven Shreve Department of Mathematics University of Texas at Austin Nov. 14 th, 2009 3 rd Western Conference on Mathematical
More informationTHE MARTINGALE METHOD DEMYSTIFIED
THE MARTINGALE METHOD DEMYSTIFIED SIMON ELLERSGAARD NIELSEN Abstract. We consider the nitty gritty of the martingale approach to option pricing. These notes are largely based upon Björk s Arbitrage Theory
More informationAn Introduction to Stochastic Calculus
An Introduction to Stochastic Calculus Haijun Li lih@math.wsu.edu Department of Mathematics and Statistics Washington State University Lisbon, May 218 Haijun Li An Introduction to Stochastic Calculus Lisbon,
More informationLocal Volatility Dynamic Models
René Carmona Bendheim Center for Finance Department of Operations Research & Financial Engineering Princeton University Columbia November 9, 27 Contents Joint work with Sergey Nadtochyi Motivation 1 Understanding
More informationNo-arbitrage theorem for multi-factor uncertain stock model with floating interest rate
Fuzzy Optim Decis Making 217 16:221 234 DOI 117/s17-16-9246-8 No-arbitrage theorem for multi-factor uncertain stock model with floating interest rate Xiaoyu Ji 1 Hua Ke 2 Published online: 17 May 216 Springer
More informationArbitrage of the first kind and filtration enlargements in semimartingale financial models. Beatrice Acciaio
Arbitrage of the first kind and filtration enlargements in semimartingale financial models Beatrice Acciaio the London School of Economics and Political Science (based on a joint work with C. Fontana and
More informationUniversität Regensburg Mathematik
Universität Regensburg Mathematik Modeling financial markets with extreme risk Tobias Kusche Preprint Nr. 04/2008 Modeling financial markets with extreme risk Dr. Tobias Kusche 11. January 2008 1 Introduction
More informationIntroduction to Stochastic Calculus
Introduction to Stochastic Calculus Director Chennai Mathematical Institute rlk@cmi.ac.in rkarandikar@gmail.com Introduction to Stochastic Calculus - 1 A Game Consider a gambling house. A fair coin is
More informationWeierstrass Institute for Applied Analysis and Stochastics Maximum likelihood estimation for jump diffusions
Weierstrass Institute for Applied Analysis and Stochastics Maximum likelihood estimation for jump diffusions Hilmar Mai Mohrenstrasse 39 1117 Berlin Germany Tel. +49 3 2372 www.wias-berlin.de Haindorf
More informationHints on Some of the Exercises
Hints on Some of the Exercises of the book R. Seydel: Tools for Computational Finance. Springer, 00/004/006/009/01. Preparatory Remarks: Some of the hints suggest ideas that may simplify solving the exercises
More informationConditional Full Support and No Arbitrage
Gen. Math. Notes, Vol. 32, No. 2, February 216, pp.54-64 ISSN 2219-7184; Copyright c ICSRS Publication, 216 www.i-csrs.org Available free online at http://www.geman.in Conditional Full Support and No Arbitrage
More informationMath-Stat-491-Fall2014-Notes-V
Math-Stat-491-Fall2014-Notes-V Hariharan Narayanan December 7, 2014 Martingales 1 Introduction Martingales were originally introduced into probability theory as a model for fair betting games. Essentially
More informationStochastic Integral Representation of One Stochastically Non-smooth Wiener Functional
Bulletin of TICMI Vol. 2, No. 2, 26, 24 36 Stochastic Integral Representation of One Stochastically Non-smooth Wiener Functional Hanna Livinska a and Omar Purtukhia b a Taras Shevchenko National University
More informationThe Azema Yor embedding in non-singular diusions
Stochastic Processes and their Applications 96 2001 305 312 www.elsevier.com/locate/spa The Azema Yor embedding in non-singular diusions J.L. Pedersen a;, G. Peskir b a Department of Mathematics, ETH-Zentrum,
More informationAdditional questions for chapter 3
Additional questions for chapter 3 1. Let ξ 1, ξ 2,... be independent and identically distributed with φθ) = IEexp{θξ 1 })
More informationDeterministic Income under a Stochastic Interest Rate
Deterministic Income under a Stochastic Interest Rate Julia Eisenberg, TU Vienna Scientic Day, 1 Agenda 1 Classical Problem: Maximizing Discounted Dividends in a Brownian Risk Model 2 Maximizing Discounted
More informationSHORT-TERM RELATIVE ARBITRAGE IN VOLATILITY-STABILIZED MARKETS
SHORT-TERM RELATIVE ARBITRAGE IN VOLATILITY-STABILIZED MARKETS ADRIAN D. BANNER INTECH One Palmer Square Princeton, NJ 8542, USA adrian@enhanced.com DANIEL FERNHOLZ Department of Computer Sciences University
More informationHedging of Contingent Claims in Incomplete Markets
STAT25 Project Report Spring 22 Hedging of Contingent Claims in Incomplete Markets XuanLong Nguyen Email: xuanlong@cs.berkeley.edu 1 Introduction This report surveys important results in the literature
More informationMath 6810 (Probability) Fall Lecture notes
Math 6810 (Probability) Fall 2012 Lecture notes Pieter Allaart University of North Texas April 16, 2013 2 Text: Introduction to Stochastic Calculus with Applications, by Fima C. Klebaner (3rd edition),
More informationIntroduction to Stochastic Calculus
Introduction to Stochastic Calculus Director Chennai Mathematical Institute rlk@cmi.ac.in rkarandikar@gmail.com Introduction to Stochastic Calculus - 1 The notion of Conditional Expectation of a random
More information1 Mathematics in a Pill 1.1 PROBABILITY SPACE AND RANDOM VARIABLES. A probability triple P consists of the following components:
1 Mathematics in a Pill The purpose of this chapter is to give a brief outline of the probability theory underlying the mathematics inside the book, and to introduce necessary notation and conventions
More informationRisk, Return, and Ross Recovery
Risk, Return, and Ross Recovery Peter Carr and Jiming Yu Courant Institute, New York University September 13, 2012 Carr/Yu (NYU Courant) Risk, Return, and Ross Recovery September 13, 2012 1 / 30 P, Q,
More information1.1 Basic Financial Derivatives: Forward Contracts and Options
Chapter 1 Preliminaries 1.1 Basic Financial Derivatives: Forward Contracts and Options A derivative is a financial instrument whose value depends on the values of other, more basic underlying variables
More informationS t d with probability (1 p), where
Stochastic Calculus Week 3 Topics: Towards Black-Scholes Stochastic Processes Brownian Motion Conditional Expectations Continuous-time Martingales Towards Black Scholes Suppose again that S t+δt equals
More informationIn Discrete Time a Local Martingale is a Martingale under an Equivalent Probability Measure
In Discrete Time a Local Martingale is a Martingale under an Equivalent Probability Measure Yuri Kabanov 1,2 1 Laboratoire de Mathématiques, Université de Franche-Comté, 16 Route de Gray, 253 Besançon,
More informationOn the Lower Arbitrage Bound of American Contingent Claims
On the Lower Arbitrage Bound of American Contingent Claims Beatrice Acciaio Gregor Svindland December 2011 Abstract We prove that in a discrete-time market model the lower arbitrage bound of an American
More informationMath 416/516: Stochastic Simulation
Math 416/516: Stochastic Simulation Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 13 Haijun Li Math 416/516: Stochastic Simulation Week 13 1 / 28 Outline 1 Simulation
More informationEFFICIENT MONTE CARLO ALGORITHM FOR PRICING BARRIER OPTIONS
Commun. Korean Math. Soc. 23 (2008), No. 2, pp. 285 294 EFFICIENT MONTE CARLO ALGORITHM FOR PRICING BARRIER OPTIONS Kyoung-Sook Moon Reprinted from the Communications of the Korean Mathematical Society
More information