Xinfu Chen. Mathematical Finance III. Department of Mathematics, University of Pittsburgh, Pittsburgh, PA 15260, USA




MATHEMATICAL FINANCE III

Course Outline

This course is an introduction to modern mathematical finance. Topics include:

1. single-period portfolio optimization based on mean-variance analysis, the capital asset pricing model, factor models, and arbitrage pricing theory;
2. pricing and hedging of derivative securities based on a fundamental state model, the well-received Cox-Ross-Rubinstein binary lattice model, and the celebrated Black-Scholes continuum model;
3. discrete-time and continuous-time optimal portfolio growth theory, in particular the universal log-optimal pricing formula;
4. necessary mathematical tools for finance, such as theories of measure, probability, statistics, and stochastic processes.

Prerequisites

Calculus; working knowledge of Excel, Matlab, Mathematica, or Maple.

Textbooks

Financial Calculus, Martin Baxter & Andrew Rennie.
Xinfu Chen, Lecture Notes, available online (xfc).

Recommended References

David G. Luenberger, Investment Science, Oxford University Press.
John C. Hull, Options, Futures and Other Derivatives, Fourth Edition, Prentice-Hall.
Martin Baxter and Andrew Rennie, Financial Calculus, Cambridge University Press.
P. Wilmott, S. Howison & J. Dewynne, The Mathematics of Financial Derivatives, CUP.
Stanley R. Pliska, Introduction to Mathematical Finance, Blackwell.

Grading Scheme

Homework 40%
Take-Home Midterms 40%
Final 40%


Contents

MATHEMATICAL FINANCE III (Course Outline)

1 Stochastic Process
    Certain Mathematical Tools
    Random Walk
        Description
        Characteristic Properties of a Random Walk
    Brownian Motion
        Brownian Motion as the Limit of Random Walks
        A Fourier Series Representation of the Brownian Motion
        Brownian Motion on Space of Continuous Functions
        Filtration
        Vector Valued Brownian Motion
    Filtration and Martingale
        Filtration
        Filtration, Partition, and Information
        Conditional Probability
        Martingale
    Itô Integrals
        Stochastic (Itô) Integration
        Itô's Lemma
        Stochastic Differential Equation (SDE)
    Diffusion Process
        Itô diffusion
        Semigroup
        Generator
        Kolmogorov's Backward Equation
        Kolmogorov's Forward or Fokker-Planck Equation
        The Feynman-Kac Formula

2 The Black-Scholes Theory
    Replicating Portfolio
        Modelling Unit Share Prices of Securities
        Portfolios
        Claim and Replicating Strategy
        Expectation Pricing and Arbitrage-free Pricing
    The Black-Scholes Equation
    Solutions to the Black-Scholes Equation
        A PDE Method
        Probabilistic Method
    The Risk-Neutral Measure
    Examples
        European Options
        Foreign Exchange
        Equities and Dividends
        Quantos

3 Binomial Tree Model
    Pricing Options
    Replicating Portfolio for Derivative Security
    A Model for Stock Prices
    Continuous Model as Limit of Discrete Model
    The Black-Scholes Equation

References
Index

Chapter 1

Stochastic Process

Stochastic processes have been used in mathematical finance to model various kinds of nondeterministic quantities such as stock prices, interest rates, etc. They have become a necessary tool for modern theories of finance. In this chapter we introduce certain elementary theories of stochastic processes for our basic needs in this course.

1.1 Certain Mathematical Tools

Here we introduce necessary mathematical tools from probability that are needed for the study of stochastic processes.

A probability space is a triple (Ω, F, P), where Ω is a set, F is a σ-algebra on Ω, and P is a probability measure on (Ω, F). A random variable is a measurable function on a probability space. A stochastic process is a collection {S_t}_{t ∈ T} of random variables on a probability space (Ω, F, P); here T is a set of times such as T = [0, T], T = [0, ∞), T = {0, 1, 2, ...}, or T = {t_0, t_1, ..., t_n}, etc.

Here, by F being a σ-algebra on Ω we mean that F is a non-empty collection of subsets of Ω that is closed under the operations of complement and countable union. By a probability measure we mean that P : F → [0, 1] is a non-negative function satisfying P(Ω) = 1 and, for every countable collection of disjoint sets A_1, A_2, ... in F,
\[
P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i).
\]
Finally, a function f : Ω → R is called F-measurable if
\[
\{\omega \in \Omega \mid f(\omega) < r\} \in \mathcal{F} \qquad \forall\, r \in \mathbb{R}.
\]
The simplest measurable function is the characteristic function 1_A of a measurable set A ∈ F, defined by
\[
\mathbf{1}_A(\omega) = \begin{cases} 1 & \text{if } \omega \in A, \\ 0 & \text{if } \omega \notin A. \end{cases}
\]

A simple function is a linear combination of finitely many characteristic functions. For a simple function \(\sum_{i=1}^n c_i \mathbf{1}_{A_i}\), its integral is defined as
\[
\int_\Omega \Big(\sum_{i=1}^{n} c_i \mathbf{1}_{A_i}(\omega)\Big)\, P(d\omega) := \sum_{i=1}^{n} c_i\, P(A_i).
\]
The integral of a general measurable function is defined as the limit (if it exists) of the integrals of an approximating sequence of simple functions.

Two random variables X and Y are called equal, written X = Y, if P({ω ∈ Ω : X(ω) ≠ Y(ω)}) = 0.

In a probability space (Ω, F, P), each ω ∈ Ω is called a sample event, and each A ∈ F is called an observable event with observable probability P(A). Similarly, if A is not measurable, then A is called a non-observable event.

Let Ω be a set and S a collection of subsets of Ω. We denote by σ(S) the smallest σ-algebra that contains S; σ(S) is called the σ-algebra generated by S. In R, the σ-algebra B generated by all open intervals is called the Borel σ-algebra, and each set in B is called a Borel set.

If X is a random variable on (Ω, F, P) and B is a Borel set of R, then X^{-1}(B) := {ω ∈ Ω | X(ω) ∈ B} is a measurable set. In the sequel we use the notation, for every Borel set B in R,
\[
\mathrm{Prob}(X \in B) := P(X^{-1}(B)) = P(\{\omega \in \Omega \mid X(\omega) \in B\}).
\]
Note that the mapping
\[
P X^{-1} : B \in \mathcal{B} \longmapsto P(X^{-1}(B))
\]
defines a probability measure on (R, B); namely, (R, B, P X^{-1}) is a probability space.

(Diagram: the random variable X carries the probability space (Ω, F, P) to the probability space (R, B, P X^{-1}).)

In the study of a single random variable X, all the properties of the random variable are observed through the probability space (R, B, P X^{-1}), since X is experimentally observed by measuring the probability of the outcome X^{-1}(B) for every B ∈ B.

Given a random variable X, its distribution function is defined by
\[
F(x) := \mathrm{Prob}(X \le x) := \mathrm{Prob}(X \in (-\infty, x]) := P(\{\omega \in \Omega \mid X(\omega) \le x\}) \qquad \forall\, x \in \mathbb{R}.
\]
The distribution density is defined as the derivative of F:
\[
\rho(x) = \frac{dF(x)}{dx} \qquad \forall\, x \in \mathbb{R}.
\]
A random variable is called N(µ, σ²) (µ ∈ R, σ > 0) distributed, or normally distributed with mean µ and standard deviation σ (i.e. variance σ²), if it has probability density
\[
\rho(\mu, \sigma; x) := \frac{1}{\sqrt{2\pi\sigma^2}} \exp\Big( -\frac{(x-\mu)^2}{2\sigma^2} \Big).
\]

When σ = 0, an N(µ, 0) random variable X becomes a deterministic constant: X(ω) = µ for (almost) all ω ∈ Ω.

In the sequel, we use E for the expectation and Var for the variance:
\[
E[X] := \int_\Omega X(\omega)\, P(d\omega), \qquad
\mathrm{Var}[X] := \int_\Omega \big(X(\omega) - E[X]\big)^2 P(d\omega) = E[X^2] - (E[X])^2.
\]

The law of the unconscious statistician. Given a random variable X with probability density function ρ and an integrable real function f on (R, B), the expectation of f(X) is
\[
E[f(X)] := \int_\Omega f(X(\omega))\, P(d\omega) = \int_{\mathbb{R}} f(x)\, \rho(x)\, dx.
\]

As far as a single random variable X is concerned, in a certain sense all relevant information in P is contained in the measure P X^{-1}, i.e. in the distribution function F. Of course, in such a study we lose track of the model underlying the random variable; nothing is lost as long as we are interested in only one random variable. If, however, we have two random variables X and Y on (Ω, F, P), then the two distribution functions for P X^{-1} and P Y^{-1} are not by themselves sufficient to say all there is about X, Y and their interrelation. We need to consider Z = (X, Y) as a map of Ω into R² and define a measure by P Z^{-1}. This measure on the Borel sets of the plane defines what is usually called the joint distribution of X and Y, and is sufficient for a complete study of X, Y and their interrelations. This idea of course extends to any finite number of random variables.

The concept of a stochastic process is now a straightforward generalization of these ideas. Let T be any index set of points t. A stochastic process on T is a collection of random variables {ξ_t}_{t∈T} on a probability space (Ω, F, P). For each ω ∈ Ω, the map ξ(ω) : T → R defined by t ↦ ξ_t(ω) is a function from T to R, which we call a sample path. Let us denote by Map(T; R) the collection of all functions from T to R. Thus, a stochastic process can be viewed as a function ξ : Ω → Map(T; R) through the realization ξ(ω)(t) = ξ_t(ω) for all t ∈ T. In many applications, one simply takes Ω as a subset of Map(T; R). In such a case, any ω ∈ Ω is a function in Map(T; R), and the map from Ω to Map(T; R) is realized through the default
\[
\pi(\omega)(t) := \omega(t) \qquad \forall\, t \in T,\ \omega \in \Omega \subset \mathrm{Map}(T; \mathbb{R}).
\]
Note that the probability is then built upon Map(T; R).

Exercise 1.1. Assume that f is a simple function. Prove the law of the unconscious statistician.

Exercise 1.2. Let Ω = {1, 2, 3, 4}. Let F be the smallest σ-algebra that contains {1} and {2}. (i) List all the elements of F. (ii) Is the event {1, 2, 3} observable?
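Before the next exercises, here is a minimal numerical illustration of the law of the unconscious statistician (a sketch assuming only NumPy; the variable names and the choice f(x) = x² are ours, not from the notes): E[f(X)] is computed once as a sample average over ω and once as the integral of f against the density ρ.

```python
import numpy as np

# Sketch (assuming NumPy): for X ~ N(mu, sigma^2) and f(x) = x^2, compare
# E[f(X)] as a sample average of f(X(omega)) with the integral of f(x) rho(x) dx.
# Exact value: E[X^2] = mu^2 + sigma^2.
mu, sigma = 1.0, 2.0
rng = np.random.default_rng(0)

samples = rng.normal(mu, sigma, size=1_000_000)
mc_estimate = np.mean(samples**2)                    # average of f(X(omega))

x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
rho = np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
quad_estimate = np.trapz(x**2 * rho, x)              # integral of f(x) rho(x) dx

print(mc_estimate, quad_estimate, mu**2 + sigma**2)  # all close to 5.0
```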

Exercise 1.3. For every random variable X, show that Var[X] = E[X²] − (E[X])².

Exercise 1.4. Suppose X is N(µ, σ²) distributed. Find the following: E[X], Var[X], E[(X − E[X])³], E[(X − E[X])⁴], E[e^{iλX}] (λ ∈ C).

Exercise 1.5. Let f, g : R → R be continuous functions. Show that there exists a smallest σ-algebra F on R such that both f and g are F-measurable. Also show that each element of F is a Borel set. Suppose f(x) = 1 and g(x) = 2 for all x ∈ R. Describe F.

1.2 Random Walk

Brownian motion is one of the most important building blocks for stochastic processes. To study Brownian motion, we begin with a random walk, a discretized version of Brownian motion. Roughly speaking, a random walk is the motion of a particle on a line which, in each unit time step, takes a unit space step in one direction or the opposite, with probability 1/2 each.

1.2.1 Description

Mathematically, we describe a random walk on the real line by the following steps.

1. Let X_1, X_2, X_3, ... be a sequence of independent binomial random variables taking values +1 and −1 with equal probability:
\[
\mathrm{Prob}(X_i = 1) = \tfrac12, \qquad \mathrm{Prob}(X_i = -1) = \tfrac12.
\]
Such a sequence can be obtained, for example, by tossing a fair coin: head for 1 and tail for −1.

2. Let Δt be the unit time step and Δx the unit space step. Let W_{iΔt} be the position of the particle at time t_i := iΔt. It is a random variable and can be defined by
\[
W_0 := 0, \qquad W_{i\Delta t} := \sum_{j=1}^{i} X_j\, \Delta x = W_{[i-1]\Delta t} + X_i\, \Delta x, \qquad i = 1, 2, \dots.
\]
Here the last equation, W_{iΔt} = W_{[i−1]Δt} + X_i Δx, means that the particle moves from the position W_{[i−1]Δt} at time t_{i−1} = [i−1]Δt to a new position W_{iΔt} at time t_i = iΔt by walking one unit space step Δx, either to the left or to the right, depending on the choice of X_i being −1 or 1.

3. Though not necessary, it is sometimes convenient to define the position W_t of the particle at an arbitrary time t ≥ 0. There are jump versions and continuous versions. Here we use a constant-speed version given by linear interpolation:
\[
W_t = \Big( [i+1] - \frac{t}{\Delta t} \Big) W_{i\Delta t} + \Big( \frac{t}{\Delta t} - i \Big) W_{[i+1]\Delta t}, \qquad t \in (i\Delta t, [i+1]\Delta t),\ i = 0, 1, \dots. \qquad (1.1)
\]

We call {W_t}_{t≥0} the process of a random walk with time step Δt and space step Δx.
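As a quick illustration, the following sketch (assuming NumPy; the function names are ours) simulates one sample path of the walk on the grid t_i = iΔt and evaluates W_t at an arbitrary time using the linear interpolation (1.1).

```python
import numpy as np

# Sketch (assuming NumPy): one sample path of the random walk with time step
# dt and space step dx, plus the piecewise-linear interpolation (1.1).
rng = np.random.default_rng(1)

def random_walk_path(n_steps, dt, dx, rng):
    """Return the grid times t_i = i*dt and the positions W_{i*dt}."""
    X = rng.choice([-1.0, 1.0], size=n_steps)       # i.i.d. steps, +-1 with prob 1/2
    W = np.concatenate(([0.0], np.cumsum(X) * dx))  # W_0 = 0, W_{i dt} = sum_j X_j dx
    return dt * np.arange(n_steps + 1), W

t_grid, W_grid = random_walk_path(n_steps=1000, dt=0.01, dx=0.1, rng=rng)
print(W_grid[-1])                          # position at the final grid time t = 10
print(np.interp(2.345, t_grid, W_grid))    # W_t at t = 2.345 via interpolation (1.1)
```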

1.2.2 Characteristic Properties of a Random Walk

The random walk is described by the stochastic process {W_t}_{t≥0} defined earlier. The probability space for the process is determined by the probability space associated with the random variables {X_i}_{i=1}^∞. A standard probability space (Ω, F, P) for a sequence {X_i}_{i=1}^∞ of binary random variables can be obtained as follows.

1. First we set
\[
\Omega = \{-1, 1\}^{\mathbb{N}} = \{ (x_1, x_2, \dots) \mid x_i \in \{1, -1\}\ \forall\, i \in \mathbb{N} \}.
\]
Each ω = (x_1, x_2, ...) ∈ Ω can be regarded as the record of a sequence of coin tosses, where x_i records the outcome of the ith toss: x_i = +1 for head and x_i = −1 for tail.

2. We define random variables X_i for each i ∈ N by
\[
X_i(\omega) = x_i \qquad \forall\, \omega = (x_1, x_2, \dots) \in \Omega.
\]
Thus X_i(ω) is the ith outcome of the event ω ∈ Ω. Note that X_i takes only two values, +1 and −1. We denote
\[
\Omega_i^{+1} := \{\omega \in \Omega \mid X_i(\omega) = +1\} = \{(x_1, x_2, \dots) \in \Omega \mid x_i = +1\},
\qquad
\Omega_i^{-1} := \{\omega \in \Omega \mid X_i(\omega) = -1\} = \{(x_1, x_2, \dots) \in \Omega \mid x_i = -1\}.
\]

3. The σ-algebra F on Ω is the smallest σ-algebra on Ω such that all X_1, X_2, ... are measurable. Clearly, it is necessary and sufficient to define F as the σ-algebra generated by the countably many boxes Ω_1^{+1}, Ω_1^{−1}, Ω_2^{+1}, Ω_2^{−1}, .... Under this σ-algebra, each X_i is (Ω, F)-measurable. For each (x_1, ..., x_n) ∈ {−1,1}^n, the cylindrical box c(x_1, ..., x_n) is defined by
\[
c(x_1, \dots, x_n) := \{(x_1, \dots, x_n)\} \times \prod_{j=n+1}^{\infty} \{-1, 1\}
= \{ (x_1, \dots, x_n, y_{n+1}, y_{n+2}, \dots) \mid y_j \in \{-1, 1\}\ \forall\, j \ge n+1 \}.
\]
Note that each cylindrical box belongs to F, since
\[
c(x_1, \dots, x_n) = \bigcap_{i=1}^{n} \{\omega \in \Omega \mid X_i(\omega) = x_i\}.
\]

4. To define P such that both X_i = 1 and X_i = −1 have probability 1/2, we first define P on each cylindrical box by
\[
P(c(x_1, \dots, x_n)) = 2^{-n} \qquad \forall\, n \in \mathbb{N},\ (x_1, \dots, x_n) \in \{-1, 1\}^n.
\]
One can show that P, so defined on all cylindrical boxes, can be extended to F to become a probability measure. Hence we have a standard probability space (Ω, F, P).

5. One can show that, under the probability space (Ω, F, P), {X_i}_{i=1}^∞ is a sequence of i.i.d. random variables with the property that the probability of X_i = ±1 is 1/2.

The random walk, modelled by the stochastic process {W_t}_{t≥0} on (Ω, F, P), has the following properties:

1. W_0(ω) ≡ 0 for every ω ∈ Ω;

2. E[W(t)] = 0 and Var[W(t)] = σ t for every t ∈ T, where
\[
T = \{ i\,\Delta t \}_{i=0}^{\infty}, \qquad \sigma = \frac{(\Delta x)^2}{\Delta t};
\]

3. For every t_0, t_1, t_2, ..., t_n ∈ T with 0 = t_0 < t_1 < ... < t_n, the following random variables are independent:
\[
W(t_n) - W(t_{n-1}),\ W(t_{n-1}) - W(t_{n-2}),\ \dots,\ W(t_1) - W(t_0);
\]

4. For every ω ∈ Ω, the function t ∈ [0, ∞) ↦ W_t(ω) ∈ R is continuous.

Exercise 1.6. From the definition, show that
\[
P(\{\omega \in \Omega \mid X_i(\omega) = 1\}) = \tfrac12, \qquad
\mathrm{Prob}(X_i \in A,\ X_j \in B) = \mathrm{Prob}(X_i \in A)\,\mathrm{Prob}(X_j \in B) \quad \forall\, i \neq j.
\]

Exercise 1.7. Show that the random walk has the four properties listed above.

Exercise 1.8. Show that for every positive integer n, W_{nΔt} has range {kΔx}_{k=−n}^{n} and
\[
\mathrm{Prob}(W_{n\Delta t} = k\,\Delta x) = C^n_k\, 2^{-n}, \qquad C^n_k := \frac{n!}{k!\,(n-k)!}.
\]

Exercise 1.9. Let n be a positive integer and set Δt = 1/n and Δx = 1/√n. Let Ω_n = {−1,1}^n, F_n = 2^{Ω_n} (the collection of all subsets of Ω_n), and
\[
P_n(\omega) = \frac{1}{2^n} \qquad \forall\, \omega \in \Omega_n.
\]

1. Show that (Ω_n, F_n, P_n) is a probability space.

2. Show that in the time interval [0, 1] there are a total of 2^n sample random walks. We denote by W^1, ..., W^{2^n} all the sample random walks.

3. Denote by C([0,1]) = C([0,1]; R) the space of all continuous functions from [0,1] to R. Denote by B the Borel σ-algebra on C([0,1]) generated by all open sets under the norm
\[
\|x\| = \sup_{t \in [0,1]} |x(t)| \qquad \forall\, x \in C([0,1]).
\]
On C([0,1]) we define
\[
P(\{W^i\}) = \frac{1}{2^n}, \quad i = 1, \dots, 2^n, \qquad P\big(C([0,1]) \setminus \{W^1, \dots, W^{2^n}\}\big) = 0.
\]
Show that there exists an extension of P to B such that (C([0,1]), B, P) is a probability space. [For each B ∈ B, one can define P(B) via the number of sample random walks contained in B.]

4. Show that the map π from ω to the function t ↦ W_t(ω) lifts the probability space (Ω_n, F_n, P_n) to the probability space (C([0,1]), B, P); that is, prove the following:
(a) for every A ∈ F_n, π(A) ∈ B and P_n(A) = P(π(A)); [Notice that A is a finite set, so π(A) is also a finite set and hence a closed set in C([0,1]), which is of course a Borel set.]
(b) for every B ∈ B, π^{-1}(B) ∈ F_n and P(B) = P_n(π^{-1}(B)). [Count how many sample random walks are in the set B. Be careful about the definition of P on B.]

5. When n = 6, calculate the probability Prob(W_1 = 0, W_{1/2} = 0).

1.3 Brownian Motion

The Brownian motion was first described as a continuous swarming motion by the English botanist Robert Brown, who studied the motion of small pollen grains immersed in a liquid medium in 1827. Albert Einstein in 1905 showed that this swarming motion, now called Brownian motion, could be the consequence of the continual bombardment of the particles by the molecules of the liquid. A formal mathematical description of the Brownian motion and its properties was first given by Norbert Wiener in 1923. It is especially interesting to note that the Brownian motion was used by the French mathematician Bachelier to model stock prices in his doctoral dissertation in 1900! It was nearly a century after Brown first observed microscopic particles zigzagging that the mathematical model for their movement was properly developed.

There are many mathematical ways to model the Brownian motion. Here we provide three mathematical models.

1.3.1 Brownian Motion as the Limit of Random Walks

Fix a positive integer n. We denote by {W^n_t}_{t≥0} the random walk obtained by taking
\[
\Delta t = \frac1n, \qquad \Delta x = \frac{1}{\sqrt{n}} = \sqrt{\Delta t}.
\]
Fix t > 0. Consider the sequence of random variables {W^n_t}_{n=1}^∞. Denote by [nt] the largest integer no bigger than nt. We have, by (1.1) with i = [nt],
\[
W^n_t = \big([nt]+1 - nt\big)\, W^n_{[nt]/n} + \big(nt - [nt]\big)\, W^n_{([nt]+1)/n}
= W^n_{[nt]/n} + (nt - [nt]) \big( W^n_{([nt]+1)/n} - W^n_{[nt]/n} \big)
\]
\[
= \sum_{i=1}^{[nt]} \frac{X_i}{\sqrt{n}} + (nt - [nt]) \frac{X_{[nt]+1}}{\sqrt{n}}
= \sqrt{\frac{[nt]}{n}}\; \frac{\sum_{i=1}^{[nt]} X_i}{\sqrt{[nt]}} + (nt - [nt]) \frac{X_{[nt]+1}}{\sqrt{n}}.
\]
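The following sketch (assuming NumPy; names are ours) samples W^n_t for several values of n, dropping the small interpolation term: the sample mean stays near 0 and the sample variance approaches t, which hints at the limiting behaviour made precise next.

```python
import numpy as np

# Sketch (assuming NumPy): sample the scaled random walk W^n_t with dt = 1/n
# and dx = 1/sqrt(n), dropping the small (nt - [nt]) interpolation term.  The
# sum of [nt] independent +-1 steps is generated through the number of +1
# steps, which is Binomial([nt], 1/2).
rng = np.random.default_rng(2)

def scaled_walk_at(t, n, n_paths, rng):
    m = int(n * t)                                 # [nt] completed steps by time t
    heads = rng.binomial(m, 0.5, size=n_paths)     # number of +1 steps among m
    return (2 * heads - m) / np.sqrt(n)            # sum_i X_i / sqrt(n)

t = 2.0
for n in (10, 100, 10_000):
    w = scaled_walk_at(t, n, 200_000, rng)
    print(n, round(w.mean(), 3), round(w.var(), 3))   # mean ~ 0, variance ~ t = 2
```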

To consider the limit as n → ∞, we recall the celebrated central limit theorem.

If {X_i}_{i=1}^∞ is a sequence of i.i.d. (independent, identically distributed) random variables with mean µ and variance σ², then, in probability, the random variable
\[
\frac{\sum_{i=1}^{m} X_i - m\mu}{\sqrt{m\sigma^2}}
\]
approaches, as m → ∞, a random variable that is N(0,1) distributed.

Using the central limit theorem, we see that there exists a random variable B_t such that
\[
\lim_{n\to\infty} W^n_t = B_t \qquad \text{in probability}.
\]
Here "limit in probability" means
\[
\lim_{n\to\infty} \mathrm{Prob}\big( W^n_t \in (a,b) \big) = \mathrm{Prob}\big( B_t \in (a,b) \big) \qquad \forall\, (a,b) \subset \mathbb{R}.
\]
In addition, B_t is N(0, t) distributed.

The stochastic process {B_t}_{t≥0} has the following properties:

1. B_0 = 0;

2. For each t ≥ 0, B_t is an N(0, t) distributed random variable;

3. For each positive integer k and each 0 = t_0 < t_1 < t_2 < ... < t_k, the following random variables are independent:
\[
B_{t_1} - B_{t_0},\ B_{t_2} - B_{t_1},\ \dots,\ B_{t_k} - B_{t_{k-1}}.
\]

We shall henceforth call a stochastic process {B_t}_{t≥0} a Brownian motion if it has the above stated properties. We remark that the limit W^n_t → B_t is only in probability, not pointwise. Hence, it is a mathematical nightmare to see on what probability space (Ω, F, P) the Brownian motion is defined. For now we shall be content with what we have obtained.

1.3.2 A Fourier Series Representation of the Brownian Motion

Let (Ω, F, P) be a probability space on which there exists a sequence {ξ_n}_{n=0}^∞ of i.i.d. random variables which are N(0,1) distributed. Consider the stochastic process {B_t}_{t∈[0,π/2)} defined by
\[
B_t(\omega) = \frac{2}{\sqrt{\pi}} \sum_{n=0}^{\infty} \frac{\sin([2n+1]t)}{2n+1}\, \xi_n(\omega), \qquad \omega \in \Omega,\ t \in [0, \pi/2).
\]
One can show that the series converges for P-almost all ω ∈ Ω. We claim that {B_t}_{t∈[0,π/2)} is a Brownian motion on (Ω, F, P).

For this purpose, let 0 < t_1 < ... < t_m < π/2 be arbitrary times in (0, π/2). We consider the joint distribution of the m-dimensional random variable Z := (B_{t_1}, B_{t_2}, ..., B_{t_m}). We want to show that Z has a Gaussian distribution with mean (0, ..., 0) and covariance matrix
\[
\mathrm{Cov}(B_{t_i}, B_{t_j}) := E[(B_{t_i} - 0)(B_{t_j} - 0)] = \min\{t_i, t_j\}.
\]
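Before turning to the proof, here is a quick numerical sanity check of this claim (a sketch assuming NumPy; the truncation level and sample size are arbitrary choices of ours): the truncated series has sample variance close to t and sample covariance close to min{s, t}.

```python
import numpy as np

# Sketch (assuming NumPy): truncate the series
#   B_t = (2/sqrt(pi)) * sum_n xi_n * sin((2n+1)t)/(2n+1)
# and check Var(B_t) ~ t and Cov(B_s, B_t) ~ min(s, t) for s, t in (0, pi/2).
rng = np.random.default_rng(3)
n_terms, n_paths = 500, 20_000

xi = rng.standard_normal(size=(n_paths, n_terms))      # i.i.d. N(0,1) coefficients
n = np.arange(n_terms)
coef = 2.0 / (np.sqrt(np.pi) * (2 * n + 1))

def B(t):
    return xi @ (coef * np.sin((2 * n + 1) * t))        # one sample of B_t per row

s, t = 0.4, 1.1
Bs, Bt = B(s), B(t)
print(Bt.var(), t)                    # sample variance ~ t
print(np.mean(Bs * Bt), min(s, t))    # sample covariance ~ min(s, t)
```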

Suppose this is proven. Then (i) Var(B_t) = t for all t ∈ (0, π/2), so as t ↓ 0, B_t → 0 = B_0 in measure; (ii) B_t is N(0, t) distributed; (iii) B_{t_1}, B_{t_2} − B_{t_1}, ..., B_{t_m} − B_{t_{m−1}} are independent. Thus {B_t}_{t∈[0,π/2)} is a Brownian motion. Indeed, an alternative definition for a stochastic process {B_t}_{t≥0} to be a Brownian motion is: for every 0 < t_1 < ... < t_m, the joint distribution of (B_{t_1}, ..., B_{t_m}) is Gaussian with mean vector (0, ..., 0) and covariance matrix (min{t_i, t_j})_{m×m}.

To show that Z = (B_{t_1}, ..., B_{t_m}) is Gaussian distributed, we need only show that the characteristic function ψ(λ_1, ..., λ_m) of Z is the characteristic function of the required Gaussian distribution. Here the characteristic function of a random variable is defined as the Fourier transform of its density function. Thus, if we let ρ(x_1, ..., x_m) be the joint density of Z, then the characteristic function ψ(λ_1, ..., λ_m) of Z is
\[
\psi(\lambda_1, \dots, \lambda_m)
:= \int_{\mathbb{R}^m} e^{\,i \sum_{j=1}^m \lambda_j x_j}\, \rho(x_1, \dots, x_m)\, dx_1 \cdots dx_m
= E\Big[ e^{\,i \sum_{j=1}^m \lambda_j B_{t_j}} \Big]
= E\Big[ \prod_{n=0}^{\infty} \exp\Big( i\, \xi_n \sum_{j=1}^{m} \frac{2\lambda_j \sin([2n+1]t_j)}{\sqrt{\pi}\,(2n+1)} \Big) \Big]
= \prod_{n=0}^{\infty} E\Big[ \exp\Big( i\, \xi_n \sum_{j=1}^{m} \frac{2\lambda_j \sin([2n+1]t_j)}{\sqrt{\pi}\,(2n+1)} \Big) \Big],
\]
since all ξ_0, ξ_1, ... are independent random variables. Since ξ_n is N(0,1) distributed, we have
\[
E[e^{\,i\lambda \xi_n}] = \frac{1}{\sqrt{2\pi}} \int_{\mathbb{R}} e^{\,i\lambda x - x^2/2}\, dx = e^{-\lambda^2/2} \qquad \forall\, \lambda \in \mathbb{C}.
\]
It then follows that
\[
\psi(\lambda_1, \dots, \lambda_m)
= \prod_{n=0}^{\infty} \exp\Big( -\frac12 \Big( \sum_{j=1}^{m} \frac{2\lambda_j \sin([2n+1]t_j)}{\sqrt{\pi}\,(2n+1)} \Big)^{\!2} \Big)
= \exp\Big( -\frac12 \sum_{i,j=1}^{m} \sum_{n=0}^{\infty} \frac{4\lambda_i\lambda_j \sin([2n+1]t_i)\,\sin([2n+1]t_j)}{\pi (2n+1)^2} \Big)
= \exp\Big( -\frac12 \sum_{i,j=1}^{m} \sigma_{ij}\, \lambda_i \lambda_j \Big),
\]
where
\[
\sigma_{ij} := \frac{4}{\pi} \sum_{n=0}^{\infty} \frac{\sin([2n+1]t_i)\,\sin([2n+1]t_j)}{(2n+1)^2}
= \frac{2}{\pi} \sum_{n=0}^{\infty} \frac{1 - \cos([2n+1][t_i + t_j])}{(2n+1)^2}
- \frac{2}{\pi} \sum_{n=0}^{\infty} \frac{1 - \cos([2n+1][t_i - t_j])}{(2n+1)^2}
= \frac12 |t_i + t_j| - \frac12 |t_i - t_j| = \min\{t_i, t_j\}.
\]

Here, in the last equation, we have used the identity
\[
|t| = \frac{4}{\pi} \sum_{n=0}^{\infty} \frac{1 - \cos([2n+1]t)}{(2n+1)^2} \qquad \forall\, t \in (-\pi, \pi).
\]
To see this, we first use the Fourier sine series expansion of the constant function 1 to obtain
\[
1 = \frac{4}{\pi} \sum_{n=0}^{\infty} \frac{\sin([2n+1]s)}{2n+1} \qquad \forall\, s \in (0, \pi).
\]
Integrating both sides over s ∈ (0, t) then gives the required identity.

1.3.3 Brownian Motion on the Space of Continuous Functions

Quite often people use "Wiener process" as a synonym for Brownian motion. Though the Wiener process and Brownian motion are the same thing, deep mathematical theories are associated with the Wiener process. Here we introduce two important aspects of the Wiener process: continuous sample paths and filtration.

For each fixed ω ∈ Ω, the function t ∈ [0, ∞) ↦ B_t(ω) is called a sample path. Ideally, one would like to build a stochastic process upon sample paths. Hence we would like, if possible, to relate our probability space (Ω, F, P) to the collection of sample paths.

It is said that Brownian motion has continuous sample paths; this means that for almost all ω ∈ Ω, the function t ∈ [0, ∞) ↦ B_t(ω) is continuous in t. If we delete that measure-zero set from Ω, then every sample path is continuous.

In our definition, E[·] is understood as the expectation derived from the measure space (Ω, F, P) on which each individual random variable B_t(·) : Ω → R is defined. One fundamental question is the very existence of such a probability space (Ω, F, P) for which each sample path t ↦ B_t(ω) is continuous. Clearly, a natural choice is to take Ω directly as the space of continuous functions. Here we briefly describe how a Wiener process can be built upon such a space.

Let T > 0 be a fixed time. The Brownian motion can be defined on the space of continuous functions Ω_T := C([0,T]). Here C([0,T]) = C([0,T]; R) denotes the set of all continuous functions from [0,T] to R. For each ω ∈ Ω_T, the value B_t(ω) is defined as
\[
B_t(\omega) = \omega(t) \qquad \forall\, \omega \in \Omega_T,\ t \in [0,T]. \qquad (1.2)
\]
One can show that there is a (minimal) σ-algebra F_T on Ω_T and a probability measure P_T on (Ω_T, F_T) such that the so-defined {B_t}_{t∈[0,T]} is a Brownian motion on (Ω_T, F_T, P_T). In this definition, we see that for each ω ∈ Ω_T, the function t ∈ [0,T] ↦ B_t(ω) is indeed exactly the function t ∈ [0,T] ↦ ω(t). Since ω ∈ Ω_T is continuous, every Brownian motion sample path
\[
\{(t, B_t(\omega))\}_{0\le t\le T} = \{(t, \omega(t))\}_{0\le t\le T}
\]
is a continuous curve in the time-space coordinate system.

We remark that C([0,T]) is a Banach space under the norm
\[
\|\omega\| = \max_{t\in[0,T]} |\omega(t)| \qquad \forall\, \omega \in \Omega_T = C([0,T]).
\]

We know that a Banach space admits a default σ-algebra — the Borel σ-algebra, namely the σ-algebra generated by all open sets. The σ-algebra F_T for the Brownian motion is indeed this default Borel σ-algebra. Thus, under the above norm, every open set is in F_T.

Consider the sets
\[
A := \{\omega \in C([0,T]) \mid \omega(0) = 0\}, \qquad B := \{\omega \in C([0,T]) \mid \omega(0) \neq 0\}.
\]
Then A is a closed set and B is an open set. From the properties of Brownian motion, we see that
\[
P_T(A) = 1, \qquad P_T(B) = 0.
\]

One can show more regularity (smoothness) of Brownian motion paths. For this, we introduce the Hölder space. For each α ∈ (0, 1], we define the norm
\[
\|\omega\|_{C^\alpha([0,T])} = \max_{t\in[0,T]} |\omega(t)| + \sup_{0\le s<t\le T} \frac{|\omega(t) - \omega(s)|}{|t-s|^\alpha} \qquad \forall\, \omega \in C([0,T]).
\]
We set
\[
C^\alpha([0,T]) := \{\omega \in C([0,T]) \mid \|\omega\|_{C^\alpha([0,T])} < \infty\}.
\]
One can show that C^α([0,T]) is a closed set in C([0,T]) and is F_T measurable. More importantly, we have the following zero-one law.

Almost every sample path of the Brownian motion is Hölder continuous with exponent α ∈ [0, 1/2); namely,
\[
P_T\big(C^\alpha([0,T])\big) = 1 \qquad \forall\, \alpha \in [0, 1/2).
\]
For every α ∈ (1/2, 1], almost no Brownian motion sample path is Hölder continuous with exponent α; namely,
\[
P_T\big(C^\alpha([0,T])\big) = 0 \qquad \forall\, \alpha \in (1/2, 1].
\]

Since C¹([0,T]) is traditionally used to denote the space of continuously differentiable functions, the Hölder space with exponent α = 1 is denoted by C^{0,1}([0,T]), which is indeed the space of all Lipschitz continuous functions. Clearly, a differentiable function is Hölder continuous for every exponent α ∈ [0,1]. The above statement therefore shows that almost no Brownian motion sample path is everywhere differentiable. Indeed, one can show that, except for a set of measure zero, every Brownian motion sample path is nowhere differentiable.

1.3.4 Filtration

Now we set Ω = C([0, ∞)). For each ω ∈ Ω, let ω|_{[0,T]} be the restriction of ω to [0,T]; then ω ↦ ω|_{[0,T]} is a projection from Ω to Ω_T. This projection allows us to lift F_T to a σ-algebra on Ω and to lift P_T to a probability measure P on Ω, as follows.

For each set A in F_T, we define a cylinder c_T(A) in Ω by
\[
c_T(A) = \{\omega \in \Omega \mid \omega|_{[0,T]} \in A\}, \qquad A \in \mathcal{F}_T.
\]
Also, writing 𝓕̃_T for the lifted σ-algebra (to distinguish it from F_T on Ω_T), we define
\[
\tilde{\mathcal F}_T := \{ c_T(A) \mid A \in \mathcal{F}_T \} \quad \forall\, T \ge 0, \qquad
P(c_T(A)) := P_T(A) \quad \forall\, A \in \mathcal{F}_T,\ T \ge 0.
\]
Then 𝓕̃_T is a σ-algebra on Ω and P is a probability measure on (Ω, 𝓕̃_T). It is easy to see that (i) 𝓕̃_s ⊂ 𝓕̃_t if 0 < s < t; (ii) B_t is 𝓕̃_t measurable; (iii) P is a probability measure on (Ω, 𝓕̃_t) for every t ≥ 0. Finally, we define
\[
\mathcal{F} = \bigcup_{t\ge 0} \tilde{\mathcal F}_t.
\]
We then have a Wiener process {B_t}_{t≥0} defined on (Ω, F, P). For this we introduce the following important concept.

A filtration on a set Ω is a family {F_t}_{t≥0} of σ-algebras on Ω such that
\[
\mathcal{F}_s \subset \mathcal{F}_t \qquad \forall\, 0 \le s < t.
\]
A stochastic process {S_t}_{t≥0} on (Ω, F, P) is called adapted to a filtration {F_t}_{t≥0} if F = ∪_{t≥0} F_t and, for each t ≥ 0, S_t is a random variable on the probability space (Ω, F_t, P).

1.3.5 Vector-Valued Brownian Motion

If {B_t} is a standard Brownian motion and a ∈ R, {a + B_t} is often called Brownian motion started at a.

If B¹, ..., Bⁿ are n independent standard Brownian motions, then the process B_t = (B¹_t, ..., Bⁿ_t) is called an n-dimensional Brownian motion, which can be realized on C(T; Rⁿ). If A = (a_{ki})_{n×m} is a matrix, then the process B̂_t = B_t A = (B̂¹_t, ..., B̂ᵐ_t) is a general m-dimensional Brownian motion, where
\[
\hat{B}^i_t = \sum_{k=1}^{n} B^k_t\, a_{ki}, \qquad i = 1, \dots, m.
\]
Note that the covariance
\[
\sigma_{ij} := \frac1t\, \mathrm{Cov}\big( \hat{B}^i_t, \hat{B}^j_t \big)
= \frac1t \sum_{k=1}^{n} \sum_{l=1}^{n} a_{ki}\, a_{lj}\, \mathrm{Cov}\big( B^k_t, B^l_t \big)
= \sum_{k=1}^{n} \sum_{l=1}^{n} a_{ki}\, a_{lj}\, \delta_{kl}
= \sum_{k=1}^{n} a_{ki}\, a_{kj} = (A^T A)_{ij},
\]
where A^T is the transpose of A and (A^T A)_{ij} denotes the entry in the ith row and jth column. Hence we say {B̂_t} is an m-dimensional Brownian motion with covariance matrix C = (σ_{ij})_{m×m} = A^T A. Every vector-valued Brownian motion is characterized by its mean vector and covariance matrix.

Many books and countless research papers have been written about Brownian motion. A few introductory references are (in increasing order of difficulty) [8], [22], and [14].
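A small simulation of the last point (a sketch assuming NumPy; the matrix A below is just an arbitrary example of ours): the increments of B̂_t = B_t A have sample covariance close to (A^T A)Δt.

```python
import numpy as np

# Sketch (assuming NumPy): increments of the m-dimensional Brownian motion
# hat{B}_t = B_t A, built from n independent standard Brownian motions,
# have covariance (A^T A) * dt.
rng = np.random.default_rng(4)
n, dt, n_samples = 3, 0.01, 500_000

A = np.array([[1.0, 0.5],
              [0.0, 2.0],
              [1.0, -1.0]])                                   # n x m loading matrix (example)

dB = rng.standard_normal(size=(n_samples, n)) * np.sqrt(dt)   # increments of B_t
dB_hat = dB @ A                                               # increments of hat{B}_t

print(dB_hat.T @ dB_hat / (n_samples * dt))                   # sample covariance / dt
print(A.T @ A)                                                # should be close to this
```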

Exercise 1.10. Suppose 0 < s < t. Show that Z = (B_t, B_s) has distribution density
\[
\rho(x,t;\, y,s) = \frac{e^{-(x-y)^2/[2(t-s)]}}{\sqrt{2\pi(t-s)}}\; \frac{e^{-y^2/(2s)}}{\sqrt{2\pi s}}.
\]

Exercise 1.11. From the basic properties of the Brownian motion listed in subsection 1.3.1, show that for every 0 < t_1 < ... < t_m, the random variable Z := (B_{t_1}, ..., B_{t_m}) is Gaussian distributed with mean (0, ..., 0) and covariance matrix (min{t_i, t_j})_{m×m}. Hint: it is equivalent to show that the characteristic function ψ(λ) := E(e^{iλ·Z}) is given by
\[
\psi(\lambda_1, \dots, \lambda_m) = e^{-\frac12 \sum_{i,j=1}^{m} \sigma_{ij}\lambda_i\lambda_j}, \qquad \sigma_{ij} = \min\{t_i, t_j\}.
\]

Exercise 1.12. Suppose Z is a standard normally distributed random variable. Consider the process {X_t}_{t≥0} defined by X_t = √t Z. Show that for each t ≥ 0, X_t is N(0, t) distributed. Explain why {X_t}_{t≥0} is not a Brownian motion.

Exercise 1.13. Suppose {B_t}_{t≥0} is a Brownian motion. (i) Show that for each t_2 > t_1 ≥ 0, B_{t_2} − B_{t_1} is N(0, t_2 − t_1) distributed. Also show that for each t_0 ≥ 0, the stochastic process {B_{t+t_0} − B_{t_0}}_{t≥0} is also a Brownian motion. (ii) Show that for each a > 0 and t > 0, (1/√a) B_{at} is N(0, t) distributed. Defining B̃_t := (1/√a) B_{at}, show that {B̃_t}_{t≥0} is a Brownian motion. (iii) Are {B_t} and {B̃_t} independent?

Exercise 1.14. Suppose {B¹_t}_{t≥0} and {B²_t}_{t≥0} are two independent Brownian motions and θ ∈ [0, 2π] is a constant. Show that {B¹_t cos θ + B²_t sin θ}_{t≥0} is a Brownian motion.

Exercise 1.15. For the Brownian motion process {B_t}, show that (dB)² = dt in the sense that
\[
\lim_{h\searrow 0} P\Big( \big| (B_{t+h} - B_t)^2 - h \big| > \varepsilon \Big) = 0 \qquad \forall\, \varepsilon > 0.
\]
[Calculate E([(B_{t+h} − B_t)² − h]²), where B_{t+h} − B_t has distribution density (2πh)^{-1/2} e^{-x²/(2h)}.]

Exercise 1.16. Show that lim_{t↓0} B_t = 0 in measure; that is, show that
\[
\lim_{t\searrow 0} P(|B_t| > \varepsilon) = 0 \qquad \forall\, \varepsilon > 0.
\]
[Use the inequality E[|B_t|] ≥ ε P(|B_t| ≥ ε) and calculate E[|B_t|] using the fact that B_t is N(0, t) distributed.]

Exercise 1.17. Show that for every β > 0, there exist α > 0 and C > 0 such that
\[
\sup_{t>s\ge 0} \frac{E[|B_t - B_s|^{\beta}]}{|t-s|^{\alpha}} \le C.
\]
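A quick Monte Carlo illustration of the claim in Exercise 1.15 (a sketch assuming NumPy; the tolerance ε and the values of h are arbitrary choices of ours): since B_{t+h} − B_t is N(0, h), both E[((B_{t+h} − B_t)² − h)²] = 2h² and the probability in question shrink as h ↓ 0.

```python
import numpy as np

# Sketch (assuming NumPy): (B_{t+h} - B_t)^2 concentrates at h as h -> 0.
rng = np.random.default_rng(5)
eps = 0.1
for h in (1.0, 0.1, 0.01):
    dB = rng.standard_normal(500_000) * np.sqrt(h)     # B_{t+h} - B_t ~ N(0, h)
    prob = np.mean(np.abs(dB**2 - h) > eps)            # P(|(dB)^2 - h| > eps)
    second_moment = np.mean((dB**2 - h)**2)            # should be close to 2 h^2
    print(h, prob, second_moment, 2 * h**2)
```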

1.4 Filtration and Martingale

1.4.1 Filtration

We recall the following.

A filtration on a state space Ω is a collection {F_t}_{t≥0} of σ-algebras on Ω that is indexed by time and satisfies F_s ⊂ F_t for all s < t. A stochastic process {X_t}_{t≥0} defined on (Ω, F, P) is called adapted to a filtration {F_t}_{t≥0} if F = ∪_{t≥0} F_t and, for each t ≥ 0, X_t(·) is a random variable on the probability space (Ω, F_t, P).

Note that if P is a probability measure on each F_t (= ∪_{0≤τ≤t} F_τ), then by setting F := ∪_{t≥0} F_t every needed probability (involving finite time) can be calculated. Namely, it is not necessary to extend P from F to σ(F) (the smallest σ-algebra containing F), though this can be done (since F is an algebra).

Given a stochastic process {S_t}_{t∈[0,T)} (T ∈ (0, ∞]) on a probability space (Ω, F, P), we can define a filtration {F_t} as follows: for each t ≥ 0, F_t is the smallest σ-algebra on Ω such that each S_s, s ∈ [0, t], is F_t measurable. It is easy to see that {F_t}_{t∈[0,T)} is a filtration and {S_t}_{t∈[0,T)} is adapted to {F_t}_{t∈[0,T)}. Such a filtration is called the natural filtration of the process. In the sequel, all Wiener processes are assumed to be adapted to their natural filtrations.

1.4.2 Filtration, Partition, and Information

A filtration is usually taken to represent the flow of information; that is, F_t is used to accommodate the degree of detail of the information that can be revealed up to time t. In applications, F_t is designed so that it cannot resolve any more detailed information revealed after time t.

Consider the situation where the state space Ω is finite. Then a σ-algebra F can be characterized by its atoms; here a set A ∈ F is called an atom if A does not contain any proper non-empty subset belonging to F, i.e. if B ⊂ A and B ∈ F, then either B = A or B = ∅. Thus, on a finite state space, there is a one-to-one correspondence between σ-algebras and partitions. The finer the partition, the smaller the atoms, and the larger the σ-algebra. Hence, if F_s ⊂ F_t, then F_s has a coarser partition than F_t; in other words, F_t has a finer partition than F_s. Taking the example of locating a person's address, a coarse partition only resolves detail up to, say, the city, whereas a finer partition may have a resolution up to the street, or even the street number.

We work out a few examples.

1. Consider the game of tossing coins. Let N be the total number of coin tosses. Denote a head by +1 and a tail by −1. We set
\[
\Omega := \{1,-1\}^N, \qquad \Omega_i := \{1,-1\}^i, \quad i = 1, \dots, N,
\]
\[
\mathcal{F}_i := \big\{ A \times \{-1,1\}^{N-i} \;\big|\; A \subset \Omega_i \big\},
\]
\[
c(x_1, \dots, x_i) := \{(x_1,\dots,x_i)\} \times \{-1,1\}^{N-i}
= \{ (x_1, \dots, x_i, y_{i+1}, \dots, y_N) \mid y_j \in \{1,-1\}\ \forall\, j \ge i+1 \}, \qquad (x_1, \dots, x_i) \in \Omega_i.
\]
Here we use the convention that the factor {−1,1}^{N−i} is dropped when i = N, so that F_N = 2^Ω and each F_i is a σ-algebra. One can show that {F_i}_{i=1}^N is a filtration on Ω for T := {1, ..., N}.

Note that for each (x_1, ..., x_i) ∈ Ω_i, the cylinder c(x_1, ..., x_i) is an atom of F_i; that is to say, if B is a proper subset of c(x_1, ..., x_i) and B ∈ F_i, then B = ∅.

Now suppose a function X is F_i measurable. Then X(x_1, x_2, ..., x_i, y_{i+1}, ..., y_N) is independent of y_j for every j ≥ i+1. Indeed, since c(x_1, ..., x_i) is an atom of F_i and X is F_i measurable, X has to be a constant function on the set c(x_1, ..., x_i). Hence, if X is an F_i measurable function, then there exists a function X̃ defined on Ω_i such that
\[
X(x_1, \dots, x_N) = \tilde{X}(x_1, \dots, x_i) \qquad \forall\, (x_1, \dots, x_N) \in \Omega.
\]

Now suppose X is a random variable on (Ω, F_N) and, for each sequence of outcomes (x_1, ..., x_N) ∈ Ω, X(x_1, ..., x_N) represents the award that one can collect from a particular gambling contract. If X is F_i measurable, then one knows the award after the ith toss of the coin; namely, there is no need to wait for the results of all N coin tosses to find out the award. This is so since X(x_1, ..., x_N) = X̃(x_1, ..., x_i); as long as the results x_1, ..., x_i of the first i coin tosses are known, the value X(x_1, ..., x_N) is also known. For example, suppose X represents an award of $8.00 for three consecutive heads on the first three tosses, and a loss of $1.00 otherwise. Then as soon as the results of the first three coin tosses are revealed, the game with agreement X can be considered finished, since the payment is clear. On the other hand, if X is not F_{N−1} measurable, for example
\[
X(x_1, \dots, x_N) = \prod_{i=1}^{N} (2x_i - 1),
\]
then one has to wait for the result of the final toss before knowing the exact amount of the award.

In summary, for this example, saying that X : Ω → R is F_i measurable means that there exists a function X̃ : Ω_i → R such that X(x_1, ..., x_N) = X̃(x_1, ..., x_i) for all (x_1, ..., x_N) ∈ Ω; that is, X does not depend on the variables (x_{i+1}, ..., x_N).

2. Consider the Brownian motion {B_t}_{t≥0} adapted to the filtration {F_t}_{t≥0} that we defined in the earlier section. Given an event z ∈ Ω = C([0, ∞); R), we have B_t(z) = z(t); thus, no matter what the value z(t+h) is, as long as z(t) = x we always have B_t(z) = x. On the other hand, suppose we divide the set A := {z ∈ Ω | z(t) = x} into two sets:
\[
A_1 := \{z \in \Omega \mid z(t) = x,\ z(t+h) > 0\}, \qquad A_2 := \{z \in \Omega \mid z(t) = x,\ z(t+h) \le 0\}.
\]
Then the distinction between A_1 and A_2 cannot be resolved by F_t. We have to use a finer σ-algebra F_{t+h} to categorize such information, which can only be known for sure on or after time t + h. Here, that A_1 is not F_t measurable means that it is a non-observable event at time t; that is, at time t it is impossible to observe an event which tells us that B_t = x and B_{t+h} > 0.

Now consider the stochastic process {B*_t}_{t≥0} defined by
\[
B^*_t(\omega) = \max_{0\le s\le t} B_s(\omega) \qquad \forall\, \omega \in \Omega.
\]
This is the running high of the Brownian motion. It is a stochastic process adapted to the same filtration {F_t}_{t≥0} as that of {B_t}. Clearly, for every t > 0 and every ω ∈ Ω = C([0, ∞)), we can always know the value B*_t(ω) from the restriction ω|_{[0,t]}.

Next consider the random variable
\[
X(\omega) = \int_0^1 B_s(\omega)\, ds \qquad \forall\, \omega \in \Omega.
\]

This random variable is F_1 measurable, but not F_s measurable for any s < 1. That is to say, the outcome of X can be calculated only after time t = 1. For example, at time t < 1 it is impossible to observe an event which tells us that X < 0. Of course, as a random variable, the expectation, variance, etc. of X can be evaluated by using the finest σ-algebra F.

The filtration is vital in studying stopping times. For example, consider the first time that the Brownian motion has a running high equal to 1:
\[
\tau(\omega) := \inf\{ t > 0 \mid B_t(\omega) \ge 1 \} = \sup\{ t > 0 \mid B_s(\omega) < 1\ \forall\, s \in [0,t] \} \qquad \forall\, \omega \in \Omega.
\]
The measurability of the function τ relies deeply on the filtration. For a natural filtration, τ is a stopping time in the sense that at each time t one knows whether τ ≤ t has happened or not, based on the knowledge of F_t.

3. Consider another example, the stock price of a company. We use a space Ω sufficiently large so as to contain all information (past, current, and future) that is needed to determine the stock price (past, current, and future). That is, for each ω ∈ Ω, the totality of financial conditions specified by ω provides a unique stock price S_t(ω) for all t ∈ T := [0, ∞). Suppose the current time is t = 1 and the stock price is S = 10. Then, in the language of probability, we say we have observed the event
\[
E_1 = \{\omega \in \Omega \mid S_1(\omega) = 10\}.
\]
Suppose we also know that at time t = 1/2 the stock price was S = 9. Then we say we have observed the event
\[
E_2 = \{\omega \in \Omega \mid S_1(\omega) = 10,\ S_{1/2}(\omega) = 9\},
\]
which could still be a fairly large set in Ω. As we see, the more information we have, the smaller the set E. Nevertheless, it is usually impossible to use all past information to pin down a future event. That is, even if we know the value of S_t for all t ≤ 1, it is impossible to know the value of S at t = 1.1, since, for example, the set E := {ω ∈ Ω | S_{1.1}(ω) = 11} is not in ∪_{t≤1} F_t, so it is impossible to observe such an event at time t ≤ 1 (e.g. to predict with certainty that S = 11 at time t = 1.1). The following definition may clarify some of our thoughts about filtration as information.

If the current time is t and an event E is in F_{t+h} \ F_t, then E is called a future event.

4. Let us go back to the coin tossing game. Suppose one starts with a total capital V_0 at time t = 0 and bets on the outcome of a coin toss at each integer time t_j = j. Suppose the current time is t = t_{j−1} and one has capital V_{j−1}. One places a bet b_{j−1} V_{j−1} on a head outcome and c_{j−1} V_{j−1} on a tail outcome, where b_{j−1} ≥ 0, c_{j−1} ≥ 0 and b_{j−1} + c_{j−1} ≤ 1. At time t_j = j a coin is tossed, and if the outcome is a head (denoted by X_j = 1) one collects the award b_{j−1} V_{j−1} from the bet on the head outcome and loses the penalty c_{j−1} V_{j−1} from the bet on the tail, so at time t = t_j one has capital
\[
V_j = V_{j-1}\big\{ 1 + (b_{j-1} - c_{j-1})\, X_j \big\} = V_0 \prod_{i=1}^{j} \big\{ 1 + [b_{i-1} - c_{i-1}]\, X_i \big\}.
\]
Similarly, if the outcome is a tail (denoted by X_j = −1), one collects the award c_{j−1} V_{j−1} from the bet on tail and loses b_{j−1} V_{j−1} from the bet on head; one still has the same formula for V_j. Now the question is how to maximize the final capital V_N, say at N = 10. The capital V_{10} depends on the various betting strategies and on the outcomes of the coin tosses. For a fair game, a central restriction is that both b_{j−1} and c_{j−1} can depend only on V_0, V_1, ..., V_{j−1} and on X_1, ..., X_{j−1}, i.e., on known information. It would be regarded as cheating if

b_{j−1} or c_{j−1} were to depend on X_j, since that would mean one makes bets at time t_{j−1} with knowledge of the outcome X_j at time t_j. In the language of filtration, the restriction means that both b_{j−1} and c_{j−1} have to be F_{j−1} measurable. In terms of the information tree, it means that one can only use the past and current information ∪_{i≤j−1} F_i = F_{j−1} to make a bet (investment) at time t_{j−1} = j − 1.

Since V_1, ..., V_{j−1} are functions of V_0, b_0, c_0, ..., b_{j−2}, c_{j−2}, X_1, ..., X_{j−1}, a game strategy is represented by a set of functions
\[
b_{j-1} = b_{j-1}(V_0, x_1, \dots, x_{j-1}), \qquad c_{j-1} = c_{j-1}(V_0, x_1, \dots, x_{j-1}), \qquad j = 1, \dots, N,
\]
such that, upon the outcomes X_1 = x_1, ..., X_{j−1} = x_{j−1}, one bets the portion b_{j−1}(V_0, x_1, ..., x_{j−1}) of the total capital V_{j−1} on heads and the portion c_{j−1}(V_0, x_1, ..., x_{j−1}) of the total capital on tails. Although b_{j−1} could also depend on {b_0, ..., b_{j−2}, c_0, ..., c_{j−2}}, the above strategy covers this case, since by induction each b_i, c_i, for every i ≤ j−2, is a function of V_0, x_1, ..., x_i. Given a target, say maximizing the expectation of V_N, the optimal strategy is a game strategy of the form above such that the expected value of V_N is maximized.

1.4.3 Conditional Probability

1. We recall that if A and B are two measurable sets of a probability space (Ω, F, P), then the conditional probability that the outcome is in A knowing that the outcome is in B is defined as
\[
P(A \mid B) := \frac{P(A \cap B)}{P(B)}.
\]
In the sequel, for a Borel set A in R and a random variable X on (Ω, F, P), the set à := {ω ∈ Ω | X(ω) ∈ A} will simply be written as {X ∈ A}, or simply X ∈ A. Hence, given Borel sets A and B of R, we can define
\[
P(X \in A \mid X \in B) = P(\tilde A \mid \tilde B), \qquad \text{where } \tilde A := \{\omega \in \Omega \mid X(\omega) \in A\},\ \tilde B := \{\omega \in \Omega \mid X(\omega) \in B\}.
\]
In particular, if ρ(x) is the density function of X, then
\[
P(\tilde A \cap \tilde B) = P(X \in A \cap B) = \int_{A\cap B} \rho(x)\, dx, \qquad P(\tilde B) = P(X \in B) = \int_{B} \rho(x)\, dx,
\]
so
\[
P(\tilde A \mid \tilde B) = P(X \in A \mid X \in B) = \frac{\int_{A\cap B} \rho(x)\, dx}{\int_{B} \rho(x)\, dx}.
\]

2. Let us consider an application of conditional probability to the Brownian motion {B_t}_{t≥0}. Let 0 ≤ s < t, let A be a Borel set of R, and let y ∈ R. We want to define the conditional probability P(B_t ∈ A | B_s = y), i.e. the probability that B_t ∈ A under the condition that B_s = y. Since P(B_s = y) = 0, we cannot directly define such a conditional probability by our definition. Nevertheless, for the Brownian motion, we can define it through a limiting process:
\[
P(B_t \in A \mid B_s = y) := P\big(B_t \in A \mid B_s \in (y - dy, y + dy)\big)
:= \lim_{\Delta y \searrow 0} P\big(B_t \in A \mid B_s \in (y - \Delta y, y + \Delta y)\big).
\]

Since the density of the joint distribution of (B_t, B_s) is
\[
\rho(x,t;\, y,s) = \frac{e^{-(x-y)^2/[2(t-s)]}\; e^{-y^2/[2s]}}{2\pi\sqrt{(t-s)\,s}},
\]
we find that
\[
P(B_t \in A \mid B_s = y)
= \lim_{\Delta y \to 0} \frac{\displaystyle\int_A dx \int_{y-\Delta y}^{y+\Delta y} \frac{e^{-(x-z)^2/[2(t-s)]}\, e^{-z^2/[2s]}}{2\pi\sqrt{(t-s)\,s}}\, dz}
{\displaystyle\int_{y-\Delta y}^{y+\Delta y} \frac{e^{-z^2/[2s]}}{\sqrt{2\pi s}}\, dz}
= \int_A \rho_{(y,s)}(x,t)\, dx,
\]
where
\[
\rho_{(y,s)}(x,t) := \frac{e^{-(x-y)^2/(2[t-s])}}{\sqrt{2\pi[t-s]}}.
\]
If in particular A = (x, x + dx), then
\[
P\big(B_t \in (x, x+dx) \mid B_s = y\big) = \rho_{(y,s)}(x,t)\, dx.
\]
We call P(B_t ∈ A | B_s = y) the transition probability measure and ρ_{(y,s)}(x,t) the transition probability density.

3. Suppose X is a random variable on (Ω, F, P) and A is a measurable set. We denote by E[X | A] the conditional expectation of X given A, defined by
\[
E[X \mid A] := \int_\Omega X(\omega)\, P(d\omega \mid A) = \int_A X(\omega)\, \frac{P(d\omega)}{P(A)} = \frac{\int_A X(\omega)\, P(d\omega)}{P(A)}.
\]
When A has measure zero, we need to take a limiting process. For example, for the Brownian motion, when t > s,
\[
E[B_t \mid B_s = y] = \lim_{\Delta y \searrow 0} E\big[B_t \mid B_s \in (y - \Delta y, y + \Delta y)\big]
= \lim_{\Delta y \to 0} \frac{\displaystyle\int_{\mathbb{R}} x\, dx \int_{y-\Delta y}^{y+\Delta y} \frac{e^{-(x-z)^2/(2[t-s])}\, e^{-z^2/(2s)}}{2\pi\sqrt{(t-s)\,s}}\, dz}
{\displaystyle\int_{y-\Delta y}^{y+\Delta y} \frac{e^{-z^2/(2s)}}{\sqrt{2\pi s}}\, dz}
= \int_{\mathbb{R}} x\, \rho_{(y,s)}(x,t)\, dx = y \qquad \forall\, y \in \mathbb{R}. \qquad (1.3)
\]
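A Monte Carlo sanity check of the last two formulas (a sketch assuming NumPy; the numbers s, t, y and the interval A are arbitrary choices of ours): conditioning on B_s falling in a narrow window around y, the empirical law of B_t matches the transition density, and the conditional average of B_t is close to y.

```python
import numpy as np

# Sketch (assuming NumPy): check P(B_t in A | B_s ~ y) against the integral of
# the transition density rho_{(y,s)}(x,t) over A, and check E[B_t | B_s ~ y] ~ y.
rng = np.random.default_rng(6)
s, t, y, dy = 1.0, 2.0, 0.8, 0.02
a, b = 1.0, 2.0                                                # A = (a, b)

Bs = rng.standard_normal(2_000_000) * np.sqrt(s)               # B_s ~ N(0, s)
Bt = Bs + rng.standard_normal(2_000_000) * np.sqrt(t - s)      # independent increment

near_y = np.abs(Bs - y) < dy                                   # B_s in (y - dy, y + dy)
print(np.mean((Bt[near_y] > a) & (Bt[near_y] < b)))            # conditional probability
print(np.mean(Bt[near_y]), y)                                  # conditional mean ~ y

x = np.linspace(a, b, 2001)
rho = np.exp(-(x - y)**2 / (2 * (t - s))) / np.sqrt(2 * np.pi * (t - s))
print(np.trapz(rho, x))                                        # integral of rho over A
```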

4. Suppose X is a random variable on (Ω, F, P). Assume G is a σ-algebra on Ω satisfying G ⊂ F. We say that Y is the conditional expectation of X given G ⊂ F, written Y = E[X | G], if Y is G measurable and
\[
\int_A \big\{ X(\omega) - Y(\omega) \big\}\, P(d\omega) = 0 \qquad \forall\, A \in \mathcal{G}.
\]
Note that E[X − Y | G] = 0.

To illustrate the idea, we consider the following example:
\[
\Omega = \{-1,1\}^2, \qquad \mathcal{F} = 2^\Omega, \qquad P(\omega) = \tfrac14 \quad \forall\, \omega \in \Omega, \qquad
\mathcal{G} = \big\{ \varnothing,\ \{1\}\times\{-1,1\},\ \{-1\}\times\{-1,1\},\ \Omega \big\}.
\]
It is easy to see that (Ω, F, P) is a probability space. Also, G is a sub-σ-algebra of F on Ω. Now consider the functions
\[
X(x_1, x_2) := \sin(x_1) + x_1\{x_2 + (x_2)^2\}, \qquad Y(x_1, x_2) := \tilde{Y}(x_1) := \sin(x_1) + x_1.
\]
It is easy to see that X and Y are random variables on (Ω, F). Moreover, Y is G measurable. It is easy to verify that ∫_A X dP = ∫_A Y dP for every A ∈ G, so that Y = E[X | G]. For example, for each x_1 ∈ {−1, 1}, A = {x_1} × {−1,1} ∈ G is an atom of G, so
\[
E[Y \mid A] = \frac{\int_A Y(\omega)\, P(d\omega)}{P(A)} = \tilde{Y}(x_1).
\]
On the other hand,
\[
E[X \mid A] = \frac{\tfrac14\{\sin(x_1) + x_1[1 + 1^2]\} + \tfrac14\{\sin(x_1) + x_1[-1 + (-1)^2]\}}{1/2}
= \sin(x_1) + x_1 = \tilde{Y}(x_1) = Y(x_1, x_2) \qquad \forall\, x_2 \in \{-1,1\}.
\]

In general, for the coin tossing example, a random variable X is a function of (x_1, ..., x_N), and the conditional expectation E[X | F_i] is a function of (x_1, ..., x_i).

Note that, by definition, if Y is G measurable, then Y = E[Y | G].

For the Brownian motion we have a simple expression for the conditional expectation. From the identity (1.3), we see that
\[
E[B_t \mid \mathcal{F}_s] = B_s \qquad \forall\, s \ge 0,\ t \ge s.
\]

Finally, we record the following equivalent conditions to illustrate the idea. Assume that X is a random variable on (Ω, F, P) and Y is G measurable, where G is a sub-σ-algebra of F. Assume that both X and Y have nice density functions. Then
\[
Y = E[X \mid \mathcal{G}]
\iff \int_A \big\{ Y(\omega) - X(\omega) \big\}\, P(d\omega) = 0 \quad \forall\, A \in \mathcal{G}
\iff E[X - Y \mid Y = y] = 0 \quad \forall\, y \in \mathbb{R}
\iff E[X \mid Y = y] = y \quad \forall\, y \in \mathbb{R}.
\]
Here we point out that for every y ∈ R, Y^{-1}((y − dy, y + dy)) is a measurable set in G.
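The small finite example above can be checked by brute force (a sketch assuming NumPy; the helper functions are ours): average X over each atom of G and compare with Y.

```python
import numpy as np

# Sketch (assuming NumPy): Omega = {-1,1}^2 with uniform P, G generated by the
# first coordinate.  E[X | G] is obtained by averaging X over each atom of G.
def X(x1, x2):
    return np.sin(x1) + x1 * (x2 + x2**2)

def Y(x1, x2):                              # the claimed conditional expectation
    return np.sin(x1) + x1

for x1 in (-1, 1):                          # atoms of G: {x1} x {-1, 1}
    atom = [(x1, x2) for x2 in (-1, 1)]
    avg = np.mean([X(*w) for w in atom])    # E[X | atom], since P is uniform
    print(x1, avg, Y(x1, 1))                # the two values agree
```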

1.4.4 Martingale

Brownian motion is the canonical example of many classical stochastic processes. Here we introduce a few of them.

A stochastic process {X_t}_{t≥0} on (Ω, F, P) is a martingale with respect to an adapted filtration {F_t}_{t≥0} if, for each t ≥ 0, E(|X_t|) < ∞ and
\[
E[X_{t+h} \mid \mathcal{F}_t] = X_t \quad \big(\text{i.e. } E[X_{t+h} \mid X_t = x] = x \text{ a.s.}\big) \qquad \forall\, h > 0,\ t \ge 0.
\]

A process {S_t}_{t≥0} is said to have stationary increments if, for every h > 0, the distribution of S_{t+h} − S_t is independent of t ≥ 0.

A Markov process is a stochastic process {X_t}_{t≥0} on some probability space (Ω, F, P) such that
\[
P\big( X_t \in A \mid X_{s_1} \in A_1, \dots, X_{s_n} \in A_n \big) = P\big( X_t \in A \mid X_{s_n} \in A_n \big)
\qquad \forall\, n \in \mathbb{N},\ s_1 < \dots < s_n < t,\ A, A_1, \dots, A_n \in \mathcal{B}.
\]

From (1.3) we can see that Brownian motion is a martingale. Also, for the Brownian motion, B_{t+h} − B_t has the N(0, h) distribution, which is independent of t ≥ 0, so Brownian motion has stationary increments.

The condition for a Markov process means that X_t depends only on the most recent history. Namely, the probability distribution of X_t under the conditions X_{s_1} ∈ A_1, ..., X_{s_n} ∈ A_n is the same as the probability distribution of X_t under the sole condition X_{s_n} ∈ A_n, provided that s_1, ..., s_{n−1} < s_n. A random walk, for example, is a (discrete) Markov process, since the position W_{[i+1]Δt} depends only on the most recent history W_{iΔt}. Namely, under the condition W_{iΔt} = x, we have W_{[i+1]Δt} = x + X_{i+1} Δx, which has nothing to do with how the position W_{iΔt} = x was reached from the past walks.

For the Brownian motion, one can show the following:
\[
P(B_t \in A \mid B_{s_1} = x_1, \dots, B_{s_n} = x_n) = P(B_t \in A \mid B_{s_n} = x_n)
\qquad \forall\, 0 \le s_1 < \dots < s_n < t,\ x_1, \dots, x_n \in \mathbb{R},\ A \in \mathcal{B}. \qquad (1.4)
\]
Hence, Brownian motion is a Markov process.

A Markov process is determined essentially by the transition probabilities
\[
p_{(y,s)}(A, t) := P\big( X_t \in A \mid X_s = y \big), \qquad y \in \mathbb{R},\ s < t,\ A \in \mathcal{B}.
\]
Let us regard X_t as the position of a particle at time t. Knowing that the particle is at position y at time s, the probability that the particle is in A at time t > s is p_{(y,s)}(A, t). It has nothing to do with where the particle was at any time before s. From here we see that it is history independent.

The transition probability must satisfy the Chapman-Kolmogorov equation
\[
p_{(y,s)}(A; t) = \int_{\mathbb{R}} p_{(y,s)}\big((z, z+dz), \tau\big)\, p_{(z,\tau)}(A; t) \qquad \forall\, s < \tau < t. \qquad (1.5)
\]
This equation says that to enter A at time t from position y at time s, the particle has to appear in some interval (z, z + dz) at time τ, with probability p_{(y,s)}((z, z+dz), τ), and then from there enter A at time t, with probability p_{(z,τ)}(A, t).

Conversely, let {p_{(y,s)}(·; t)}_{y∈R, t>s≥0} be a collection of probabilities satisfying the Chapman-Kolmogorov equation (1.5), and in addition assume that the distribution of X_0 is given. Then there is a unique Markov process {X_t}_{t≥0} having the given initial distribution and the given transition probabilities. In [22], the Brownian motion is indeed constructed as a Markov process by using the transition probability density
\[
\rho_{(y,s)}(x,t) = \frac{e^{-(x-y)^2/(2[t-s])}}{\sqrt{2\pi(t-s)}}.
\]
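Here is a small numerical check of the Chapman-Kolmogorov equation for the Brownian transition density, written in density form (a sketch assuming NumPy; the particular times and positions are arbitrary choices of ours): integrating ρ_{(y,s)}(z,τ) ρ_{(z,τ)}(x,t) over z reproduces ρ_{(y,s)}(x,t).

```python
import numpy as np

# Sketch (assuming NumPy): Chapman-Kolmogorov for the Brownian transition
# density, int_R rho_{(y,s)}(z,tau) rho_{(z,tau)}(x,t) dz = rho_{(y,s)}(x,t),
# for s < tau < t, checked by numerical quadrature.
def rho(x, t, y, s):
    """Transition density of Brownian motion: density of B_t at x given B_s = y."""
    return np.exp(-(x - y)**2 / (2 * (t - s))) / np.sqrt(2 * np.pi * (t - s))

y, s, tau, t, x = 0.3, 0.5, 1.2, 2.0, -0.7
z = np.linspace(-30.0, 30.0, 600_001)

lhs = np.trapz(rho(z, tau, y, s) * rho(x, t, z, tau), z)
print(lhs, rho(x, t, y, s))     # the two values agree to quadrature accuracy
```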

When transition probability densities and the Markov property are used to define a stochastic process, it is a priori unknown whether the resulting process has continuous paths. The following theorem characterizes a large class of Markov processes.

A Markov process {X_t}_{t≥0} with transition probabilities p_{(·,·)}(·,·) can be realized on the space of continuous functions if, for every ε > 0,
\[
\lim_{\delta\searrow 0}\ \sup_{y\in\mathbb{R},\ 0 < t-s < \delta}\ \frac{1}{\delta}\, p_{(y,s)}\big( \{z : |z - y| > \varepsilon\},\, t \big) = 0.
\]

Exercise 1.18. Suppose X is a random variable on (Ω, F) and A is an atom of F. Show that X restricted to A is a constant function; namely, for every ω_1, ω_2 ∈ A, X(ω_1) = X(ω_2).

Exercise 1.19. Assume that 0 ≤ s_1 < s_2 < t. Let A = (x, y), A_1 = (x_1, y_1), A_2 = (x_2, y_2) be non-empty intervals. For the Brownian motion, show that
\[
P(B_t \in A \mid B_{s_2} \in A_2,\ B_{s_1} \in A_1) = P(B_t \in A \mid B_{s_2} \in A_2).
\]

Exercise 1.20. Let n ≥ 1 be an integer and set T = {0, 1, ..., n}, so we are considering the discrete version of a stochastic process. Suppose we are entering a game of tossing a fair coin. We start with X_0 = 1. If we have capital X_i after the ith game, we bet X_i on the next game. If the toss results in a head we win half of the bet, and otherwise we lose half of the bet, so that X_{i+1} = (3/2) X_i for a head and X_{i+1} = (1/2) X_i for a tail. Show that {X_i}_{i∈T} is a Markov process as well as a martingale, after building up an appropriate filtration (information tree).

Suppose Y_0 = 1, Y_1 = 1 and, for i ≥ 1, Y_{i+1} = (7/4) Y_i if the ith and (i+1)th outcomes of the tosses are the same (both heads or both tails) and Y_{i+1} = (1/4) Y_i if the outcomes of the ith toss and the (i+1)th toss are different. Show that {Y_i}_{i∈T} is a martingale, but not a Markov process.

Exercise 1.21. Let {B_t}_{t≥0} be the standard Brownian motion process adapted to a filtration {F_t}_{t≥0}. Show that {(B_t)²}_{t≥0} is a stochastic process that is not a martingale; that is, for some s < t and x ≥ 0,
\[
E[B_t^2 \mid B_s^2 = x] \neq x.
\]
Also show that {(B_t)²}_{t≥0} is a Markov process; that is, for every Borel set A, every s_1 < s_2 < ... < s_n < t, and a_1, ..., a_n ∈ (0, ∞),
\[
P(B_t^2 \in A \mid B_{s_i}^2 = a_i,\ i = 1,\dots,n) = P(B_t^2 \in A \mid B_{s_n}^2 = a_n);
\]
namely,
\[
P(B_t^2 \in A \mid B_{s_i}^2 \in (a_i, a_i + dx),\ i = 1,\dots,n) = P(B_t^2 \in A \mid B_{s_n}^2 \in (a_n, a_n + dx)).
\]

Exercise 1.22. Let {B_t}_{t≥0} be a Brownian motion adapted to a filtration {F_t}_{t≥0}. Show that {B_t² − t}_{t≥0} is a martingale; that is, show that for every h > 0, t ≥ 0, and x ∈ R,
\[
E\big[ B_{t+h}^2 - (t+h) \,\big|\, B_t^2 - t = x \big] = x.
\]
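A final numerical sketch (assuming NumPy; the values of t, h, x are arbitrary) for Exercise 1.22: conditioning on B_t = x, the increment B_{t+h} − B_t is N(0, h), so the conditional expectation of B_{t+h}² − (t+h) equals x² − t.

```python
import numpy as np

# Sketch (assuming NumPy) for Exercise 1.22: given B_t = x, the increment is
# N(0, h), so E[B_{t+h}^2 - (t+h) | B_t = x] = x^2 + h - (t+h) = x^2 - t.
rng = np.random.default_rng(7)
t, h, x = 1.5, 0.7, 0.9

increments = rng.standard_normal(1_000_000) * np.sqrt(h)   # B_{t+h} - B_t samples
lhs = np.mean((x + increments)**2 - (t + h))               # Monte Carlo conditional mean
print(lhs, x**2 - t)                                       # both close to x^2 - t = -0.69
```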


Continuous Processes. Brownian motion Stochastic calculus Ito calculus Continuous Processes Brownian motion Stochastic calculus Ito calculus Continuous Processes The binomial models are the building block for our realistic models. Three small-scale principles in continuous

More information

Martingale Pricing Theory in Discrete-Time and Discrete-Space Models

Martingale Pricing Theory in Discrete-Time and Discrete-Space Models IEOR E4707: Foundations of Financial Engineering c 206 by Martin Haugh Martingale Pricing Theory in Discrete-Time and Discrete-Space Models These notes develop the theory of martingale pricing in a discrete-time,

More information

Martingale representation theorem

Martingale representation theorem Martingale representation theorem Ω = C[, T ], F T = smallest σ-field with respect to which B s are all measurable, s T, P the Wiener measure, B t = Brownian motion M t square integrable martingale with

More information

An Introduction to Stochastic Calculus

An Introduction to Stochastic Calculus An Introduction to Stochastic Calculus Haijun Li lih@math.wsu.edu Department of Mathematics and Statistics Washington State University Lisbon, May 218 Haijun Li An Introduction to Stochastic Calculus Lisbon,

More information

Optimal stopping problems for a Brownian motion with a disorder on a finite interval

Optimal stopping problems for a Brownian motion with a disorder on a finite interval Optimal stopping problems for a Brownian motion with a disorder on a finite interval A. N. Shiryaev M. V. Zhitlukhin arxiv:1212.379v1 [math.st] 15 Dec 212 December 18, 212 Abstract We consider optimal

More information

Brownian Motion. Richard Lockhart. Simon Fraser University. STAT 870 Summer 2011

Brownian Motion. Richard Lockhart. Simon Fraser University. STAT 870 Summer 2011 Brownian Motion Richard Lockhart Simon Fraser University STAT 870 Summer 2011 Richard Lockhart (Simon Fraser University) Brownian Motion STAT 870 Summer 2011 1 / 33 Purposes of Today s Lecture Describe

More information

4 Martingales in Discrete-Time

4 Martingales in Discrete-Time 4 Martingales in Discrete-Time Suppose that (Ω, F, P is a probability space. Definition 4.1. A sequence F = {F n, n = 0, 1,...} is called a filtration if each F n is a sub-σ-algebra of F, and F n F n+1

More information

Non-semimartingales in finance

Non-semimartingales in finance Non-semimartingales in finance Pricing and Hedging Options with Quadratic Variation Tommi Sottinen University of Vaasa 1st Northern Triangular Seminar 9-11 March 2009, Helsinki University of Technology

More information

Lesson 3: Basic theory of stochastic processes

Lesson 3: Basic theory of stochastic processes Lesson 3: Basic theory of stochastic processes Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@univaq.it Probability space We start with some

More information

An Introduction to Stochastic Calculus

An Introduction to Stochastic Calculus An Introduction to Stochastic Calculus Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 5 Haijun Li An Introduction to Stochastic Calculus Week 5 1 / 20 Outline 1 Martingales

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 19 11/20/2013. Applications of Ito calculus to finance

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 19 11/20/2013. Applications of Ito calculus to finance MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.7J Fall 213 Lecture 19 11/2/213 Applications of Ito calculus to finance Content. 1. Trading strategies 2. Black-Scholes option pricing formula 1 Security

More information

A No-Arbitrage Theorem for Uncertain Stock Model

A No-Arbitrage Theorem for Uncertain Stock Model Fuzzy Optim Decis Making manuscript No (will be inserted by the editor) A No-Arbitrage Theorem for Uncertain Stock Model Kai Yao Received: date / Accepted: date Abstract Stock model is used to describe

More information

Economics has never been a science - and it is even less now than a few years ago. Paul Samuelson. Funeral by funeral, theory advances Paul Samuelson

Economics has never been a science - and it is even less now than a few years ago. Paul Samuelson. Funeral by funeral, theory advances Paul Samuelson Economics has never been a science - and it is even less now than a few years ago. Paul Samuelson Funeral by funeral, theory advances Paul Samuelson Economics is extremely useful as a form of employment

More information

Functional vs Banach space stochastic calculus & strong-viscosity solutions to semilinear parabolic path-dependent PDEs.

Functional vs Banach space stochastic calculus & strong-viscosity solutions to semilinear parabolic path-dependent PDEs. Functional vs Banach space stochastic calculus & strong-viscosity solutions to semilinear parabolic path-dependent PDEs Andrea Cosso LPMA, Université Paris Diderot joint work with Francesco Russo ENSTA,

More information

Lecture 17. The model is parametrized by the time period, δt, and three fixed constant parameters, v, σ and the riskless rate r.

Lecture 17. The model is parametrized by the time period, δt, and three fixed constant parameters, v, σ and the riskless rate r. Lecture 7 Overture to continuous models Before rigorously deriving the acclaimed Black-Scholes pricing formula for the value of a European option, we developed a substantial body of material, in continuous

More information

CONVERGENCE OF OPTION REWARDS FOR MARKOV TYPE PRICE PROCESSES MODULATED BY STOCHASTIC INDICES

CONVERGENCE OF OPTION REWARDS FOR MARKOV TYPE PRICE PROCESSES MODULATED BY STOCHASTIC INDICES CONVERGENCE OF OPTION REWARDS FOR MARKOV TYPE PRICE PROCESSES MODULATED BY STOCHASTIC INDICES D. S. SILVESTROV, H. JÖNSSON, AND F. STENBERG Abstract. A general price process represented by a two-component

More information

Derivatives Pricing and Stochastic Calculus

Derivatives Pricing and Stochastic Calculus Derivatives Pricing and Stochastic Calculus Romuald Elie LAMA, CNRS UMR 85 Université Paris-Est Marne-La-Vallée elie @ ensae.fr Idris Kharroubi CEREMADE, CNRS UMR 7534, Université Paris Dauphine kharroubi

More information

PAPER 27 STOCHASTIC CALCULUS AND APPLICATIONS

PAPER 27 STOCHASTIC CALCULUS AND APPLICATIONS MATHEMATICAL TRIPOS Part III Thursday, 5 June, 214 1:3 pm to 4:3 pm PAPER 27 STOCHASTIC CALCULUS AND APPLICATIONS Attempt no more than FOUR questions. There are SIX questions in total. The questions carry

More information

1 The continuous time limit

1 The continuous time limit Derivative Securities, Courant Institute, Fall 2008 http://www.math.nyu.edu/faculty/goodman/teaching/derivsec08/index.html Jonathan Goodman and Keith Lewis Supplementary notes and comments, Section 3 1

More information

Path Dependent British Options

Path Dependent British Options Path Dependent British Options Kristoffer J Glover (Joint work with G. Peskir and F. Samee) School of Finance and Economics University of Technology, Sydney 18th August 2009 (PDE & Mathematical Finance

More information

Continuous Time Finance. Tomas Björk

Continuous Time Finance. Tomas Björk Continuous Time Finance Tomas Björk 1 II Stochastic Calculus Tomas Björk 2 Typical Setup Take as given the market price process, S(t), of some underlying asset. S(t) = price, at t, per unit of underlying

More information

Non replication of options

Non replication of options Non replication of options Christos Kountzakis, Ioannis A Polyrakis and Foivos Xanthos June 30, 2008 Abstract In this paper we study the scarcity of replication of options in the two period model of financial

More information

Week 1 Quantitative Analysis of Financial Markets Basic Statistics A

Week 1 Quantitative Analysis of Financial Markets Basic Statistics A Week 1 Quantitative Analysis of Financial Markets Basic Statistics A Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 October

More information

Math-Stat-491-Fall2014-Notes-V

Math-Stat-491-Fall2014-Notes-V Math-Stat-491-Fall2014-Notes-V Hariharan Narayanan December 7, 2014 Martingales 1 Introduction Martingales were originally introduced into probability theory as a model for fair betting games. Essentially

More information

ECON FINANCIAL ECONOMICS I

ECON FINANCIAL ECONOMICS I Lecture 3 Stochastic Processes & Stochastic Calculus September 24, 2018 STOCHASTIC PROCESSES Asset prices, asset payoffs, investor wealth, and portfolio strategies can all be viewed as stochastic processes.

More information

Stochastic Processes and Brownian Motion

Stochastic Processes and Brownian Motion A stochastic process Stochastic Processes X = { X(t) } Stochastic Processes and Brownian Motion is a time series of random variables. X(t) (or X t ) is a random variable for each time t and is usually

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 11 10/9/2013. Martingales and stopping times II

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 11 10/9/2013. Martingales and stopping times II MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.65/15.070J Fall 013 Lecture 11 10/9/013 Martingales and stopping times II Content. 1. Second stopping theorem.. Doob-Kolmogorov inequality. 3. Applications of stopping

More information

( ) since this is the benefit of buying the asset at the strike price rather

( ) since this is the benefit of buying the asset at the strike price rather Review of some financial models for MAT 483 Parity and Other Option Relationships The basic parity relationship for European options with the same strike price and the same time to expiration is: C( KT

More information

Option Pricing Models for European Options

Option Pricing Models for European Options Chapter 2 Option Pricing Models for European Options 2.1 Continuous-time Model: Black-Scholes Model 2.1.1 Black-Scholes Assumptions We list the assumptions that we make for most of this notes. 1. The underlying

More information

Randomness and Fractals

Randomness and Fractals Randomness and Fractals Why do so many physicists become traders? Gregory F. Lawler Department of Mathematics Department of Statistics University of Chicago September 25, 2011 1 / 24 Mathematics and the

More information

No-arbitrage theorem for multi-factor uncertain stock model with floating interest rate

No-arbitrage theorem for multi-factor uncertain stock model with floating interest rate Fuzzy Optim Decis Making 217 16:221 234 DOI 117/s17-16-9246-8 No-arbitrage theorem for multi-factor uncertain stock model with floating interest rate Xiaoyu Ji 1 Hua Ke 2 Published online: 17 May 216 Springer

More information

MAS452/MAS6052. MAS452/MAS Turn Over SCHOOL OF MATHEMATICS AND STATISTICS. Stochastic Processes and Financial Mathematics

MAS452/MAS6052. MAS452/MAS Turn Over SCHOOL OF MATHEMATICS AND STATISTICS. Stochastic Processes and Financial Mathematics t r t r2 r t SCHOOL OF MATHEMATICS AND STATISTICS Stochastic Processes and Financial Mathematics Spring Semester 2017 2018 3 hours t s s tt t q st s 1 r s r t r s rts t q st s r t r r t Please leave this

More information

The stochastic calculus

The stochastic calculus Gdansk A schedule of the lecture Stochastic differential equations Ito calculus, Ito process Ornstein - Uhlenbeck (OU) process Heston model Stopping time for OU process Stochastic differential equations

More information

Lecture 3: Review of mathematical finance and derivative pricing models

Lecture 3: Review of mathematical finance and derivative pricing models Lecture 3: Review of mathematical finance and derivative pricing models Xiaoguang Wang STAT 598W January 21th, 2014 (STAT 598W) Lecture 3 1 / 51 Outline 1 Some model independent definitions and principals

More information

Option Pricing. 1 Introduction. Mrinal K. Ghosh

Option Pricing. 1 Introduction. Mrinal K. Ghosh Option Pricing Mrinal K. Ghosh 1 Introduction We first introduce the basic terminology in option pricing. Option: An option is the right, but not the obligation to buy (or sell) an asset under specified

More information

MORE REALISTIC FOR STOCKS, FOR EXAMPLE

MORE REALISTIC FOR STOCKS, FOR EXAMPLE MARTINGALES BASED ON IID: ADDITIVE MG Y 1,..., Y t,... : IID EY = 0 X t = Y 1 +... + Y t is MG MULTIPLICATIVE MG Y 1,..., Y t,... : IID EY = 1 X t = Y 1... Y t : X t+1 = X t Y t+1 E(X t+1 F t ) = E(X t

More information

Learning Martingale Measures to Price Options

Learning Martingale Measures to Price Options Learning Martingale Measures to Price Options Hung-Ching (Justin) Chen chenh3@cs.rpi.edu Malik Magdon-Ismail magdon@cs.rpi.edu April 14, 2006 Abstract We provide a framework for learning risk-neutral measures

More information

Option pricing in the stochastic volatility model of Barndorff-Nielsen and Shephard

Option pricing in the stochastic volatility model of Barndorff-Nielsen and Shephard Option pricing in the stochastic volatility model of Barndorff-Nielsen and Shephard Indifference pricing and the minimal entropy martingale measure Fred Espen Benth Centre of Mathematics for Applications

More information

MASM006 UNIVERSITY OF EXETER SCHOOL OF ENGINEERING, COMPUTER SCIENCE AND MATHEMATICS MATHEMATICAL SCIENCES FINANCIAL MATHEMATICS.

MASM006 UNIVERSITY OF EXETER SCHOOL OF ENGINEERING, COMPUTER SCIENCE AND MATHEMATICS MATHEMATICAL SCIENCES FINANCIAL MATHEMATICS. MASM006 UNIVERSITY OF EXETER SCHOOL OF ENGINEERING, COMPUTER SCIENCE AND MATHEMATICS MATHEMATICAL SCIENCES FINANCIAL MATHEMATICS May/June 2006 Time allowed: 2 HOURS. Examiner: Dr N.P. Byott This is a CLOSED

More information

Stats243 Introduction to Mathematical Finance

Stats243 Introduction to Mathematical Finance Stats243 Introduction to Mathematical Finance Haipeng Xing Department of Statistics Stanford University Summer 2006 Stats243, Xing, Summer 2007 1 Agenda Administrative, course description & reference,

More information

M5MF6. Advanced Methods in Derivatives Pricing

M5MF6. Advanced Methods in Derivatives Pricing Course: Setter: M5MF6 Dr Antoine Jacquier MSc EXAMINATIONS IN MATHEMATICS AND FINANCE DEPARTMENT OF MATHEMATICS April 2016 M5MF6 Advanced Methods in Derivatives Pricing Setter s signature...........................................

More information

MATH 5510 Mathematical Models of Financial Derivatives. Topic 1 Risk neutral pricing principles under single-period securities models

MATH 5510 Mathematical Models of Financial Derivatives. Topic 1 Risk neutral pricing principles under single-period securities models MATH 5510 Mathematical Models of Financial Derivatives Topic 1 Risk neutral pricing principles under single-period securities models 1.1 Law of one price and Arrow securities 1.2 No-arbitrage theory and

More information

Quantum theory for the binomial model in finance theory

Quantum theory for the binomial model in finance theory Quantum theory for the binomial model in finance theory CHEN Zeqian arxiv:quant-ph/0112156v6 19 Feb 2010 (Wuhan Institute of Physics and Mathematics, CAS, P.O.Box 71010, Wuhan 430071, China) Abstract.

More information

Geometric Brownian Motions

Geometric Brownian Motions Chapter 6 Geometric Brownian Motions 1 Normal Distributions We begin by recalling the normal distribution briefly. Let Z be a random variable distributed as standard normal, i.e., Z N(0, 1). The probability

More information

Math 489/Math 889 Stochastic Processes and Advanced Mathematical Finance Dunbar, Fall 2007

Math 489/Math 889 Stochastic Processes and Advanced Mathematical Finance Dunbar, Fall 2007 Steven R. Dunbar Department of Mathematics 203 Avery Hall University of Nebraska-Lincoln Lincoln, NE 68588-0130 http://www.math.unl.edu Voice: 402-472-3731 Fax: 402-472-8466 Math 489/Math 889 Stochastic

More information

Brownian Motion, the Gaussian Lévy Process

Brownian Motion, the Gaussian Lévy Process Brownian Motion, the Gaussian Lévy Process Deconstructing Brownian Motion: My construction of Brownian motion is based on an idea of Lévy s; and in order to exlain Lévy s idea, I will begin with the following

More information

Asymptotic results discrete time martingales and stochastic algorithms

Asymptotic results discrete time martingales and stochastic algorithms Asymptotic results discrete time martingales and stochastic algorithms Bernard Bercu Bordeaux University, France IFCAM Summer School Bangalore, India, July 2015 Bernard Bercu Asymptotic results for discrete

More information

arxiv: v2 [q-fin.gn] 13 Aug 2018

arxiv: v2 [q-fin.gn] 13 Aug 2018 A DERIVATION OF THE BLACK-SCHOLES OPTION PRICING MODEL USING A CENTRAL LIMIT THEOREM ARGUMENT RAJESHWARI MAJUMDAR, PHANUEL MARIANO, LOWEN PENG, AND ANTHONY SISTI arxiv:18040390v [q-fingn] 13 Aug 018 Abstract

More information

Modeling via Stochastic Processes in Finance

Modeling via Stochastic Processes in Finance Modeling via Stochastic Processes in Finance Dimbinirina Ramarimbahoaka Department of Mathematics and Statistics University of Calgary AMAT 621 - Fall 2012 October 15, 2012 Question: What are appropriate

More information

Tangent Lévy Models. Sergey Nadtochiy (joint work with René Carmona) Oxford-Man Institute of Quantitative Finance University of Oxford.

Tangent Lévy Models. Sergey Nadtochiy (joint work with René Carmona) Oxford-Man Institute of Quantitative Finance University of Oxford. Tangent Lévy Models Sergey Nadtochiy (joint work with René Carmona) Oxford-Man Institute of Quantitative Finance University of Oxford June 24, 2010 6th World Congress of the Bachelier Finance Society Sergey

More information

American Option Pricing Formula for Uncertain Financial Market

American Option Pricing Formula for Uncertain Financial Market American Option Pricing Formula for Uncertain Financial Market Xiaowei Chen Uncertainty Theory Laboratory, Department of Mathematical Sciences Tsinghua University, Beijing 184, China chenxw7@mailstsinghuaeducn

More information

Option Pricing Formula for Fuzzy Financial Market

Option Pricing Formula for Fuzzy Financial Market Journal of Uncertain Systems Vol.2, No., pp.7-2, 28 Online at: www.jus.org.uk Option Pricing Formula for Fuzzy Financial Market Zhongfeng Qin, Xiang Li Department of Mathematical Sciences Tsinghua University,

More information

2.1 Mathematical Basis: Risk-Neutral Pricing

2.1 Mathematical Basis: Risk-Neutral Pricing Chapter Monte-Carlo Simulation.1 Mathematical Basis: Risk-Neutral Pricing Suppose that F T is the payoff at T for a European-type derivative f. Then the price at times t before T is given by f t = e r(t

More information

Multi-Asset Options. A Numerical Study VILHELM NIKLASSON FRIDA TIVEDAL. Master s thesis in Engineering Mathematics and Computational Science

Multi-Asset Options. A Numerical Study VILHELM NIKLASSON FRIDA TIVEDAL. Master s thesis in Engineering Mathematics and Computational Science Multi-Asset Options A Numerical Study Master s thesis in Engineering Mathematics and Computational Science VILHELM NIKLASSON FRIDA TIVEDAL Department of Mathematical Sciences Chalmers University of Technology

More information

A note on the existence of unique equivalent martingale measures in a Markovian setting

A note on the existence of unique equivalent martingale measures in a Markovian setting Finance Stochast. 1, 251 257 1997 c Springer-Verlag 1997 A note on the existence of unique equivalent martingale measures in a Markovian setting Tina Hviid Rydberg University of Aarhus, Department of Theoretical

More information

Martingale Measure TA

Martingale Measure TA Martingale Measure TA Martingale Measure a) What is a martingale? b) Groundwork c) Definition of a martingale d) Super- and Submartingale e) Example of a martingale Table of Content Connection between

More information

Some Computational Aspects of Martingale Processes in ruling the Arbitrage from Binomial asset Pricing Model

Some Computational Aspects of Martingale Processes in ruling the Arbitrage from Binomial asset Pricing Model International Journal of Basic & Applied Sciences IJBAS-IJNS Vol:3 No:05 47 Some Computational Aspects of Martingale Processes in ruling the Arbitrage from Binomial asset Pricing Model Sheik Ahmed Ullah

More information

Discrete time interest rate models

Discrete time interest rate models slides for the course Interest rate theory, University of Ljubljana, 2012-13/I, part II József Gáll University of Debrecen, Faculty of Economics Nov. 2012 Jan. 2013, Ljubljana Introduction to discrete

More information

IEOR E4703: Monte-Carlo Simulation

IEOR E4703: Monte-Carlo Simulation IEOR E4703: Monte-Carlo Simulation Generating Random Variables and Stochastic Processes Martin Haugh Department of Industrial Engineering and Operations Research Columbia University Email: martin.b.haugh@gmail.com

More information

How Much Should You Pay For a Financial Derivative?

How Much Should You Pay For a Financial Derivative? City University of New York (CUNY) CUNY Academic Works Publications and Research New York City College of Technology Winter 2-26-2016 How Much Should You Pay For a Financial Derivative? Boyan Kostadinov

More information

Financial Mathematics. Spring Richard F. Bass Department of Mathematics University of Connecticut

Financial Mathematics. Spring Richard F. Bass Department of Mathematics University of Connecticut Financial Mathematics Spring 22 Richard F. Bass Department of Mathematics University of Connecticut These notes are c 22 by Richard Bass. They may be used for personal use or class use, but not for commercial

More information

Advanced Probability and Applications (Part II)

Advanced Probability and Applications (Part II) Advanced Probability and Applications (Part II) Olivier Lévêque, IC LTHI, EPFL (with special thanks to Simon Guilloud for the figures) July 31, 018 Contents 1 Conditional expectation Week 9 1.1 Conditioning

More information

Introduction to Affine Processes. Applications to Mathematical Finance

Introduction to Affine Processes. Applications to Mathematical Finance and Its Applications to Mathematical Finance Department of Mathematical Science, KAIST Workshop for Young Mathematicians in Korea, 2010 Outline Motivation 1 Motivation 2 Preliminary : Stochastic Calculus

More information

Advanced Stochastic Processes.

Advanced Stochastic Processes. Advanced Stochastic Processes. David Gamarnik LECTURE 16 Applications of Ito calculus to finance Lecture outline Trading strategies Black Scholes option pricing formula 16.1. Security price processes,

More information

Pricing theory of financial derivatives

Pricing theory of financial derivatives Pricing theory of financial derivatives One-period securities model S denotes the price process {S(t) : t = 0, 1}, where S(t) = (S 1 (t) S 2 (t) S M (t)). Here, M is the number of securities. At t = 1,

More information

MSc Financial Engineering CHRISTMAS ASSIGNMENT: MERTON S JUMP-DIFFUSION MODEL. To be handed in by monday January 28, 2013

MSc Financial Engineering CHRISTMAS ASSIGNMENT: MERTON S JUMP-DIFFUSION MODEL. To be handed in by monday January 28, 2013 MSc Financial Engineering 2012-13 CHRISTMAS ASSIGNMENT: MERTON S JUMP-DIFFUSION MODEL To be handed in by monday January 28, 2013 Department EMS, Birkbeck Introduction The assignment consists of Reading

More information