Applied Probability and Mathematical Finance Theory


Applied Probability and Mathematical Finance Theory

Hiroshi Toyoizumi
January 16
toyoizumi@waseda.jp

Contents

1 Introduction
2 Basic Probability Theory
   Why Probability?
   Probability Space
   Conditional Probability and Independence
   Random Variables
   Expectation, Variance and Standard Deviation
   Covariance and Correlation
   How to Make a Random Variable
   References
   Exercises
3 Normal Random Variables
   What is Normal Random Variable?
   Lognormal Random Variables
   References
   Exercises
4 Useful Probability Theorems
   The Law of Large Numbers
   Poisson Random Variables and the Law of Small Numbers
   The Central Limit Theorem
   Useful Estimations
   References
   Exercises
5 Brownian Motions and Poisson Processes
   Geometric Brownian Motions
   Discrete Time Tree Binomial Process
   Brownian Motions
   Poisson Processes
   References
   Exercises
6 Simulating Brownian Motions
   Mathematica As an Advanced Calculator
   Generating Random Variables
   Two-Dimensional Brownian Motions
   Random Variable on Unit Circle
   Generating Brownian Motion
   Geometric Brownian Motions
   Generating Bernoulli Random Variables
   Generating Geometric Brownian Motions
7 Present Value Analysis
   Interest Rates
   Continuously Compounded Interest Rates
   Present Value
   Rate of Return
   References
   Exercises
8 Risk Neutral Probability and Arbitrage
   Option to Buy Stocks
   Risk Neutralization
   Arbitrage and Price of Option
   Duplication and Law of One Price
   Arbitrage Theorem
   References
   Exercises
9 Black-Scholes Formula
   Risk-neutral Tree Binomial Model
   Option Price on the Discrete Time
   Black-Scholes Model
   Examples of Option Price via Black-Scholes Formula
   References
   Exercises
10 Delta Hedging Strategy
   Binomial Hedging Model
   Hedging in Black-Scholes Model
   Partial Derivative
   References
   Exercises
11 More Vanilla Options in Detail
   American Call Options
   Put Options
   Pricing American Put Options
   Stock with Continuous Dividend
   Stock with Fixed-time Dividend
   Forward and Futures Contract
   Exercises

Chapter 1

Introduction

Can you answer the following questions and explain your answers logically?

Example 1.1 (Flipping Coins [6]). Suppose you flip a coin twice. Can you find any difference between the following probabilities?

1. the probability that both flips land on heads, given that the first flip lands on heads;
2. the probability that both flips land on heads, given that at least one of the flips lands on heads.

Example 1.2 (Wait or Not Wait). Assume you would like to enter a building, but you realize that you forgot the ID card needed to enter it. You have been waiting for a colleague with an ID card for 10 minutes so far. Is it reasonable to think that you are more likely to enter the building in the next minute?

Example 1.3 (Easy Money). Imagine you are betting in a gamble. Here's the "best" scheme [2] to get easy money in gambling: when you lose, you bet twice as much the next time. That way, even if you lose once, you recover the loss the next time. So you win eventually. Would you use this scheme?

These questions are typical of decision making under uncertainty. In order to answer them, we need to study probability theory. The aim of this lecture is to learn

1. the basics of applied probability,
2. the basics of finance theory,
3. how to use them to answer the questions in the examples above.

This handout is based on the text: Sheldon M. Ross, An Elementary Introduction to Mathematical Finance: Options and Other Topics [6]. Note that the items covered in [6] are not complete enough to cover all of this lecture. You can find other examples of these questions in [2].
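The doubling scheme of Example 1.3 can be checked by simulation. The sketch below is illustrative only (Python; the starting bankroll of 100, the 1000-round cap and the fair coin are my own assumptions, not from the text): on average a fair game stays fair, while a fraction of sessions end behind because a long losing streak eventually outruns any finite bankroll.

```python
import random

def martingale_session(bankroll, base_bet=1, max_rounds=1000, p_win=0.5, rng=random):
    """Play the doubling scheme until the next bet cannot be covered."""
    bet = base_bet
    for _ in range(max_rounds):
        if bet > bankroll:        # cannot cover the doubled bet: the scheme breaks down
            break
        if rng.random() < p_win:  # a win recovers all losses plus base_bet
            bankroll += bet
            bet = base_bet
        else:                     # a loss: double the next bet
            bankroll -= bet
            bet *= 2
    return bankroll

random.seed(1)
finals = [martingale_session(100) for _ in range(2000)]
avg_final = sum(finals) / len(finals)                 # stays near the starting 100
busted = sum(f < 100 for f in finals) / len(finals)   # fraction of sessions ending behind
```

Many small wins, occasional large losses: the scheme does not create money out of a fair game.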

Chapter 2

Basic Probability Theory

2.1 Why Probability?

Example 2.1. Here are examples where we use probability: lotteries, weather forecasts, gambling, baseball, life insurance, and finance.

Since our intuition sometimes misleads us about such random phenomena, we need to handle them with extreme care, in the rigorous mathematical framework called probability theory. (See Exercise 2.1.)

2.2 Probability Space

Be patient while learning the basic terminology of probability theory. To determine the probabilistic structure, we need a probability space, which consists of a sample space, a probability measure and a family of (well-behaved) sets of events.

Definition 2.1 (Sample Space). The set of all events is called the sample space, and we write it as Ω. Each element ω ∈ Ω is called an event.

Example 2.2 (Lottery). Here is the example of a lottery. The sample space Ω is {first prize, second prize, ..., lose}.

An event ω can be "first prize", "second prize", ..., "lose", and so on. Sometimes it is convenient to use sets of events in the sample space Ω.

Example 2.3 (Sets in Lottery). The following are examples of sets in the Ω of Example 2.2:

W = {win} = {first prize, second prize, ..., sixth prize},  (2.1)
L = {lose}.  (2.2)

Thus, we can ask "what is the probability of a win?" instead of "what is the probability that we have either the first prize, the second prize, ..., or the sixth prize?"

Definition 2.1 (Probability measure). The probability of A, P(A), is defined for each set A of the sample space Ω if the following are satisfied:

1. 0 ≤ P(A) ≤ 1 for all A ⊂ Ω.
2. P(Ω) = 1.
3. For any sequence of mutually exclusive events A₁, A₂, ...,

P(⋃_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i).  (2.3)

In addition, P is said to be the probability measure on Ω.

Mathematically, any function P which satisfies Definition 2.1 can be regarded as a probability. Thus, we need to be careful to select a function that is suitable as the probability of the phenomenon at hand.

Example 2.4 (Probability Measures in Lottery). Suppose we have a lottery with 10 first prizes, 20 second prizes, ..., 60 sixth prizes out of 1000 tickets in total. Then we have a probability measure P defined by

P(n) = P(win n-th prize) = 10n/1000 = n/100,  (2.4)
P(0) = P(lose) = 790/1000.  (2.5)

It is easy to see that P satisfies Definition 2.1. According to this definition of P, we can calculate the probability of a set of events:

P(W) = the probability of a win = P(1) + P(2) + ... + P(6) = 21/100.

Of course, you can cheat your customers by saying you have 100 first prizes instead of 10. Then your customers might have a different P, which also satisfies Definition 2.1. Thus it is quite important to select an appropriate probability measure. Selecting the probability measure is a bridge between the physical world and the mathematical world. Don't use the wrong bridge!
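As a sanity check, the lottery measure of Example 2.4 can be encoded and tested against the axioms of Definition 2.1. A minimal sketch (Python and the dictionary encoding are illustrations, not part of the text):

```python
# The lottery of Example 2.4: P(n) = 10n/1000 for the n-th prize, n = 1..6.
P = {n: 10 * n / 1000 for n in range(1, 7)}
P[0] = 1 - sum(P.values())              # the losing mass makes P(Omega) = 1

total_mass = sum(P.values())            # axiom 2: should equal 1
P_win = sum(P[n] for n in range(1, 7))  # P(W) by additivity (axiom 3): 21/100
```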

Remark 2.1. There is a more rigorous way to define the probability measure. Indeed, Definition 2.1 is NOT mathematically satisfactory in some cases. If you are familiar with measure theory and advanced integration theory, you may proceed to read [3].

2.3 Conditional Probability and Independence

Now we introduce the most useful, and probably most difficult, concepts of probability theory.

Definition 2.2 (Conditional Probability). Define the probability of B given A by

P(B|A) = P(B and A)/P(A) = P(B ∩ A)/P(A).  (2.6)

We can use the conditional probability to calculate complex probabilities; it is actually the only tool we can rely on. Be sure to note that the conditional probability P(B|A) is different from the regular probability P(B).

Example 2.5 (Lottery). Let W = {win} and F = {first prize} in Example 2.4. Then we have the conditional probability

P(F|W) = the probability of winning the 1st prize given that you win the lottery
       = P(F ∩ W)/P(W) = P(F)/P(W) = (10/1000)/(210/1000) = 1/21 ≠ P(F).

Remark 2.2. Sometimes we may regard Definition 2.2 as a theorem and call it Bayes' rule, but here we use it as the definition of conditional probability.

Definition 2.3 (Independence). Two sets of events A and B are said to be independent if

P(A and B) = P(A ∩ B) = P(A)P(B).  (2.7)

Theorem 2.1 (Conditional Probability of Independent Events). Suppose A and B are independent. Then the conditional probability of B given A is equal to the probability of B.

Proof. By Definition 2.2, we have

P(B|A) = P(B ∩ A)/P(A) = P(B)P(A)/P(A) = P(B),

where we used the independence of A and B.
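The conditional probability of Example 2.5 can also be estimated by Monte Carlo. The sketch below is an illustration, not from the text; the ticket encoding (tickets 0..999, with 10n tickets for the n-th prize, the rest losing) follows Example 2.4:

```python
import random

cum = []
c = 0
for n in range(1, 7):
    c += 10 * n
    cum.append(c)            # cumulative prize counts: 10, 30, 60, 100, 150, 210

def prize(ticket):
    for n, bound in enumerate(cum, start=1):
        if ticket < bound:
            return n         # ticket wins the n-th prize
    return 0                 # ticket loses

random.seed(0)
draws = [prize(random.randrange(1000)) for _ in range(200_000)]
wins = [d for d in draws if d >= 1]
p_first_given_win = sum(d == 1 for d in wins) / len(wins)   # estimates 1/21
```

The estimate should be near 1/21 ≈ 0.048, clearly different from P(F) = 1/100.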

Example 2.6 (Two independent dice). Of course, two dice are independent, so

P(the number on the first die is even and the one on the second is odd)
= P(the number on the first die is even) P(the number on the second die is odd) = 1/4.

Example 2.7 (Dependent events on two dice). Even though the two dice are independent, you can find dependent events. For example,

P(the sum of the two dice is even and the number on the second is odd)
≠ P(the sum of the two dice is even) P(the number on the second is odd).

See Exercise 2.5 for the details.

2.4 Random Variables

The name "random variable" has a strange and stochastic history¹. Despite its fragile history, the invention of the random variable certainly contributed a lot to probability theory.

Definition 2.4 (Random Variable). A random variable X = X(ω) is a real-valued function on Ω, whose value is assigned to each outcome ω of the experiment (event).

Remark 2.3. Note that a probability and a random variable are NOT the same! Random variables are functions of events, while a probability is a number. To avoid confusion, we usually use capital letters for random variables.

Example 2.8 (Lottery). A random variable X can be designed to formulate a lottery:
X = 1 when we get the first prize,
X = 2 when we get the second prize.

Example 2.9 (Bernoulli random variable). Let X be a random variable with

X = { 1 with probability p,
    { 0 with probability 1 − p,  (2.8)

for some p ∈ [0,1]. The random variable X is said to be a Bernoulli random variable.

¹ J. Doob, quoted in Statistical Science. (One of the great probabilists, who established probability as a branch of mathematics.) "While writing my book [Stochastic Processes] I had an argument with Feller. He asserted that everyone said 'random variable' and I asserted that everyone said 'chance variable.' We obviously had to use the same name in our books, so we decided the issue by a stochastic procedure. That is, we tossed for it and he won."

Sometimes we use random variables to indicate sets of events. For example, instead of writing "the set of events in which we win the first prize," we can write {ω ∈ Ω : X(ω) = 1}, or simply {X = 1}.

Definition 2.5 (Probability distribution). The probability distribution function F(x) is defined by

F(x) = P{X ≤ x}.  (2.9)

The probability distribution function fully determines the probability structure of a random variable X. Sometimes it is convenient to consider the probability density function instead of the probability distribution.

Definition 2.6 (Probability density function). The probability density function f(x) is defined by

f(x) = dF(x)/dx = dP{X ≤ x}/dx.  (2.10)

Sometimes we write dF(x) = dP{X ≤ x} = P(X ∈ (x, x + dx]) even when F(x) has no derivative.

Lemma 2.1. For a (well-behaved) set A,

P{X ∈ A} = ∫_A dP{X ≤ x} = ∫_A f(x) dx.  (2.11)

2.5 Expectation, Variance and Standard Deviation

Let X be a random variable. Then we have some basic tools to evaluate the random variable X. First we have the most important measure, the expectation, or mean, of X.

Definition 2.7 (Expectation).

E[X] = ∫ x dP{X ≤ x} = ∫ x f(x) dx.  (2.12)

Remark 2.4. For a discrete random variable, we can rewrite (2.12) as

E[X] = Σ_n x_n P[X = x_n].  (2.13)

Lemma 2.2. Let (X_n)_{n=1,...,N} be a sequence of random variables. Then we can exchange the order of summation and expectation:

E[X_1 + ... + X_N] = E[X_1] + ... + E[X_N].  (2.14)

Proof. See Exercise 2.6.

E[X] gives the expected value of X, but X fluctuates around E[X]. So we need to measure the strength of this stochastic fluctuation. The natural choice may be X − E[X]. Unfortunately, the expectation of X − E[X] is always equal to zero. Thus, we use the variance of X, which is the second moment around E[X].

Definition 2.8 (Variance).

Var[X] = E[(X − E[X])²].  (2.15)

Lemma 2.3. We have an alternative way to calculate Var[X]:

Var[X] = E[X²] − E[X]².  (2.16)

Proof. See Exercise 2.6.

Unfortunately, the variance Var[X] has the dimension of X². So in some cases it is inappropriate to use the variance. Then we use the standard deviation σ[X], which has the order of X.

Definition 2.9 (Standard deviation).

σ[X] = (Var[X])^{1/2}.  (2.17)

Example 2.10 (Bernoulli random variable). Let X be a Bernoulli random variable with P[X = 1] = p and P[X = 0] = 1 − p. Then we have

E[X] = 1·p + 0·(1 − p) = p,  (2.18)
Var[X] = E[X²] − E[X]² = E[X] − E[X]² = p(1 − p),  (2.19)

where we used the fact that X² = X for Bernoulli random variables.

In many cases we need to deal with two or more random variables. When these random variables are independent, we are very lucky, and we can get many useful results. Otherwise...

Definition 2.2. We say that two random variables X and Y are independent when the events {X ≤ x} and {Y ≤ y} are independent for all x and y. In other words, when X and Y are independent,

P(X ≤ x, Y ≤ y) = P(X ≤ x)P(Y ≤ y).  (2.20)

Lemma 2.4. For any pair of independent random variables X and Y, we have

E[XY] = E[X]E[Y],
Var[X + Y] = Var[X] + Var[Y].

Proof. Extending the definition of the expectation, we have a double integral,

E[XY] = ∫∫ xy dP(X ≤ x, Y ≤ y).

Since X and Y are independent, we have P(X ≤ x, Y ≤ y) = P(X ≤ x)P(Y ≤ y). Thus,

E[XY] = ∫∫ xy dP(X ≤ x) dP(Y ≤ y)
      = ∫ x dP(X ≤ x) ∫ y dP(Y ≤ y)
      = E[X]E[Y].

Using the first part, it is easy to check the second part (see Exercise 2.9).

Example 2.11 (Binomial random variable). Let X be a random variable with

X = Σ_{i=1}^n X_i,  (2.21)

where the X_i are independent Bernoulli random variables with mean p. The random variable X is said to be a binomial random variable. The mean and variance of X can be obtained easily by using Lemma 2.4:

E[X] = np,  (2.22)
Var[X] = np(1 − p).  (2.23)

2.6 Covariance and Correlation

When we have two or more random variables, it is natural to consider the relations among these random variables. But how? The answer is the following.

Definition 2.10 (Covariance). Let X and Y be two (possibly dependent) random variables. Define the covariance of X and Y by

Cov[X, Y] = E[(X − E[X])(Y − E[Y])].  (2.24)

Thus, the covariance measures the product of the fluctuations around the means. If the fluctuations tend to be in the same direction, we have a larger covariance.
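The covariance of Definition 2.10 can be estimated from samples. A minimal sketch (illustrative only; the choice of a uniform X and independent added noise is my own, not from the text):

```python
import random

random.seed(2)
trials = 100_000
# Two dependent variables: X ~ Uniform(0,1) and Y = X + independent noise.
xs = [random.random() for _ in range(trials)]
ys = [x + random.gauss(0.0, 0.1) for x in xs]

mx = sum(xs) / trials
my = sum(ys) / trials
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / trials
# Since the noise is independent of X, Cov[X, Y] = Var[X] = 1/12 ≈ 0.0833.
```

The positive estimate reflects fluctuations in the same direction, as the definition suggests.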

Example 2.12 (The covariance of a pair of binomial random variables). Let X₁ and X₂ be independent binomial random variables with the same parameters n and p. The covariance of X₁ and X₂ is

Cov[X₁, X₂] = E[X₁X₂] − E[X₁]E[X₂] = 0,

since X₁ and X₂ are independent. More generally, if two random variables are independent, their covariance is zero. (The converse is not always true. Find an example!)

Now let Y = X₁ + X₂. How about the covariance of X₁ and Y?

Cov[X₁, Y] = E[X₁Y] − E[X₁]E[Y]
           = E[X₁(X₁ + X₂)] − E[X₁]E[X₁ + X₂]
           = E[X₁²] − E[X₁]²
           = Var[X₁] = np(1 − p) > 0.

Thus, the covariance of X₁ and Y is positive, as can be expected.

It is easy to see that we have

Cov[X, Y] = E[XY] − E[X]E[Y],  (2.25)

which is sometimes useful for calculation.

Unfortunately, the covariance has the order of XY, which is not convenient for comparing the strength of dependence among different pairs of random variables. Don't worry: we have the correlation, which is normalized by the standard deviations.

Definition 2.11 (Correlation). Let X and Y be two (possibly dependent) random variables. Define the correlation of X and Y by

ρ[X, Y] = Cov[X, Y] / (σ[X]σ[Y]).  (2.26)

Lemma 2.5. For any pair of random variables, we have

−1 ≤ ρ[X, Y] ≤ 1.  (2.27)

Proof. See Exercise 2.11.

2.7 How to Make a Random Variable

Suppose we would like to simulate a random variable X which has a distribution F(x). The following theorem will help us.

Theorem 2.2. Let U be a random variable which has the uniform distribution on [0,1], i.e.,

P[U ≤ u] = u.  (2.28)

Then the random variable X = F⁻¹(U) has the distribution F(x).

Proof.

P[X ≤ x] = P[F⁻¹(U) ≤ x] = P[U ≤ F(x)] = F(x).  (2.29)

2.8 References

There are many good books useful for learning the basic theory of probability. The book [5] is one of the most cost-effective for those who want to learn basic applied probability featuring Markov chains, and it has quite a good style of writing. Those who want a more rigorous mathematical framework can select [3] as their starting point. If you want to dive directly into topics like stochastic integrals, your choice is maybe [4].

2.9 Exercises

Exercise 2.1. Find an example where our intuition leads to a mistake in random phenomena.

Exercise 2.2. Define a probability space according to the following steps:
1. Take one random phenomenon, and describe its sample space, events and probability measure.
2. Define a random variable for the phenomenon.
3. Derive the probability distribution function and the probability density.
4. Give a couple of examples of sets of events.

Exercise 2.3. Explain the meaning of (2.3) using Example 2.2.

Exercise 2.4. Check that P defined in Example 2.4 satisfies Definition 2.1.

Exercise 2.5. Calculate both sides of Example 2.7. Check that these events are dependent, and explain why.

Exercise 2.6. Prove Lemmas 2.2 and 2.3 using Definition 2.7.
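Theorem 2.2 is the basis of the inverse-transform method of simulation. A sketch follows; the exponential distribution is my own illustrative choice here (not from the text), picked because its F⁻¹ has a closed form:

```python
import math
import random

def exp_inverse_cdf(u, lam):
    """F^{-1}(u) for the exponential distribution F(x) = 1 - exp(-lam * x)."""
    return -math.log(1.0 - u) / lam

random.seed(3)
lam = 2.0
# Theorem 2.2: feeding Uniform(0,1) samples through F^{-1} yields samples with law F.
samples = [exp_inverse_cdf(random.random(), lam) for _ in range(100_000)]
mean = sum(samples) / len(samples)                        # exponential mean is 1/lam = 0.5
below_1 = sum(s <= 1.0 for s in samples) / len(samples)   # estimates F(1) = 1 - e^{-2}
```

Both the sample mean and the empirical distribution function match the target distribution.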

Exercise 2.7. Prove Lemma 2.4.

Exercise 2.8. Let X be a Bernoulli random variable with parameter p. Draw the graphs of E[X], Var[X] and σ[X] against p. How can you evaluate X?

Exercise 2.9. Prove Var[X + Y] = Var[X] + Var[Y] for any pair of independent random variables X and Y.

Exercise 2.10 (Binomial random variable). Let X be a random variable with

X = Σ_{i=1}^n X_i,  (2.30)

where the X_i are independent Bernoulli random variables with mean p. The random variable X is said to be a binomial random variable. Find the mean and variance of X.

Exercise 2.11. Prove that for any pair of random variables we have

−1 ≤ ρ[X, Y] ≤ 1.  (2.31)

Chapter 3

Normal Random Variables

Normal random variables are an important tool to analyze series of independent random variables.

3.1 What is a Normal Random Variable?

Let's begin with the definition of normal random variables.

Definition 3.1 (Normal random variable). Let X be a random variable with probability density function

dP{X ≤ x} = f(x) dx = (1/((2π)^{1/2} σ)) e^{−(x−µ)²/2σ²} dx,  (3.1)

for some µ and σ. The random variable is called the normal random variable with parameters µ and σ.

Theorem 3.1 (Mean and variance of normal random variables). Let X be a normal random variable with parameters µ and σ. Then we have the mean and the variance

E[X] = µ,  (3.2)
Var[X] = σ².  (3.3)

Proof. See Exercise 3.3.

Definition 3.2 (Standard normal random variable). Let X be a normal random variable with µ = 0 and σ = 1. The random variable is called the standard normal random variable.

Lemma 3.1. Let X be a normal random variable with mean µ and standard deviation σ. Set Z = (X − µ)/σ. Then Z is the standard normal random variable.

Proof. See Exercise 3.5.

Theorem 3.2. Let (X_i)_{i=1,2,...,n} be independent normal random variables with means µ_i and standard deviations σ_i. Then the sum of these random variables, X = Σ_{i=1}^n X_i, is again a normal random variable, with

µ = Σ_{i=1}^n µ_i,  (3.4)
σ² = Σ_{i=1}^n σ_i².  (3.5)

Proof. We only prove that X satisfies (3.4) and (3.5). By Lemma 2.2,

µ = E[X] = E[Σ_{i=1}^n X_i] = Σ_{i=1}^n E[X_i] = Σ_{i=1}^n µ_i.

Also, by Lemma 2.4, we have

σ² = Var[X] = Var[Σ_{i=1}^n X_i] = Σ_{i=1}^n Var[X_i] = Σ_{i=1}^n σ_i².

We can prove that X is indeed a normal random variable by the so-called characteristic function method (or Fourier transform).

So it is very comfortable to be in the world of normal random variables: it is closed under addition!

3.2 Lognormal Random Variables

Definition 3.3 (Lognormal random variable). A random variable Y is said to be lognormal if log(Y) is a normal random variable. Thus, a lognormal random variable can be expressed as

Y = e^X,  (3.6)

where X is a normal random variable.

Lognormal random variables play a major role in finance theory!
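Theorem 3.2 above can be checked numerically. The sketch below sums three independent normals and compares the sample mean and variance with (3.4) and (3.5); the parameter values are illustrative choices of mine, not from the text:

```python
import random

random.seed(4)
mus = [1.0, -2.0, 0.5]
sigmas = [1.0, 2.0, 0.5]
trials = 100_000
# X = X_1 + X_2 + X_3; Theorem 3.2 predicts mean -0.5 and variance 1 + 4 + 0.25 = 5.25.
sums = [sum(random.gauss(m, s) for m, s in zip(mus, sigmas)) for _ in range(trials)]
mean = sum(sums) / trials
var = sum((x - mean) ** 2 for x in sums) / trials
```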

Theorem 3.3 (Lognormal). If X is a normal random variable with mean µ and standard deviation σ, the lognormal random variable Y = e^X has mean and variance

E[Y] = e^{µ+σ²/2},  (3.7)
Var[Y] = e^{2µ+2σ²} − e^{2µ+σ²}.  (3.8)

It is important to see that the mean of the lognormal random variable depends not only on the mean of the original normal random variable but also on its standard deviation.

Proof. Let us assume, for a while, that X is the standard normal random variable. Let m(t) be the moment generating function of X, i.e.,

m(t) = E[e^{tX}].  (3.9)

Then, by differentiating m(t) and setting t = 0, we have

m′(0) = (d/dt) m(t)|_{t=0} = E[X e^{tX}]|_{t=0} = E[X].  (3.10)

Further, we have m″(0) = E[X²].

On the other hand, since X is the standard normal random variable, we have

m(t) = E[e^{tX}] = (1/√(2π)) ∫ e^{tx} e^{−x²/2} dx.

Since tx − x²/2 = {t² − (x − t)²}/2, we have

m(t) = (1/√(2π)) e^{t²/2} ∫ e^{−(x−t)²/2} dx,

where the integrand of the right-hand side is nothing but the density of the normal random variable N(t, 1). Thus,

m(t) = e^{t²/2}.

More generally, when X is a normal random variable with distribution N(µ, σ), we can obtain (see Exercise 3.6)

m(t) = E[e^{tX}] = e^{µt+σ²t²/2}.  (3.11)

Since Y = e^X, we have

E[Y] = m(1) = e^{µ+σ²/2},  (3.12)

and

E[Y²] = m(2) = e^{2µ+2σ²}.  (3.13)

Thus,

Var[Y] = E[Y²] − E[Y]² = e^{2µ+2σ²} − e^{2µ+σ²}.  (3.14)

Let S(n) be the price of a stock at time n, and let Y(n) be the growth rate of the stock, i.e.,

Y(n) = S(n)/S(n−1).  (3.15)

In mathematical finance, it is commonly assumed that the Y(n) are independent and identically distributed as lognormal random variables. Taking the log on both sides of (3.15), we have

log S(n) = log S(n−1) + X(n),  (3.16)

where X(n) = log Y(n). If X(n) is regarded as an error term and normally distributed, the above assumption is validated.

Example 3.1 (Stock price rises in two weeks in a row). Suppose Y(n), the growth rate of a stock in the n-th week, is independent and lognormally distributed with parameters µ and σ. We will find the probability that the stock price rises two weeks in a row.

First, we estimate P{the stock rises}. Since y > 1 is equivalent to log y > 0, we have

P{the stock rises} = P{S(1) > S(0)}
                   = P{S(1)/S(0) > 1}
                   = P{log(S(1)/S(0)) > 0}
                   = P{X > 0},

where X = log Y(1) is N(µ, σ). Define the standardized variable

Z = (X − µ)/σ,  (3.17)

which is the standard normal random variable. Hence we have

P{S(1) > S(0)} = P{(X − µ)/σ > −µ/σ} = P{Z > −µ/σ} = P{Z < µ/σ},

where we used the symmetry of the normal distribution (see Exercise 3.1).

Now we consider the probability of two consecutive rises of the stock price. Since the Y(n) are assumed to be independent, we have

P{the stock rises two weeks in a row} = P{Y(1) > 1, Y(2) > 1}
                                      = P{Y(1) > 1}P{Y(2) > 1}
                                      = P{Z < µ/σ}².

3.3 References

3.4 Exercises

Exercise 3.1. Prove the symmetry of the normal distribution. That is, let X be a normal random variable with mean µ and standard deviation σ; then for any x, we have

P{X > µ + x} = P{X < µ − x}.  (3.18)

Exercise 3.2 (Moment generating function). Let X be a random variable. The function m(t) = E[e^{tX}] is said to be the moment generating function of X.

1. Prove E[X] = m′(0).
2. Prove E[Xⁿ] = (dⁿ/dtⁿ) m(t)|_{t=0}.
3. Rewrite the variance of X using m(t).

Exercise 3.3. Let X be a normal random variable with parameters µ = 0 and σ = 1.

1. Find the moment generating function of X.
2. By differentiation, find the mean and variance of X.

Exercise 3.4. Use Microsoft Excel to draw the graph of the probability density f(x) = dP{X ≤ x}/dx of a normal random variable X with

1. µ = 5 and σ = 1,
2. µ = 5 and σ = 2,
3. µ = 4 and σ = 3.

What can you say about these graphs, especially for large x? Click help in your Excel to find the appropriate function. Of course, it won't help you find the answer, though...

Exercise 3.5. Let X be a normal random variable with mean µ and standard deviation σ. Set Z = (X − µ)/σ. Prove that Z is the standard normal random variable.

Exercise 3.6. Verify (3.11) by using Lemma 3.1.

Exercise 3.7. Let Y = e^X be a lognormal random variable, where X is N(µ, σ). Find E[Y] and Var[Y].
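Theorem 3.3 and Example 3.1 can be checked by simulation. In the sketch below, the parameters µ = 0.1 and σ = 0.3 are illustrative choices of mine, not from the text:

```python
import math
import random

random.seed(5)
mu, sigma = 0.1, 0.3
# Lognormal samples Y = e^X with X ~ N(mu, sigma).
ys = [math.exp(random.gauss(mu, sigma)) for _ in range(200_000)]

mean = sum(ys) / len(ys)
theory = math.exp(mu + sigma ** 2 / 2)       # Theorem 3.3: E[Y] = e^{mu + sigma^2/2}
p_rise = sum(y > 1.0 for y in ys) / len(ys)  # P{Y > 1} = P{Z < mu/sigma}, Example 3.1
```

Note that the sample mean exceeds e^µ: the standard deviation contributes to the mean, exactly as (3.7) says.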

Chapter 4

Useful Probability Theorems

4.1 The Law of Large Numbers

In many cases, we need to evaluate the average of a large number of independent and identically-distributed random variables. Perhaps you intuitively expect that the average will be approximated by its mean. This intuition is validated by the following theorems.

Theorem 4.1 (Weak law of large numbers). Let X₁, X₂, ... be an i.i.d. sequence of random variables with

µ = E[X_n].  (4.1)

Let S_n = Σ_{i=1}^n X_i. Then, for all ε > 0,

P{|S_n/n − µ| > ε} → 0 as n → ∞.  (4.2)

In other words, if we take a sufficiently large number of samples, the average will be close to µ with high probability. You may say: "OK, I understand. For a large sample, we can expect the average to be close to the mean most of the time. So it might be possible to have some exceptions..." In that case, we have another answer.

Theorem 4.2 (Strong law of large numbers). Let X₁, X₂, ... be an i.i.d. sequence of random variables with

µ = E[X_n].  (4.3)

Then, with probability 1, we have

S_n/n → µ as n → ∞.  (4.4)

So now we can say that in almost all trials, the average converges to µ.

Example 4.1. Gambler example.
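The weak law of large numbers (Theorem 4.1) can be illustrated numerically. In this sketch (my own; ε = 0.05 and the sample sizes are arbitrary choices) we measure how often the average of n fair-coin flips deviates from µ = 1/2 by more than ε:

```python
import random

random.seed(6)

def sample_average(n):
    """Average of n Bernoulli(1/2) samples."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

eps, runs = 0.05, 2000
dev_freq = {}
for n in (10, 100, 1000):
    # Empirical frequency of the event |S_n/n - mu| > eps in (4.2).
    dev_freq[n] = sum(abs(sample_average(n) - 0.5) > eps for _ in range(runs)) / runs
```

The frequency of large deviations shrinks as n grows, which is exactly the statement of (4.2).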

4.2 Poisson Random Variables and the Law of Small Numbers

Compared to the theorems of large numbers, the law of small numbers is less famous, but sometimes it gives us a great tool to analyze stochastic events. First of all, we define the Poisson random variable.

Definition 4.1 (Poisson random variable). A random variable N is said to be a Poisson random variable when

P{N = n} = (λⁿ/n!) e^{−λ},  (4.5)

where λ is a constant equal to its mean E[N].

Theorem 4.3. Let N be a Poisson random variable. Then

Var[N] = λ.  (4.6)

Theorem 4.4 (The law of small numbers). The number of many independent rare events can be approximated by a Poisson random variable.

Example 4.2. Let N be the number of customers arriving at a shop. If each customer visits the shop independently and each visit is relatively rare, then N can be approximated by a Poisson random variable.

4.3 The Central Limit Theorem

Let X₁, X₂, ... be an i.i.d. sequence of random variables with

µ = E[X_n],  (4.7)
σ² = Var[X_n].  (4.8)

We would like to estimate the sum of these random variables, i.e.,

S_n = Σ_{i=1}^n X_i.  (4.9)

Theorem 4.5 (Central limit theorem). For large n, we have

P{(S_n − nµ)/(σ√n) ≤ x} ≈ Φ(x),  (4.10)

where Φ(x) is the distribution function of the standard normal distribution N(0,1). In other words,

S_n ≈ N(nµ, σ√n).  (4.11)

The central limit theorem indicates that no matter what the random variable X_i is like, the sum S_n can be regarded as a normal random variable. Instead of stating the lengthy and technically advanced proofs of the laws of large numbers and the central limit theorem, we give some examples of how S_n converges to a normal random variable.

Example 4.3 (Average of Bernoulli Random Variables). Let X_i be i.i.d. Bernoulli random variables with p = 1/2. Suppose we are going to evaluate the sample average A of the X_i's:

A = (1/n) Σ_{i=1}^n X_i.  (4.12)

Of course, we expect A to be close to the mean E[X_i] = 1/2, but how close? Figure 4.1 shows the histogram of 1000 different runs of A with n = 10. Since n = 10 is a relatively small sample, we have large deviations from the expected mean 1/2. On the other hand, when we have more samples, the sample average A tends to 1/2, as we see in Figures 4.2 and 4.3, which is a demonstration of the weak law of large numbers (Theorem 4.1).

As we see in Figures 4.1 to 4.3, we may still have occasional high and low averages. How about an individual A for a large sample n? Figure 4.4 shows the sample-path-level convergence of the sample average A to E[X] = 1/2, as can be expected from the strong law of large numbers (Theorem 4.2).

Now we know that A converges to 1/2. Further, due to the central limit theorem (Theorem 4.5), magnifying the histogram, the distribution can be approximated by the normal distribution. Figure 4.5 shows the detail of the histogram of A. The histogram is quite similar to the corresponding normal distribution.

4.4 Useful Estimations

We have the following two estimates of the distribution, which are sometimes very useful.

Theorem 4.6 (Markov's Inequality). Let X be a nonnegative random variable. Then for all a > 0,

P{X ≥ a} ≤ E[X]/a.  (4.13)

Proof. Let I_A be the indicator function of the set A. It is easy to see that

a I_{X ≥ a} ≤ X.  (4.14)

Taking expectations on both sides, we have

a P{X ≥ a} ≤ E[X].  (4.15)

[Figure 4.1: Histogram of the sample average A = (1/n) Σ_{i=1}^n X_i when n = 10, where X_i is a Bernoulli random variable with E[X_i] = 1/2.]

[Figure 4.2: The sample average A for a larger n.]

[Figure 4.3: The sample average A when n = 1000.]

[Figure 4.4: The sample path of the sample average A = (1/n) Σ_{i=1}^n X_i up to n = 100, where X_i is a Bernoulli random variable with E[X_i] = 1/2.]

[Figure 4.5: The detailed histogram of the sample average A = (1/n) Σ_{i=1}^n X_i when n = 10, where X_i is a Bernoulli random variable with E[X_i] = 1/2. The solid line is the corresponding normal distribution.]

Theorem 4.7 (Chernoff Bounds). Let X be a random variable with moment generating function M(t) = E[e^{tX}]. Then we have

P{X ≥ a} ≤ e^{−ta} M(t) for all t > 0,  (4.16)
P{X ≤ a} ≤ e^{−ta} M(t) for all t < 0.  (4.17)

Proof. For t > 0 we have

P{X ≥ a} = P{e^{tX} ≥ e^{ta}}.  (4.18)

Using Theorem 4.6,

P{e^{tX} ≥ e^{ta}} ≤ e^{−ta} E[e^{tX}].  (4.19)

Thus we have the result.

4.5 References

The proofs omitted in this chapter can be found in books such as those by Kiyoshi Ito and Durrett [3].

4.6 Exercises

Exercise 4.1. Something here!
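Markov's inequality (Theorem 4.6) and the Chernoff bound (4.16) can be compared numerically. A sketch with the exponential distribution, whose moment generating function is M(t) = 1/(1 − t) for t < 1; the choice of distribution and of a = 10 is my own illustration, not from the text:

```python
import math
import random

random.seed(7)
a = 10.0
xs = [random.expovariate(1.0) for _ in range(100_000)]   # nonnegative, E[X] = 1

tail = sum(x >= a for x in xs) / len(xs)                 # true tail P{X >= 10} = e^{-10}
markov = 1.0 / a                                         # Theorem 4.6: E[X]/a = 0.1
# Chernoff (4.16): minimize e^{-ta} M(t) over a grid of t in (0, 1).
chernoff = min(math.exp(-t * a) / (1.0 - t) for t in (i / 100 for i in range(1, 100)))
```

Both bounds hold, and for this far tail the Chernoff bound is much tighter than Markov's.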

Chapter 5

Brownian Motions and Poisson Processes

Brownian motions and Poisson processes are the most useful and practical tools for understanding stochastic processes appearing in the real world.

5.1 Geometric Brownian Motions

Definition 5.1 (Geometric Brownian motion). We say S(t) is a geometric Brownian motion with drift parameter µ and volatility parameter σ if for any y ≥ 0,

1. the growth rate S(t + y)/S(t) is independent of all history of S up to t, and
2. log(S(t + y)/S(t)) is a normal random variable with mean µy and variance σ²y.

Let S(t) be the price of a stock at time t. In mathematical finance theory, we often assume S(t) to be a geometric Brownian motion. If the stock price S(t) is a geometric Brownian motion, we can say that

1. the future price growth rate is independent of the past prices, and
2. the growth rate over a period of length t is lognormally distributed with parameters µt and σ²t.

The future price is probabilistically determined by the present price. Stochastic processes of this kind are sometimes referred to as Markov processes.

Lemma 5.1. If S(t) is a geometric Brownian motion, we have

E[S(t) | S(0) = s₀] = s₀ e^{t(µ+σ²/2)}.  (5.1)

Proof. Set Y = S(t)/S(0) = S(t)/s₀. Then Y is lognormal with parameters (tµ, tσ²). Thus, by Theorem 3.3, we have

E[Y] = e^{t(µ+σ²/2)}.

On the other hand, we have

E[Y] = E[S(t)]/s₀.

Hence we have (5.1).

Remark 5.1. Here the mean of S(t) increases exponentially with rate µ + σ²/2, not with rate µ. The parameter σ represents the fluctuation, but since the lognormal distribution has some bias, σ affects the average exponential growth rate.

Theorem 5.1. Let S(t) be a geometric Brownian motion with drift µ and volatility σ. Then, for a fixed t ≥ 0, S(t) can be represented as

S(t) = S(0) e^W,  (5.2)

where W is the normal random variable N(µt, σ²t).

Proof. We need to check that (5.2) satisfies the second part of Definition 5.1, which is easy: take the log on both sides of (5.2) and see that

log(S(t)/S(0)) = W.  (5.3)

Sometimes, instead of Definition 5.1, we use the following stochastic differential equation to define geometric Brownian motions:

dS(t) = µS(t) dt + σS(t) dB(t),  (5.4)

where B(t) is the standard Brownian motion and dB(t) is defined by the so-called Ito calculus. Note that the standard Brownian motion is a continuous but nowhere-differentiable function, so terms like dB(t)/dt should be treated appropriately. This treatment requires knowledge beyond this book; thus, the following part is not mathematically rigorous. The solution of this equation is

S(t) = S(0) e^{µt+σB(t)}.  (5.5)

Formally, if we differentiate (5.5), we can verify that it satisfies (5.4). It is easy to see that S(t) satisfies Definition 5.1.
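The representation S(t) = S(0)e^W of Theorem 5.1 gives a direct way to simulate S(t) and check Lemma 5.1. A sketch (the parameter values s₀ = 100, µ = 0.05, σ = 0.2, t = 1 are illustrative choices of mine):

```python
import math
import random

random.seed(8)
s0, mu, sigma, t = 100.0, 0.05, 0.2, 1.0
# Theorem 5.1: S(t) = S(0) e^W with W ~ N(mu*t, sigma^2 * t).
samples = [s0 * math.exp(random.gauss(mu * t, sigma * math.sqrt(t)))
           for _ in range(200_000)]
mean = sum(samples) / len(samples)
theory = s0 * math.exp(t * (mu + sigma ** 2 / 2))   # Lemma 5.1: growth rate mu + sigma^2/2
```

As Remark 5.1 says, the sample mean grows at rate µ + σ²/2, not µ.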

Theorem 5.2. The geometric Brownian motion S(t) satisfies the stochastic differential equation

dS(t) = µS(t)dt + σS(t)dB(t),   (5.6)

with the initial condition S(0).

5.2 Discrete Time Tree Binomial Process

In this section, we imitate in discrete time the dynamics of a stock price that follows a geometric Brownian motion.

Definition 5.2 (Tree binomial process). The process S_n is said to be a tree binomial process if its dynamics is expressed as

S_n = u S_{n−1} with probability p,
S_n = d S_{n−1} with probability 1 − p,   (5.7)

for some constant factors u > d ≥ 0 and some probability p.

Figure 5.1 depicts a sample path of a tree binomial process. Do you think it is reasonably similar to Figure 5.2, which shows an exchange rate of the yen?

Theorem 5.3 (Approximation of geometric Brownian motion). A geometric Brownian motion S(t) with drift µ and volatility σ can be regarded as the limit of a tree binomial process with the parameters:

d = e^{−σ√Δ},   (5.8)
u = e^{σ√Δ},   (5.9)
p = (1/2)(1 + (µ/σ)√Δ).   (5.10)

Proof. Let Δ be a small interval and n = t/Δ. Define a tree binomial process on the discrete times {iΔ} by

S(iΔ) = u S((i−1)Δ) with probability p,
S(iΔ) = d S((i−1)Δ) with probability 1 − p,   (5.11)

for all i ≤ n. The multiplication factors d and u, as well as the probability p, are given the specific values above so as to imitate the dynamics of the Brownian motion. Let Y_i be the indicator of an up move (a Bernoulli random variable), i.e.,

Y_i = 1 if up at time iΔ,
Y_i = 0 if down at time iΔ.   (5.12)
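As a sanity check on Theorem 5.3, the sketch below (Python for illustration; the parameter values µ = 0.1, σ = 0.3, Δ = 0.01 are arbitrary choices) generates tree binomial paths with these u, d and p and estimates the mean and variance of log(S(t)/S(0)), which should be close to µt and σ²t.

```python
import math
import random

def tree_binomial_log_return(mu, sigma, delta, t, rng):
    """log(S(t)/S(0)) for one tree binomial path, parameters as in Theorem 5.3."""
    u = math.exp(sigma * math.sqrt(delta))
    d = math.exp(-sigma * math.sqrt(delta))
    p = 0.5 * (1 + (mu / sigma) * math.sqrt(delta))
    log_s = 0.0
    for _ in range(int(t / delta)):
        log_s += math.log(u) if rng.random() < p else math.log(d)
    return log_s

rng = random.Random(1)
mu, sigma, delta, t = 0.1, 0.3, 0.01, 1.0
samples = [tree_binomial_log_return(mu, sigma, delta, t, rng) for _ in range(20_000)]
m = sum(samples) / len(samples)
v = sum((x - m) ** 2 for x in samples) / len(samples)
print(m, v)  # should be close to mu*t = 0.1 and sigma^2*t = 0.09
```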

Figure 5.1: An example of a tree binomial process with d = , u = , p = , starting from S_0 = 100.

Figure 5.2: Exchange rate of the yen on 2005/10/19. Adopted from

Thus, the number of up moves up to time nΔ is Σ_{i=1}^{n} Y_i, and the number of down moves is n − Σ_{i=1}^{n} Y_i. Hence, given the initial price S(0), the stock price at time nΔ is

S(nΔ) = S(0) u^{Σ Y_i} d^{n − Σ Y_i} = S(0) d^n (u/d)^{Σ_{i=1}^{n} Y_i}.

Now set

S(t) = S(0) d^{t/Δ} (u/d)^{Σ_{i=1}^{t/Δ} Y_i},   (5.13)

and let us show that, as Δ → 0, the limit of S(t) is a geometric Brownian motion with drift µ and volatility σ. Taking the logarithm of both sides, we have

log(S(t)/S(0)) = (t/Δ) log d + log(u/d) Σ_{i=1}^{t/Δ} Y_i.   (5.14)

To imitate the dynamics of the geometric Brownian motion, here we artificially set

d = e^{−σ√Δ},  u = e^{σ√Δ},  p = (1/2)(1 + (µ/σ)√Δ).

Note that log d and log u are symmetric, while d and u themselves are asymmetric, and that p → 1/2 as Δ → 0. Now, using these values, we have

log(S(t)/S(0)) = −tσ/√Δ + 2σ√Δ Σ_{i=1}^{t/Δ} Y_i.   (5.15)

Consider taking the limit Δ → 0; then the number of terms in the sum increases. Using the central limit theorem (Theorem 4.5), we have the approximation

Σ_{i=1}^{t/Δ} Y_i ≈ normally distributed.   (5.16)

Thus, log(S(t)/S(0)) also has a normal distribution, with mean

E[log(S(t)/S(0))] = −tσ/√Δ + 2σ√Δ Σ_{i=1}^{t/Δ} E[Y_i]
= −tσ/√Δ + 2σ√Δ (t/Δ) p
= −tσ/√Δ + (σt/√Δ)(1 + (µ/σ)√Δ)
= µt,

and variance

Var[log(S(t)/S(0))] = 4σ²Δ Σ_{i=1}^{t/Δ} Var[Y_i]
= 4σ²Δ (t/Δ) p(1 − p)
= 4σ²t p(1 − p) → σ²t,

as Δ → 0, since p → 1/2. Thus,

log(S(t)/S(0)) → N(µt, σ²t),   (5.17)

and moreover S(t + y)/S(y) is independent of the past by the construction of S(t). Hence the limit of S(t) as Δ → 0 is a geometric Brownian motion; in other words, any geometric Brownian motion S(t) with drift µ and volatility σ can be approximated by an appropriate tree binomial process.

Remark 5.2. In what follows, to establish properties of geometric Brownian motions, we sometimes check that the property holds for the corresponding tree binomial process and then take the appropriate limit.

Example 5.1 (Trajectories of geometric Brownian motions). This is an example of different trajectories of geometric Brownian motions. Here we set the drift µ = 0.1, the volatility σ = 0.3 and the interval Δ = 0.01. As pointed out in Remark 5.2, we use a tree binomial process as an approximation of the geometric Brownian motion; the corresponding parameters are d = e^{−0.03} ≈ 0.970, u = e^{0.03} ≈ 1.030 and p ≈ 0.517. It is important to see that the paths of a geometric Brownian motion differ from realization to realization, even with the same parameters: a path can rise as well as fall. Figure 5.4 gathers 10 such trajectories. Since we set the drift µ = 0.1 > 0, we can see an upward trend, although the individual trajectories are radically different. On the other hand, Figure 5.5 shows geometric Brownian motions with the negative drift µ = −0.1, for which the probability of an upward change is p ≈ 0.483. Can you see the difference between Figure 5.5 and Figure 5.4?

5.3 Brownian Motions

Definition 5.3 (Brownian motion). A stochastic process S(t) is said to be a Brownian motion if S(t + y) − S(y) is a normal random variable distributed as N(µt, σ²t), independent of the past history up to time y.

Note that for a Brownian motion with S(0) = 0, we have

E[S(t)] = µt,   (5.18)
Var[S(t)] = σ²t.   (5.19)
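Definition 5.3 translates directly into a simulation: add independent N(µΔ, σ²Δ) increments. The sketch below (Python for illustration; µ = 0.1, σ = 0.3 and the step Δ = 0.01 are arbitrary choices) checks (5.18) and (5.19) at t = 1 by Monte Carlo, starting from S(0) = 0.

```python
import math
import random

def brownian_path(mu, sigma, delta, n_steps, rng):
    """A Brownian motion path per Definition 5.3: independent normal increments,
    each N(mu*delta, sigma^2*delta), starting from S(0) = 0."""
    s = [0.0]
    for _ in range(n_steps):
        s.append(s[-1] + rng.gauss(mu * delta, sigma * math.sqrt(delta)))
    return s

rng = random.Random(3)
finals = [brownian_path(0.1, 0.3, 0.01, 100, rng)[-1] for _ in range(20_000)]
m = sum(finals) / len(finals)
v = sum((x - m) ** 2 for x in finals) / len(finals)
print(m, v)  # E[S(1)] should be near mu = 0.1, Var[S(1)] near sigma^2 = 0.09
```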

Figure 5.3: Another example of a geometric Brownian motion with drift µ = 0.1, volatility σ = 0.3 and interval Δ = 0.01, starting from S_0 = 100.

Figure 5.4: Many trajectories of a geometric Brownian motion with upward drift µ = 0.1, volatility σ = 0.3 and interval Δ = 0.01, starting from S_0 = 100.

Figure 5.5: Many trajectories of a geometric Brownian motion with downward drift µ = −0.1.

Thus, both quantities increase as t increases. Brownian motions are more widely used than geometric Brownian motions. However, for applications in finance theory, Brownian motion has two major drawbacks. First, a Brownian motion is allowed to take negative values. Second, for a Brownian motion, the price differences over time intervals of the same length have the same distribution regardless of the current price level, which is not realistic in the real financial world. An example of a trajectory of a two-dimensional Brownian motion can be found in Figure 5.6.

Figure 5.6: An example of a trajectory of a two-dimensional Brownian motion.

5.4 Poisson Processes

Definition 5.4 (Counting process). Let N(t) be the number of events during [0, t). The process N(t) is called a counting process.

Definition 5.5 (Independent increments). A stochastic process N(t) is said to have independent increments if

N(t_1) − N(t_0), N(t_2) − N(t_1), ..., N(t_n) − N(t_{n−1})   (5.20)

are independent for every choice of the time instants t_0 < t_1 < ... < t_n.

Thus, intuitively, the future direction of a process with independent increments is independent of its past history.

Definition 5.6 (Poisson process). A Poisson process N(t) is a counting process of events with the following features:

1. N(0) = 0,
2. N(t) has independent increments,
3. P{N(t + s) − N(s) = n} = e^{−λt}(λt)^n / n!,

where λ > 0 is the rate of the events.

Theorem 5.4. Let N(t) be a Poisson process with rate λ. Then we have

E[N(t)] = λt,   (5.21)
Var[N(t)] = λt.   (5.22)

Rewriting the first equality, we have

λ = E[N(t)] / t,   (5.23)

which validates calling λ the rate of the process.

Proof. By Definition 5.6, we have

E[N(t)] = Σ_{n=0}^{∞} n P{N(t) = n}
= Σ_{n=0}^{∞} n e^{−λt} (λt)^n / n!
= λt e^{−λt} Σ_{n=1}^{∞} (λt)^{n−1} / (n−1)!
= λt e^{−λt} e^{λt}
= λt.

The other equality is left for the reader to prove (see Exercise 5.2).
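Theorem 5.4 can also be checked by simulation. The sketch below (Python for illustration; λ = 2 and t = 3 are arbitrary choices) generates N(t) using the standard fact, not proved in this chapter, that the interarrival times of a Poisson process of rate λ are exponential with mean 1/λ, and then estimates E[N(t)] and Var[N(t)].

```python
import random

def poisson_count(lam, t, rng):
    """N(t): number of events in [0, t) when interarrival times are Exp(lam)."""
    n, clock = 0, rng.expovariate(lam)
    while clock < t:
        n += 1
        clock += rng.expovariate(lam)
    return n

rng = random.Random(4)
lam, t = 2.0, 3.0
counts = [poisson_count(lam, t, rng) for _ in range(20_000)]
m = sum(counts) / len(counts)
v = sum((x - m) ** 2 for x in counts) / len(counts)
print(m, v)  # both should be close to lam * t = 6, as Theorem 5.4 asserts
```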

5.5 References

5.6 Exercises

Exercise 5.1. Find E[S_n] and Var[S_n] for the tree binomial process.

Exercise 5.2. Let N(t) be a Poisson process with rate λ, and prove

Var[N(t)] = λt.   (5.24)

Chapter 6 Simulating Brownian Motions

We simulate Brownian motions using Mathematica, following the steps below.

6.1 Mathematica As an Advanced Calculator

Mathematica can handle mathematical formulas and derive results with as much precision as possible. You write an appropriate formula after the prompt In[ ]:=, and hit shift + return to obtain the result.

Example 6.1 (Calculations). These are some examples of calculations in Mathematica.

In[2]:= 1/3 + 1/4
Out[2]= 7/12
In[3]:= Expand[(a + b)^2]
Out[3]= a^2 + 2 a b + b^2
In[4]:= D[a x^2 + b x + c, x]
Out[4]= b + 2 a x

Common mathematical symbols are built-in.

Example 6.2 (Symbols). Here are examples of symbols in Mathematica.

Pi = π   (6.1)
I = i, the imaginary unit   (6.2)
Log[x] = log_e x   (6.3)
Sin[θ] = sin(θ)   (6.4)

6.2 Generating Random Variables

Generating random variables is the core of simulation. The function Random[ ] creates a uniform random variable on the interval [0,1]; each time the function is called, a new random number is generated.

Example 6.3 (Random variable sequence). Here is a simple example of how to generate a sequence of random variables.

In[3]:= SeedRandom[128]
In[4]:= {Random[], Random[], Random[]}
Out[4]= { , , }
In[5]:= SeedRandom[128]
In[6]:= {Random[], Random[], Random[]}
Out[6]= { , , }

The function SeedRandom sets the seed of the random number generator. If no SeedRandom is set, Mathematica sets the seed from the current time, so that a different random sequence is generated each time.

Example 6.4. You can create a table by using the function Table.

In[1]:= Table[Random[Integer], {10}]
Out[1]= {1, 1, 0, 0, 0, 1, 0, 1, 1, 0}

Here is one more example, generating a kind of random walk.

In[1]:= S = 0; Table[t = Random[Integer]; S = S + t, {10}]
Out[1]= {0, 1, 2, 2, 2, 2, 3, 4, 5, 5}

6.3 Two-Dimensional Brownian Motions

6.3.1 Random Variable on Unit Circle

Generate a pair of random variables (X, Y) on the unit circle as shown in Figure 6.1. The pair should satisfy the relation X² + Y² = 1. Check your pairs of random variables by drawing a graph like Figure 6.1. You can use the function ListPlot to display the results.

6.3.2 Generating Brownian Motion

Using the pairs of random variables as the movements, simulate a Brownian motion as shown in Figure 6.2.
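For readers who prefer to prototype outside Mathematica, here is one possible way (a sketch, in Python) to do both tasks of Section 6.3: draw a uniform angle θ, take (cos θ, sin θ) as a point on the unit circle, then accumulate such unit steps into a two-dimensional walk.

```python
import math
import random

def unit_circle_point(rng):
    """A random point (X, Y) on the unit circle, so X^2 + Y^2 = 1."""
    theta = rng.uniform(0.0, 2.0 * math.pi)
    return math.cos(theta), math.sin(theta)

rng = random.Random(5)
points = [unit_circle_point(rng) for _ in range(1000)]

# Use the unit steps as the movements of a two-dimensional walk (cf. Figure 6.2).
x, y = 0.0, 0.0
path = [(x, y)]
for dx, dy in points:
    x, y = x + dx, y + dy
    path.append((x, y))
```

Plotting `path` with any plotting tool produces a picture like Figure 6.2.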

Figure 6.1: An example of random variables on the unit circle.

Figure 6.2: An example of a trajectory of a two-dimensional Brownian motion.

6.4 Geometric Brownian Motions

Simulate geometric Brownian motions with drift µ and volatility σ, using the tree binomial process approximation.

6.4.1 Generating Bernoulli Random Variables

Generate a Bernoulli random variable T with mean E[T] = p. You may use the package Statistics`DiscreteDistributions`, if needed. Make a sequence of Bernoulli random variables such as

Out[xx]= {0,1,1,0,1,0,0,0,1,0,0,1,1,0,0,0,0,1,0,0,1,0,0,1,1,1,1,0,1,0,1,0,0,1,0,1,1,1}.

Then, use this sequence to generate the multiplication factors of the tree binomial process, as shown below.

Out[xx]= { , , , , , , , }

6.4.2 Generating Geometric Brownian Motions

Take Δ = 0.01 and set appropriate u, d and p for the tree binomial process. Generate an approximation of a geometric Brownian motion with drift µ and volatility σ as in Figure 6.3.

Figure 6.3: Another example of a geometric Brownian motion with drift µ = 0.1, volatility σ = 0.3 and interval Δ = 0.01, starting from S_0 = 100.
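One possible solution sketch for Section 6.4, written in Python for illustration (with µ = 0.1 and σ = 0.3 chosen to match Figure 6.3): draw the Bernoulli sequence, map it to the multiplication factors u and d of Theorem 5.3, and take cumulative products.

```python
import math
import random

delta, mu, sigma = 0.01, 0.1, 0.3
u = math.exp(sigma * math.sqrt(delta))            # up factor
d = math.exp(-sigma * math.sqrt(delta))           # down factor
p = 0.5 * (1 + (mu / sigma) * math.sqrt(delta))   # probability of an up move

rng = random.Random(6)
bernoulli = [1 if rng.random() < p else 0 for _ in range(200)]  # the 0/1 sequence
factors = [u if b else d for b in bernoulli]                    # multiplication factors

prices = [100.0]  # S(0) = 100, as in the figures
for f in factors:
    prices.append(prices[-1] * f)
```

The list `prices` is the approximate geometric Brownian motion path on [0, 2].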

Chapter 7 Present Value Analysis

7.1 Interest Rates

Let r be the interest rate for some term, and let P be the amount of money you deposit in the bank. Then, at the end of the term you get

P + rP = (1 + r)P.   (7.1)

Example 7.1 (Compounding). If the interest rate r is compounded semi-annually, you get

P(1 + r/2)(1 + r/2) = P(1 + r/2)²,   (7.2)

at the end of the year. Sometimes, we call r the nominal interest rate or yearly interest rate.

Definition 7.1 (Effective interest rate). Let P_0 be the amount of money you invest initially, and P_1 the amount of money you obtain at the end of the year. Then the value r_e is said to be the effective interest rate, where

r_e = (P_1 − P_0)/P_0.   (7.3)

In other words, given the initial investment P_0, the amount you get at the end of a year can be expressed as

P_1 = P_0(1 + r_e).   (7.4)

Example 7.2 (Compound interest rate). Assume the same situation as in Example 7.1. The effective interest rate r_e in this case is

r_e = (P_1 − P_0)/P_0 = (P(1 + r/2)² − P)/P = (1 + r/2)² − 1.
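The computation in Example 7.2 generalizes to n compounding periods per year. A small sketch (Python, with r = 10% as an illustrative value):

```python
def effective_rate(r, n):
    """Effective annual rate when the nominal rate r is compounded n times a year."""
    return (1 + r / n) ** n - 1

r_e = effective_rate(0.10, 2)  # semi-annual compounding, as in Example 7.1
print(r_e)  # (1 + 0.05)^2 - 1 = 0.1025, up to floating-point rounding
```

So a 10% nominal rate compounded semi-annually is effectively 10.25% per year.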

Remark 7.1. In economics, the term nominal interest rate means the interest rate without the effect of inflation taken into account, while the rate adjusted for inflation is usually called the real interest rate; this usage differs from the effective interest rate defined here.

7.2 Continuously Compounded Interest Rates

First, let us recall the definition of the exponential, or Euler's constant.

Definition 7.2 (Exponential). We define e by the following:

e = lim_{n→∞} (1 + 1/n)^n.   (7.5)

The following is a useful expression of the so-called exponential function Exp(x) = e^x.

Lemma 7.1 (Exponential).

e^x = lim_{n→∞} (1 + x/n)^n.   (7.6)

Proof. By Definition 7.2, we have

e^x = {lim_{n→∞} (1 + 1/n)^n}^x
= lim_{n→∞} (1 + 1/n)^{xn}
= lim_{m→∞} (1 + x/m)^m,

where we set m = xn.

Suppose we divide one year into n equal intervals, and consider the n-times compounded yearly interest rate. Let r be the nominal interest rate and assume we initially invest P. Then, at the end of the year, we obtain

P(1 + r/n)^n.   (7.7)

Now let n be large and take the limit n → ∞, which turns out to be continuous compounding:

lim_{n→∞} P(1 + r/n)^n = Pe^r,   (7.8)

where we used Lemma 7.1.

Theorem 7.1 (Continuous compounding). Let r be the nominal interest rate. The effective interest rate of continuous compounding over one year is

r_e = e^r − 1.   (7.9)
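Lemma 7.1 and Theorem 7.1 can be observed numerically. The sketch below (Python, with r = 0.05 as an arbitrary example) shows (1 + r/n)^n approaching e^r as n grows.

```python
import math

r = 0.05
values = {n: (1 + r / n) ** n for n in (1, 12, 365, 1_000_000)}
for n, v in values.items():
    print(n, v)  # increases toward e^r as n grows

r_e = math.exp(r) - 1  # effective rate under continuous compounding (Theorem 7.1)
print(r_e)             # slightly larger than the nominal rate r
```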

Proof. By Definition 7.1,

r_e = (P_1 − P_0)/P_0 = (P_0 e^r − P_0)/P_0 = e^r − 1.

Corollary 7.1. The effective interest rate is always larger than the nominal interest rate.

Proof. By Theorem 7.1, we have

r_e = e^r − 1 > r,   (7.10)

for r > 0.

7.3 Present Value

Suppose we will receive the amount of money v at the end of period i, where r is the nominal interest rate per period. What can we say about the value of this money now? More precisely, when someone offers us the above investment, what is its rational price? The answer is as follows. Suppose we borrow the amount X at time 0 and must repay it with interest at time i; that is, we must pay X(1 + r)^i at the end of period i. If we have the relation

X(1 + r)^i = v,   (7.11)

then we can pay back the money. So, the present value of the amount v at time i is equal to v(1 + r)^{−i}. The factor (1 + r)^{−i} brings the future value back to the present value. More generally, we have the following theorem for cash streams.

Theorem 7.2 (Present value of cash stream). Let r be the interest rate. Given the cash stream

a = (a_1, a_2, ..., a_n),   (7.12)

where a_i is the return at the end of year i, the cash stream a has the present value

PV(a) = Σ_{i=1}^{n} a_i / (1 + r)^i.   (7.13)

In other words, if we deposit the amount PV(a) in a bank at time 0, we can obtain a cash stream equivalent to a.
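Theorem 7.2 and its bank-account proof can be checked in a few lines (Python sketch; the cash stream and rate are arbitrary choices):

```python
def present_value(cash_stream, r):
    """PV of (a_1, ..., a_n), where a_i is received at the end of year i (Theorem 7.2)."""
    return sum(a / (1 + r) ** i for i, a in enumerate(cash_stream, start=1))

stream, r = [100.0, 100.0, 100.0], 0.05
pv = present_value(stream, r)

# Replay the proof: deposit pv, accrue interest each year, withdraw a_i.
balance = pv
for a in stream:
    balance = balance * (1 + r) - a
print(pv, balance)  # the final balance is 0 up to floating-point error
```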

Proof. At the end of the first year, we withdraw a_1 from our bank account. Thus, taking the interest rate r into account, the remainder is

(1 + r)PV(a) − a_1 = a_2/(1 + r) + ... + a_n/(1 + r)^{n−1}.   (7.14)

Continuing this procedure, at the end of year i we have

a_{i+1}/(1 + r) + ... + a_n/(1 + r)^{n−i},   (7.15)

in our bank account. At the end of year n − 1, we have only a_n/(1 + r) left. Thus, we pay out a_n at the end of year n, and everything is cleared.

We can even consider a perpetual payment.

Corollary 7.2 (Present value of perpetual payment). Suppose we have a perpetual cash stream of c, i.e.,

a = (c, c, c, ...).   (7.16)

Then, given the interest rate r, the present value of this cash stream is PV(a) = c/r.

Proof. By extending Theorem 7.2, we have

PV(a) = Σ_{i=1}^{∞} c/(1 + r)^i
= (c/(1 + r)) Σ_{i=0}^{∞} 1/(1 + r)^i
= (c/(1 + r)) · 1/(1 − 1/(1 + r))
= c/r.

Remark 7.2. The result of Corollary 7.2 is not surprising if we consider depositing c/r in a bank: we then receive c as annual interest without touching the original principal.

7.4 Rate of Return

Definition 7.3 (Rate of return). Suppose we pay a as the initial investment, and let b be its return after one period. Then we define the rate of return as

r = b/a − 1,   (7.17)

which is the interest rate that makes the present value of the return equal to the initial payment.
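Corollary 7.2 can be checked by truncating the infinite sum (Python sketch; c = 100 and r = 0.05 are arbitrary choices). The omitted tail is geometrically small, so even a modest truncation agrees with the closed form.

```python
c, r = 100.0, 0.05
truncated = sum(c / (1 + r) ** i for i in range(1, 2000))  # first 1999 payments
closed_form = c / r  # Corollary 7.2
print(truncated, closed_form)  # both are essentially 2000.0
```

This matches Remark 7.2: depositing 2000 at 5% yields exactly 100 of interest per year, forever.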


More information

Lecture 6: Option Pricing Using a One-step Binomial Tree. Thursday, September 12, 13

Lecture 6: Option Pricing Using a One-step Binomial Tree. Thursday, September 12, 13 Lecture 6: Option Pricing Using a One-step Binomial Tree An over-simplified model with surprisingly general extensions a single time step from 0 to T two types of traded securities: stock S and a bond

More information

The Binomial Lattice Model for Stocks: Introduction to Option Pricing

The Binomial Lattice Model for Stocks: Introduction to Option Pricing 1/33 The Binomial Lattice Model for Stocks: Introduction to Option Pricing Professor Karl Sigman Columbia University Dept. IEOR New York City USA 2/33 Outline The Binomial Lattice Model (BLM) as a Model

More information

Dr. Maddah ENMG 625 Financial Eng g II 10/16/06

Dr. Maddah ENMG 625 Financial Eng g II 10/16/06 Dr. Maddah ENMG 65 Financial Eng g II 10/16/06 Chapter 11 Models of Asset Dynamics () Random Walk A random process, z, is an additive process defined over times t 0, t 1,, t k, t k+1,, such that z( t )

More information

5. In fact, any function of a random variable is also a random variable

5. In fact, any function of a random variable is also a random variable Random Variables - Class 11 October 14, 2012 Debdeep Pati 1 Random variables 1.1 Expectation of a function of a random variable 1. Expectation of a function of a random variable 2. We know E(X) = x xp(x)

More information

Lecture 23: April 10

Lecture 23: April 10 CS271 Randomness & Computation Spring 2018 Instructor: Alistair Sinclair Lecture 23: April 10 Disclaimer: These notes have not been subjected to the usual scrutiny accorded to formal publications. They

More information

Version A. Problem 1. Let X be the continuous random variable defined by the following pdf: 1 x/2 when 0 x 2, f(x) = 0 otherwise.

Version A. Problem 1. Let X be the continuous random variable defined by the following pdf: 1 x/2 when 0 x 2, f(x) = 0 otherwise. Math 224 Q Exam 3A Fall 217 Tues Dec 12 Version A Problem 1. Let X be the continuous random variable defined by the following pdf: { 1 x/2 when x 2, f(x) otherwise. (a) Compute the mean µ E[X]. E[X] x

More information

Random Variables and Applications OPRE 6301

Random Variables and Applications OPRE 6301 Random Variables and Applications OPRE 6301 Random Variables... As noted earlier, variability is omnipresent in the business world. To model variability probabilistically, we need the concept of a random

More information

Chapter 5. Continuous Random Variables and Probability Distributions. 5.1 Continuous Random Variables

Chapter 5. Continuous Random Variables and Probability Distributions. 5.1 Continuous Random Variables Chapter 5 Continuous Random Variables and Probability Distributions 5.1 Continuous Random Variables 1 2CHAPTER 5. CONTINUOUS RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS Probability Distributions Probability

More information

This page intentionally left blank

This page intentionally left blank This page intentionally left blank An Elementary Introduction to Mathematical Finance, Third Edition This textbook on the basics of option pricing is accessible to readers with limited mathematical training.

More information

2. Modeling Uncertainty

2. Modeling Uncertainty 2. Modeling Uncertainty Models for Uncertainty (Random Variables): Big Picture We now move from viewing the data to thinking about models that describe the data. Since the real world is uncertain, our

More information

Stochastic Dynamical Systems and SDE s. An Informal Introduction

Stochastic Dynamical Systems and SDE s. An Informal Introduction Stochastic Dynamical Systems and SDE s An Informal Introduction Olav Kallenberg Graduate Student Seminar, April 18, 2012 1 / 33 2 / 33 Simple recursion: Deterministic system, discrete time x n+1 = f (x

More information

4.3 Normal distribution

4.3 Normal distribution 43 Normal distribution Prof Tesler Math 186 Winter 216 Prof Tesler 43 Normal distribution Math 186 / Winter 216 1 / 4 Normal distribution aka Bell curve and Gaussian distribution The normal distribution

More information

9 Expectation and Variance

9 Expectation and Variance 9 Expectation and Variance Two numbers are often used to summarize a probability distribution for a random variable X. The mean is a measure of the center or middle of the probability distribution, and

More information

Lecture Notes 6. Assume F belongs to a family of distributions, (e.g. F is Normal), indexed by some parameter θ.

Lecture Notes 6. Assume F belongs to a family of distributions, (e.g. F is Normal), indexed by some parameter θ. Sufficient Statistics Lecture Notes 6 Sufficiency Data reduction in terms of a particular statistic can be thought of as a partition of the sample space X. Definition T is sufficient for θ if the conditional

More information

The Black-Scholes Model

The Black-Scholes Model The Black-Scholes Model Liuren Wu Options Markets (Hull chapter: 12, 13, 14) Liuren Wu ( c ) The Black-Scholes Model colorhmoptions Markets 1 / 17 The Black-Scholes-Merton (BSM) model Black and Scholes

More information

MATH20180: Foundations of Financial Mathematics

MATH20180: Foundations of Financial Mathematics MATH20180: Foundations of Financial Mathematics Vincent Astier email: vincent.astier@ucd.ie office: room S1.72 (Science South) Lecture 1 Vincent Astier MATH20180 1 / 35 Our goal: the Black-Scholes Formula

More information

FINANCIAL OPTION ANALYSIS HANDOUTS

FINANCIAL OPTION ANALYSIS HANDOUTS FINANCIAL OPTION ANALYSIS HANDOUTS 1 2 FAIR PRICING There is a market for an object called S. The prevailing price today is S 0 = 100. At this price the object S can be bought or sold by anyone for any

More information

Binomial model: numerical algorithm

Binomial model: numerical algorithm Binomial model: numerical algorithm S / 0 C \ 0 S0 u / C \ 1,1 S0 d / S u 0 /, S u 3 0 / 3,3 C \ S0 u d /,1 S u 5 0 4 0 / C 5 5,5 max X S0 u,0 S u C \ 4 4,4 C \ 3 S u d / 0 3, C \ S u d 0 S u d 0 / C 4

More information

CS134: Networks Spring Random Variables and Independence. 1.2 Probability Distribution Function (PDF) Number of heads Probability 2 0.

CS134: Networks Spring Random Variables and Independence. 1.2 Probability Distribution Function (PDF) Number of heads Probability 2 0. CS134: Networks Spring 2017 Prof. Yaron Singer Section 0 1 Probability 1.1 Random Variables and Independence A real-valued random variable is a variable that can take each of a set of possible values in

More information

Introduction Random Walk One-Period Option Pricing Binomial Option Pricing Nice Math. Binomial Models. Christopher Ting.

Introduction Random Walk One-Period Option Pricing Binomial Option Pricing Nice Math. Binomial Models. Christopher Ting. Binomial Models Christopher Ting Christopher Ting http://www.mysmu.edu/faculty/christophert/ : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 October 14, 2016 Christopher Ting QF 101 Week 9 October

More information

Normal distribution Approximating binomial distribution by normal 2.10 Central Limit Theorem

Normal distribution Approximating binomial distribution by normal 2.10 Central Limit Theorem 1.1.2 Normal distribution 1.1.3 Approimating binomial distribution by normal 2.1 Central Limit Theorem Prof. Tesler Math 283 Fall 216 Prof. Tesler 1.1.2-3, 2.1 Normal distribution Math 283 / Fall 216 1

More information

1 Mathematics in a Pill 1.1 PROBABILITY SPACE AND RANDOM VARIABLES. A probability triple P consists of the following components:

1 Mathematics in a Pill 1.1 PROBABILITY SPACE AND RANDOM VARIABLES. A probability triple P consists of the following components: 1 Mathematics in a Pill The purpose of this chapter is to give a brief outline of the probability theory underlying the mathematics inside the book, and to introduce necessary notation and conventions

More information

Probability and Random Variables A FINANCIAL TIMES COMPANY

Probability and Random Variables A FINANCIAL TIMES COMPANY Probability Basics Probability and Random Variables A FINANCIAL TIMES COMPANY 2 Probability Probability of union P[A [ B] =P[A]+P[B] P[A \ B] Conditional Probability A B P[A B] = Bayes Theorem P[A \ B]

More information

The Binomial Lattice Model for Stocks: Introduction to Option Pricing

The Binomial Lattice Model for Stocks: Introduction to Option Pricing 1/27 The Binomial Lattice Model for Stocks: Introduction to Option Pricing Professor Karl Sigman Columbia University Dept. IEOR New York City USA 2/27 Outline The Binomial Lattice Model (BLM) as a Model

More information

Aspects of Financial Mathematics:

Aspects of Financial Mathematics: Aspects of Financial Mathematics: Options, Derivatives, Arbitrage, and the Black-Scholes Pricing Formula J. Robert Buchanan Millersville University of Pennsylvania email: Bob.Buchanan@millersville.edu

More information

STAT/MATH 395 PROBABILITY II

STAT/MATH 395 PROBABILITY II STAT/MATH 395 PROBABILITY II Distribution of Random Samples & Limit Theorems Néhémy Lim University of Washington Winter 2017 Outline Distribution of i.i.d. Samples Convergence of random variables The Laws

More information

Covariance and Correlation. Def: If X and Y are JDRVs with finite means and variances, then. Example Sampling

Covariance and Correlation. Def: If X and Y are JDRVs with finite means and variances, then. Example Sampling Definitions Properties E(X) µ X Transformations Linearity Monotonicity Expectation Chapter 7 xdf X (x). Expectation Independence Recall: µ X minimizes E[(X c) ] w.r.t. c. The Prediction Problem The Problem:

More information

RMSC 4005 Stochastic Calculus for Finance and Risk. 1 Exercises. (c) Let X = {X n } n=0 be a {F n }-supermartingale. Show that.

RMSC 4005 Stochastic Calculus for Finance and Risk. 1 Exercises. (c) Let X = {X n } n=0 be a {F n }-supermartingale. Show that. 1. EXERCISES RMSC 45 Stochastic Calculus for Finance and Risk Exercises 1 Exercises 1. (a) Let X = {X n } n= be a {F n }-martingale. Show that E(X n ) = E(X ) n N (b) Let X = {X n } n= be a {F n }-submartingale.

More information

The Binomial Model. Chapter 3

The Binomial Model. Chapter 3 Chapter 3 The Binomial Model In Chapter 1 the linear derivatives were considered. They were priced with static replication and payo tables. For the non-linear derivatives in Chapter 2 this will not work

More information

Probability Models.S2 Discrete Random Variables

Probability Models.S2 Discrete Random Variables Probability Models.S2 Discrete Random Variables Operations Research Models and Methods Paul A. Jensen and Jonathan F. Bard Results of an experiment involving uncertainty are described by one or more random

More information

Basic Data Analysis. Stephen Turnbull Business Administration and Public Policy Lecture 4: May 2, Abstract

Basic Data Analysis. Stephen Turnbull Business Administration and Public Policy Lecture 4: May 2, Abstract Basic Data Analysis Stephen Turnbull Business Administration and Public Policy Lecture 4: May 2, 2013 Abstract Introduct the normal distribution. Introduce basic notions of uncertainty, probability, events,

More information

The Black-Scholes Model

The Black-Scholes Model The Black-Scholes Model Liuren Wu Options Markets Liuren Wu ( c ) The Black-Merton-Scholes Model colorhmoptions Markets 1 / 18 The Black-Merton-Scholes-Merton (BMS) model Black and Scholes (1973) and Merton

More information

Martingales. by D. Cox December 2, 2009

Martingales. by D. Cox December 2, 2009 Martingales by D. Cox December 2, 2009 1 Stochastic Processes. Definition 1.1 Let T be an arbitrary index set. A stochastic process indexed by T is a family of random variables (X t : t T) defined on a

More information

Part 1 In which we meet the law of averages. The Law of Averages. The Expected Value & The Standard Error. Where Are We Going?

Part 1 In which we meet the law of averages. The Law of Averages. The Expected Value & The Standard Error. Where Are We Going? 1 The Law of Averages The Expected Value & The Standard Error Where Are We Going? Sums of random numbers The law of averages Box models for generating random numbers Sums of draws: the Expected Value Standard

More information

MSc Financial Engineering CHRISTMAS ASSIGNMENT: MERTON S JUMP-DIFFUSION MODEL. To be handed in by monday January 28, 2013

MSc Financial Engineering CHRISTMAS ASSIGNMENT: MERTON S JUMP-DIFFUSION MODEL. To be handed in by monday January 28, 2013 MSc Financial Engineering 2012-13 CHRISTMAS ASSIGNMENT: MERTON S JUMP-DIFFUSION MODEL To be handed in by monday January 28, 2013 Department EMS, Birkbeck Introduction The assignment consists of Reading

More information

Option Pricing Models for European Options

Option Pricing Models for European Options Chapter 2 Option Pricing Models for European Options 2.1 Continuous-time Model: Black-Scholes Model 2.1.1 Black-Scholes Assumptions We list the assumptions that we make for most of this notes. 1. The underlying

More information

Central limit theorems

Central limit theorems Chapter 6 Central limit theorems 6.1 Overview Recall that a random variable Z is said to have a standard normal distribution, denoted by N(0, 1), if it has a continuous distribution with density φ(z) =

More information

Stochastic Calculus, Application of Real Analysis in Finance

Stochastic Calculus, Application of Real Analysis in Finance , Application of Real Analysis in Finance Workshop for Young Mathematicians in Korea Seungkyu Lee Pohang University of Science and Technology August 4th, 2010 Contents 1 BINOMIAL ASSET PRICING MODEL Contents

More information

6. Martingales. = Zn. Think of Z n+1 as being a gambler s earnings after n+1 games. If the game if fair, then E [ Z n+1 Z n

6. Martingales. = Zn. Think of Z n+1 as being a gambler s earnings after n+1 games. If the game if fair, then E [ Z n+1 Z n 6. Martingales For casino gamblers, a martingale is a betting strategy where (at even odds) the stake doubled each time the player loses. Players follow this strategy because, since they will eventually

More information

Module 10:Application of stochastic processes in areas like finance Lecture 36:Black-Scholes Model. Stochastic Differential Equation.

Module 10:Application of stochastic processes in areas like finance Lecture 36:Black-Scholes Model. Stochastic Differential Equation. Stochastic Differential Equation Consider. Moreover partition the interval into and define, where. Now by Rieman Integral we know that, where. Moreover. Using the fundamentals mentioned above we can easily

More information

An Introduction to Stochastic Calculus

An Introduction to Stochastic Calculus An Introduction to Stochastic Calculus Haijun Li lih@math.wsu.edu Department of Mathematics Washington State University Week 2-3 Haijun Li An Introduction to Stochastic Calculus Week 2-3 1 / 24 Outline

More information

UQ, STAT2201, 2017, Lectures 3 and 4 Unit 3 Probability Distributions.

UQ, STAT2201, 2017, Lectures 3 and 4 Unit 3 Probability Distributions. UQ, STAT2201, 2017, Lectures 3 and 4 Unit 3 Probability Distributions. Random Variables 2 A random variable X is a numerical (integer, real, complex, vector etc.) summary of the outcome of the random experiment.

More information

MAS3904/MAS8904 Stochastic Financial Modelling

MAS3904/MAS8904 Stochastic Financial Modelling MAS3904/MAS8904 Stochastic Financial Modelling Dr Andrew (Andy) Golightly a.golightly@ncl.ac.uk Semester 1, 2018/19 Administrative Arrangements Lectures on Tuesdays at 14:00 (PERCY G13) and Thursdays at

More information

Chapter 1. Bond Pricing (continued)

Chapter 1. Bond Pricing (continued) Chapter 1 Bond Pricing (continued) How does the bond pricing illustrated here help investors in their investment decisions? This pricing formula can allow the investors to decide for themselves what the

More information

Lecture Note 8 of Bus 41202, Spring 2017: Stochastic Diffusion Equation & Option Pricing

Lecture Note 8 of Bus 41202, Spring 2017: Stochastic Diffusion Equation & Option Pricing Lecture Note 8 of Bus 41202, Spring 2017: Stochastic Diffusion Equation & Option Pricing We shall go over this note quickly due to time constraints. Key concept: Ito s lemma Stock Options: A contract giving

More information

Arbitrages and pricing of stock options

Arbitrages and pricing of stock options Arbitrages and pricing of stock options Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ November

More information

Lecture 11: Ito Calculus. Tuesday, October 23, 12

Lecture 11: Ito Calculus. Tuesday, October 23, 12 Lecture 11: Ito Calculus Continuous time models We start with the model from Chapter 3 log S j log S j 1 = µ t + p tz j Sum it over j: log S N log S 0 = NX µ t + NX p tzj j=1 j=1 Can we take the limit

More information

The Black-Scholes PDE from Scratch

The Black-Scholes PDE from Scratch The Black-Scholes PDE from Scratch chris bemis November 27, 2006 0-0 Goal: Derive the Black-Scholes PDE To do this, we will need to: Come up with some dynamics for the stock returns Discuss Brownian motion

More information