Lesson 3: Basic theory of stochastic processes

Dipartimento di Ingegneria e Scienze dell'Informazione e Matematica, Università dell'Aquila, umberto.triacca@univaq.it

Probability space. We start with some definitions. A probability space is a triple (Ω, A, P), where
(i) Ω is a nonempty set, which we call the sample space;
(ii) A is a σ-algebra of subsets of Ω, i.e. a family of subsets closed under countable union and under complement with respect to Ω;
(iii) P is a probability measure defined for all members of A, that is, a function P : A → [0, 1] such that P(A) ≥ 0 for all A ∈ A, P(Ω) = 1, and P(∪_{i=1}^∞ A_i) = Σ_{i=1}^∞ P(A_i) for every sequence A_i ∈ A with A_k ∩ A_j = ∅ for k ≠ j.

Random variable. A real random variable (or real stochastic variable) on (Ω, A, P) is a function x : Ω → R such that the inverse image of any interval (−∞, a] belongs to A, i.e. x⁻¹((−∞, a]) = {ω ∈ Ω : x(ω) ≤ a} ∈ A for all a ∈ R. We also say that the function x is A-measurable.

Stochastic process. What is a stochastic process? Let T be a subset of R. A real stochastic process is a family of random variables {x_t(ω); t ∈ T}, all defined on the same probability space (Ω, A, P).

The set T is called the index set of the process. If T ⊆ Z, then the process {x_t(ω); t ∈ T} is called a discrete stochastic process. If T is an interval of R, then {x_t(ω); t ∈ T} is called a continuous stochastic process. In the sequel we will consider only discrete stochastic processes. Any single real random variable is a (trivial) stochastic process: in this case we have {x_t(ω); t ∈ T} with T = {t_1}.

When T = Z the stochastic process {x_t(ω); t ∈ Z} becomes a sequence of random variables. It is important to keep in mind that the sequence {x_t(ω); t ∈ Z} has to be understood as the function associating the random variable x_t with the integer t. Therefore the processes x = {x_t(ω); t ∈ Z}, y = {x_{−t}(ω); t ∈ Z} and z = {x_{t−3}(ω); t ∈ Z} are different. Although they share the same range, i.e. the same set of random variables, the functions associating a random variable with each integer t are different.

Examples. Let A(ω) be a random variable defined on (Ω, A, P). Consider the discrete stochastic process {x_t(ω); t ∈ Z} where x_t(ω) = A(ω) for all t ∈ Z. A slightly modified example is x_t(ω) = (−1)^t A(ω).

Examples. Other processes are: {y_t(ω); t ∈ Z}, with y_t(ω) = a + bt + u_t(ω); and {z_t(ω); t ∈ Z}, with z_t(ω) = t·u_t(ω), where the random variables u_t(ω) are IID.
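The four example processes can be simulated in a few lines. This is a minimal sketch with illustrative choices not taken from the notes: A(ω) and the u_t(ω) drawn as standard normals, arbitrary constants a and b, and a short horizon T.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 10                       # illustrative horizon: t = 0, ..., 9
t = np.arange(T)

A = rng.standard_normal()    # one draw of the random variable A(ω)
u = rng.standard_normal(T)   # IID random variables u_t(ω)
a, b = 1.0, 0.5              # illustrative constants

x = np.full(T, A)            # x_t(ω) = A(ω): constant over time
x_alt = (-1.0) ** t * A      # x_t(ω) = (-1)^t A(ω): alternating sign
y = a + b * t + u            # y_t(ω) = a + b t + u_t(ω): trend plus noise
z = t * u                    # z_t(ω) = t u_t(ω): noise with growing scale
```

Note that one draw of ω fixes A and all the u_t at once, so each array above is a single realization of its process.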

Let {x_t(ω); t ∈ Z} be a stochastic process defined on the probability space (Ω, A, P). For a fixed ω* ∈ Ω, {x_t(ω*); t ∈ Z} is a sequence of real numbers called a realization, or sample function, of the stochastic process.

Consider the discrete stochastic process {x_t(ω); t ∈ N} where x_t(ω) ∼ N(0, 1) for t = 1, 2, ... and x_t(ω) is independent of x_s(ω) for t ≠ s. The plot of a realization of this process is presented in Figure 1.

We note that for each choice of ω ∈ Ω a realization of the stochastic process is determined. For example, if ω_1, ω_2 ∈ Ω, then {x_t(ω_1); t ∈ Z} and {x_t(ω_2); t ∈ Z} are two possible realizations of our stochastic process.
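In simulation, fixing ω corresponds to fixing the seed of the random number generator. This sketch (an assumption for illustration, not part of the notes) makes the ω-to-realization mapping explicit for the IID N(0, 1) process above:

```python
import numpy as np

def realization(omega, T=5):
    """Return the sample path (x_1(ω), ..., x_T(ω)) for a given ω.

    Here ω is represented by a seed: fixing ω fixes the whole path.
    """
    rng = np.random.default_rng(omega)
    return rng.standard_normal(T)   # x_t(ω) ~ N(0, 1), independent across t

path1 = realization(1)   # the realization determined by ω1
path2 = realization(2)   # a different ω gives a different realization
```

Calling `realization` twice with the same ω returns the same path, just as the same outcome ω always yields the same sample function.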

Consider the discrete stochastic process {x_t(ω); t ∈ N} where x_t(ω) = log(t) + cos(A(ω)), with A(ω) ∼ N(0, 1). Figure 2 shows the plot of two possible realizations of this process.

The following figure presents the plot of five possible realizations of a random walk stochastic process.
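A figure like this can be reproduced by cumulating IID increments; a minimal sketch, assuming N(0, 1) increments and an arbitrary path length (neither is specified in the notes):

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, T = 5, 100
steps = rng.standard_normal((n_paths, T))   # IID N(0, 1) increments u_t
paths = np.cumsum(steps, axis=1)            # x_t = x_{t-1} + u_t, starting from u_1
```

Each row of `paths` is one realization, i.e. one choice of ω; plotting the five rows against t gives five diverging sample paths.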

Just as a random variable assigns a number to each outcome in a sample space, a stochastic process assigns a sample function (realization) to each outcome ω ∈ Ω. Each realization is a unique function of time, different from the others.

The set of all possible realizations of a stochastic process, {{x_t(ω); t ∈ Z}; ω ∈ Ω}, is called the ensemble.

Consider a stochastic process {x_t(ω); t ∈ Z}. It is important to point out that all the random variables x_t(ω) are defined on the same probability space (Ω, A, P): x_t : Ω → R for all t ∈ Z. Therefore, for all s ∈ Z⁺ and t_1 < t_2 < ... < t_s, the probability P(a_1 ≤ x_{t_1}(ω) ≤ b_1, a_2 ≤ x_{t_2}(ω) ≤ b_2, ..., a_s ≤ x_{t_s}(ω) ≤ b_s) is well defined, and so we can give the following definition.

Definition. Let {t 1, t 2,, t s } be a finite set of integers, with s Z +.The joint distribution function of (x t1 (ω), x t2 (ω),..., x ts (ω)) is defined by F t1,t 2,,t s (b 1, b 2,, b s ) = P(x t1 (ω) b 1, x t2 (ω) b 2,..., x ts (ω) b s ) The family { Ft1,t 2,,t s (b 1, b 2,, b s ); s Z +, {t 1, t 2,, t s } Z } is called the finite dimensional distribution of the process.

Definition. Let {t 1, t 2,, t s } be a finite set of integers, with s Z +. The stochastic process {x t (ω); t Z} is said Gaussian if the joint distribution function of the random vector (x t1 (ω), x t2 (ω),..., x ts (ω)) is normal for any subset of Z, {t 1, t 2,, t s } with s 1. Thus a stochastic process is a Gaussian process if and only if all distribution functions belonging to the finite dimensional distribution of the process are normal. Many real world phenomena are well modeled as Gaussian processes.

If we know the finite dimensional distribution of the process, we are able to answer questions such as:
1. What is the probability that the process {x_t(ω); t ∈ Z} passes through [a, b] at time t_1?
2. What is the probability that the process {x_t(ω); t ∈ Z} passes through [a, b] at time t_1 and through [c, d] at time t_2?

The answers:
1. P(a ≤ x_{t_1}(ω) ≤ b) = F_{t_1}(b) − F_{t_1}(a)
2. P(a ≤ x_{t_1}(ω) ≤ b, c ≤ x_{t_2}(ω) ≤ d) = F_{t_1,t_2}(b, d) − F_{t_1,t_2}(a, d) − F_{t_1,t_2}(b, c) + F_{t_1,t_2}(a, c).
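For a Gaussian process these formulas can be evaluated directly from normal distribution functions. A sketch with illustrative assumptions (standard normal marginals at t_1 and t_2, correlation 0.5, and arbitrary intervals; none of these values come from the notes):

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

mean = np.zeros(2)                        # assumed joint law of (x_{t1}, x_{t2})
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
a, b, c, d = -1.0, 1.0, 0.0, 2.0          # illustrative intervals [a, b] and [c, d]

# 1) P(a <= x_{t1} <= b) = F_{t1}(b) - F_{t1}(a)
p1 = norm.cdf(b) - norm.cdf(a)

# 2) inclusion-exclusion on the bivariate distribution function F_{t1,t2}
F = lambda u, v: multivariate_normal.cdf([u, v], mean=mean, cov=cov)
p2 = F(b, d) - F(a, d) - F(b, c) + F(a, c)
```

As expected, the joint probability `p2` cannot exceed the marginal probability `p1`, since the joint event is contained in the marginal one.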

An important point: is the knowledge of the finite dimensional distribution of the process sufficient to answer all questions about the stochastic process that are of interest? Can the probabilistic structure of a stochastic process be fully described by the finite dimensional distribution of the process?

Theorem. For any positive integer s, let {t_1, t_2, ..., t_s} be any admissible set of values of t. Then, under general conditions, the probabilistic structure of the stochastic process {x_t(ω); t ∈ Z} is completely specified if we are given the joint probability distribution of (x_{t_1}(ω), x_{t_2}(ω), ..., x_{t_s}(ω)) for all values of s and for all choices of {t_1, t_2, ..., t_s} (Priestley 1981, p. 104).

We can conclude that a stochastic process is defined completely in a probabilistic sense if one knows the joint distribution function F_{t_1,t_2,...,t_s}(b_1, b_2, ..., b_s) of (x_{t_1}(ω), x_{t_2}(ω), ..., x_{t_s}(ω)) for any positive integer s and for all choices of the finite set of random variables (x_{t_1}(ω), x_{t_2}(ω), ..., x_{t_s}(ω)).

The stochastic process as model. If we take the point of view that the observed time series is a finite part of one realization of a stochastic process {x_t(ω); t ∈ Z}, then the stochastic process can serve as a model of the data generating process (DGP) that has produced the time series.

DGP → SP → (x_1, ..., x_T)

In particular, since a complete knowledge of a stochastic process requires the knowledge of the finite dimensional distribution of the process, the time series model is given by the family {F_{t_1,t_2,...,t_s}(b_1, b_2, ..., b_s); s ≥ 1, {t_1, t_2, ..., t_s} ⊂ Z}, where the form of the joint distribution functions F_{t_1,t_2,...,t_s}(b_1, b_2, ..., b_s) is supposed known. It is clear that, in general, this model contains too many unknown parameters to be estimated from observed data.

If, for example, we assume that our model is the stochastic process {x_t(ω); t ∈ Z}, where x_t ∼ N(µ_t, σ_t²), we have that

F_t(b) = ∫_{−∞}^{b} (1/√(2πσ_t²)) exp(−½ ((v − µ_t)/σ_t)²) dv,  for t = 0, ±1, ...

Thus, considering only the univariate distributions, we have to estimate an infinite number of parameters {µ_t, σ_t; t ∈ Z}.
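The closed form above can be checked numerically. This sketch integrates the N(µ_t, σ_t²) density for a single fixed t (the values of µ_t, σ_t and b are arbitrary illustrations) and compares the result with the library CDF:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

mu_t, sigma_t = 0.3, 1.5   # illustrative values for one fixed t
b = 1.0

# the integrand of F_t(b): the N(mu_t, sigma_t^2) density
density = lambda v: (np.exp(-0.5 * ((v - mu_t) / sigma_t) ** 2)
                     / np.sqrt(2 * np.pi * sigma_t ** 2))

F_t_b, _ = quad(density, -np.inf, b)
# F_t_b agrees with norm.cdf(b, loc=mu_t, scale=sigma_t)
```

The point of the text stands regardless of the values chosen: each t carries its own pair (µ_t, σ_t), so without further restrictions there are infinitely many parameters to estimate.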

This task is impossible.

Consequently, some restrictions have to be made concerning the stochastic process that is adopted as a model. In particular, we will consider:
1. restrictions on the time-heterogeneity of the process;
2. restrictions on the memory of the process.

The first kind of restriction enables us to reduce the number of unknown parameters. The second allows us to obtain a consistent estimate of the unknown parameters.