Introduction to Sequential Monte Carlo Methods
1 Introduction to Sequential Monte Carlo Methods
Arnaud Doucet
NCSU, October 2008
Arnaud Doucet () Introduction to SMC NCSU, October 2008 / 36
2 Preliminary Remarks
Sequential Monte Carlo (SMC) methods are a set of methods allowing us to approximate virtually any sequence of probability distributions. SMC methods are very popular in physics, where they are used to compute eigenvalues of positive operators, to solve PDEs/integral equations, or to simulate polymers. We focus here on applications of SMC to Hidden Markov Models (HMM) for pedagogical reasons... In the HMM framework, SMC methods are also widely known as Particle Filtering/Smoothing methods.
3 Markov Models
We model the stochastic process of interest as a discrete-time Markov process {X_k}_{k≥1}. {X_k}_{k≥1} is characterized by its initial density and its transition density:
X_1 ~ µ(·),  X_k | (X_{k-1} = x_{k-1}) ~ f(· | x_{k-1}).
We introduce the notation x_{i:j} = (x_i, x_{i+1}, ..., x_j) for i ≤ j. We have by definition
p(x_{1:n}) = p(x_1) ∏_{k=2}^n p(x_k | x_{1:k-1}) = µ(x_1) ∏_{k=2}^n f(x_k | x_{k-1}).
4 Observation Model
We do not observe {X_k}_{k≥1}; the process is hidden. We only have access to another related process {Y_k}_{k≥1}. We assume that, conditional on {X_k}_{k≥1}, the observations {Y_k}_{k≥1} are independent and marginally distributed according to
Y_k | (X_k = x_k) ~ g(· | x_k).
Formally this means that
p(y_{1:n} | x_{1:n}) = ∏_{k=1}^n g(y_k | x_k).
5 Figure: Graphical model representation of an HMM
6 Tracking Example
Assume you want to track a target in the XY plane; then you can consider the 4-dimensional state X_k = (X_{k,1}, V_{k,1}, X_{k,2}, V_{k,2})^T. The so-called constant velocity model states that
X_k = A X_{k-1} + W_k,  W_k ~ i.i.d. N(0, Σ),
A = [ A_CV  0 ; 0  A_CV ],  A_CV = [ 1  T ; 0  1 ],
Σ = σ² [ Σ_CV  0 ; 0  Σ_CV ],  Σ_CV = [ T³/3  T²/2 ; T²/2  T ].
We obtain that f(x_k | x_{k-1}) = N(x_k; A x_{k-1}, Σ).
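As an illustrative sketch (not code from the slides), the block matrices above can be assembled and the state equation simulated in plain Python. The sampling period T, noise scale σ, and the starting state are assumptions; for simplicity the process noise is drawn coordinate-wise from the diagonal of Σ, whereas a full simulator would use a Cholesky factor of Σ.

```python
import math
import random

def cv_matrices(T, sigma2):
    """Build the constant-velocity transition matrix A and noise covariance
    Sigma for the 4-dimensional state (x1, v1, x2, v2)."""
    A_cv = [[1.0, T],
            [0.0, 1.0]]
    S_cv = [[T**3 / 3, T**2 / 2],
            [T**2 / 2, T]]
    # Block-diagonal 4x4 versions: one 2x2 block per planar coordinate.
    A = [[0.0] * 4 for _ in range(4)]
    Sigma = [[0.0] * 4 for _ in range(4)]
    for b in (0, 1):
        for i in (0, 1):
            for j in (0, 1):
                A[2 * b + i][2 * b + j] = A_cv[i][j]
                Sigma[2 * b + i][2 * b + j] = sigma2 * S_cv[i][j]
    return A, Sigma

def simulate_cv(n, T=1.0, sigma=0.1, seed=0):
    """Sample a trajectory X_1, ..., X_n with X_k = A X_{k-1} + W_k.
    Noise is drawn coordinate-wise (diagonal simplification of Sigma)."""
    rng = random.Random(seed)
    A, Sigma = cv_matrices(T, sigma**2)
    x = [0.0, 1.0, 0.0, 1.0]  # illustrative start: unit velocity in both coordinates
    path = [x]
    for _ in range(n - 1):
        mean = [sum(A[i][j] * x[j] for j in range(4)) for i in range(4)]
        x = [mean[i] + rng.gauss(0.0, math.sqrt(Sigma[i][i])) for i in range(4)]
        path.append(x)
    return path
```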
7 Tracking Example (cont.)
The observation equation depends on the sensor.
Simple case:
Y_k = C X_k + D E_k,  E_k ~ i.i.d. N(0, Σ_e),
so g(y_k | x_k) = N(y_k; C x_k, Σ_e).
Complex realistic case (bearings-only tracking):
Y_k = tan⁻¹(X_{k,2} / X_{k,1}) + E_k,  E_k ~ i.i.d. N(0, σ²),
so g(y_k | x_k) = N(y_k; tan⁻¹(x_{k,2} / x_{k,1}), σ²).
8 Stochastic Volatility
We have the following standard model
X_k = φ X_{k-1} + V_k,  V_k ~ i.i.d. N(0, σ²),
so that f(x_k | x_{k-1}) = N(x_k; φ x_{k-1}, σ²).
We observe
Y_k = β exp(X_k / 2) W_k,  W_k ~ i.i.d. N(0, 1),
so that g(y_k | x_k) = N(y_k; 0, β² exp(x_k)).
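A short sketch of simulating this model. The parameter values φ, σ, β and the stationary initialization of X are illustrative assumptions, not taken from the slides.

```python
import math
import random

def simulate_sv(n, phi=0.9, sigma=0.3, beta=0.7, seed=1):
    """Simulate the stochastic volatility model:
       X_k = phi * X_{k-1} + V_k,   V_k ~ N(0, sigma^2)
       Y_k = beta * exp(X_k / 2) * W_k,   W_k ~ N(0, 1)."""
    rng = random.Random(seed)
    xs, ys = [], []
    # Start from the stationary distribution of the AR(1) (an assumption).
    x = rng.gauss(0.0, sigma / math.sqrt(1 - phi**2))
    for _ in range(n):
        x = phi * x + rng.gauss(0.0, sigma)
        y = beta * math.exp(x / 2) * rng.gauss(0.0, 1.0)
        xs.append(x)
        ys.append(y)
    return xs, ys
```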
9 Inference in HMM
Given a realization of the observations Y_{1:n} = y_{1:n}, we are interested in inferring the states X_{1:n}. We are in a Bayesian framework where
Prior: p(x_{1:n}) = µ(x_1) ∏_{k=2}^n f(x_k | x_{k-1}),
Likelihood: p(y_{1:n} | x_{1:n}) = ∏_{k=1}^n g(y_k | x_k).
Using Bayes' rule, we obtain
p(x_{1:n} | y_{1:n}) = p(y_{1:n} | x_{1:n}) p(x_{1:n}) / p(y_{1:n}),
where the marginal likelihood is given by
p(y_{1:n}) = ∫ p(y_{1:n} | x_{1:n}) p(x_{1:n}) dx_{1:n}.
10 Sequential Inference in HMM
In particular, we will focus here on the sequential estimation of p(x_{1:n} | y_{1:n}) and p(y_{1:n}); that is, at each time n we want to update our knowledge of the hidden process in light of y_n. There is a simple recursion relating p(x_{1:n-1} | y_{1:n-1}) to p(x_{1:n} | y_{1:n}) given by
p(x_{1:n} | y_{1:n}) = p(x_{1:n-1} | y_{1:n-1}) f(x_n | x_{n-1}) g(y_n | x_n) / p(y_n | y_{1:n-1}),
where
p(y_n | y_{1:n-1}) = ∫∫ g(y_n | x_n) f(x_n | x_{n-1}) p(x_{n-1} | y_{1:n-1}) dx_{n-1:n}.
We will also simply write
p(x_{1:n} | y_{1:n}) ∝ p(x_{1:n-1} | y_{1:n-1}) f(x_n | x_{n-1}) g(y_n | x_n).
11 In many papers/books in the literature, you will find the following two-step prediction-updating recursion for the marginal so-called filtering distributions p(x_n | y_{1:n}), which is a direct consequence.
Prediction Step:
p(x_n | y_{1:n-1}) = ∫ p(x_{n-1:n} | y_{1:n-1}) dx_{n-1}
= ∫ p(x_n | x_{n-1}, y_{1:n-1}) p(x_{n-1} | y_{1:n-1}) dx_{n-1}
= ∫ f(x_n | x_{n-1}) p(x_{n-1} | y_{1:n-1}) dx_{n-1}.
Updating Step:
p(x_n | y_{1:n}) = g(y_n | x_n) p(x_n | y_{1:n-1}) / p(y_n | y_{1:n-1}).
12 (Marginal) Likelihood Evaluation
We have seen that
p(y_{1:n}) = ∫ p(y_{1:n} | x_{1:n}) p(x_{1:n}) dx_{1:n}.
We also have the following decomposition
p(y_{1:n}) = p(y_1) ∏_{k=2}^n p(y_k | y_{1:k-1}),
where
p(y_k | y_{1:k-1}) = ∫ p(y_k, x_k | y_{1:k-1}) dx_k
= ∫ g(y_k | x_k) p(x_k | y_{1:k-1}) dx_k
= ∫∫ g(y_k | x_k) f(x_k | x_{k-1}) p(x_{k-1} | y_{1:k-1}) dx_{k-1:k}.
We have "broken" a high-dimensional integral into the product of lower-dimensional integrals.
13 Closed-form Inference in HMM
We have closed-form solutions for:
Finite state-space HMM, i.e. E = {e_1, ..., e_p}, as all integrals become finite sums.
Linear Gaussian models, where all the posterior distributions are Gaussian; e.g. the celebrated Kalman filter.
A whole reverse engineering literature exists for closed-form solutions in alternative cases... In many cases of interest, it is impossible to compute the solution in closed form and we need approximations.
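For intuition, the linear Gaussian case can be sketched with a minimal scalar Kalman filter: the prediction/updating recursion of the previous slides carried out exactly on Gaussian densities. The model X_k = a X_{k-1} + V_k, Y_k = c X_k + E_k and all parameter values below are illustrative assumptions.

```python
def kalman_filter_1d(ys, a=0.9, c=1.0, q=1.0, r=1.0, m0=0.0, p0=1.0):
    """Scalar Kalman filter for X_k = a X_{k-1} + V_k, Y_k = c X_k + E_k,
    with V_k ~ N(0, q), E_k ~ N(0, r) and X_0 ~ N(m0, p0).
    Returns the filtering means and variances."""
    m, p = m0, p0
    means, variances = [], []
    for y in ys:
        # Prediction step: p(x_n | y_{1:n-1}) = N(m_pred, p_pred).
        m_pred = a * m
        p_pred = a * a * p + q
        # Updating step: condition on y_n.
        s = c * c * p_pred + r      # predictive variance of Y_n
        k = p_pred * c / s          # Kalman gain
        m = m_pred + k * (y - c * m_pred)
        p = (1 - k * c) * p_pred
        means.append(m)
        variances.append(p)
    return means, variances
```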
14 Standard Approximations for Filtering Distributions
Gaussian approximations: Extended Kalman filter, Unscented Kalman filter.
Gaussian sum approximations.
Projection filters, variational approximations.
Simple discretization of the state-space.
Analytical methods work in simple cases but are not reliable, and it is difficult to diagnose when they fail. Standard discretization of the space is expensive and difficult to implement in high-dimensional scenarios.
15 Breakthrough
At the beginning of the 90s, the optimal filtering area was considered virtually dead; there had not been any significant progress for years; then...
Gordon, N.J., Salmond, D.J. and Smith, A.F.M., "Novel approach to nonlinear/non-Gaussian Bayesian state estimation", IEE Proceedings F: Radar and Signal Processing, vol. 140, no. 2, pp. 107-113, 1993.
This article introduces a simple method which relies neither on a functional approximation nor on a deterministic grid. This paper was ignored by most researchers for a few years...
16 Monte Carlo Sampling. Importance Sampling. Sequential Importance Sampling. Sequential Importance Sampling with Resampling.
17 Monte Carlo Sampling
Assume for the time being that you are interested in estimating the high-dimensional probability density
p(x_{1:n} | y_{1:n}) = p(x_{1:n}, y_{1:n}) / p(y_{1:n}) ∝ p(x_{1:n}, y_{1:n}),
where n is fixed. A Monte Carlo approximation consists of sampling a large number N of i.i.d. random variables X_{1:n}^(i) ~ p(x_{1:n} | y_{1:n}) and building the following approximation
p̂(x_{1:n} | y_{1:n}) = (1/N) ∑_{i=1}^N δ_{X_{1:n}^(i)}(x_{1:n}),
where δ_a(x_{1:n}) is the Dirac delta mass, which is such that, for any A ⊆ Eⁿ,
∫_A δ_a(x_{1:n}) dx_{1:n} = 1 if a ∈ A, 0 otherwise.
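The empirical-measure idea can be sketched on a toy target we can sample exactly: integrating a test function φ against p̂ is just the sample average. The toy example (E[X²] for X ~ N(0, 1)) is an illustration, not from the slides.

```python
import random

def mc_expectation(sampler, phi, N, seed=0):
    """Plain Monte Carlo: draw N i.i.d. samples and average phi over them.
    This is integration of phi against the empirical measure
    (1/N) sum_i delta_{X^(i)}."""
    rng = random.Random(seed)
    return sum(phi(sampler(rng)) for _ in range(N)) / N

# Toy example: E[X^2] for X ~ N(0, 1) equals 1.
est = mc_expectation(lambda rng: rng.gauss(0.0, 1.0), lambda x: x * x, 100_000)
```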
18 Issues with Standard Monte Carlo Sampling
There are standard methods to sample from classical distributions such as Beta, Gamma, Normal, Poisson, etc. We will not detail them here, although we will rely on them.
Problem 1: For most problems of interest, we cannot sample from p(x_{1:n} | y_{1:n}).
Problem 2: Even if we could sample exactly from p(x_{1:n} | y_{1:n}), the computational complexity of the algorithm would most likely increase with n; we want here an algorithm of fixed computational complexity at each time step.
To summarize, we cannot use standard MC sampling in our case and, even if we could, this would not solve our problem...
19 Importance Sampling
Importance Sampling (IS). We have
p(x_{1:n} | y_{1:n}) = p(y_{1:n} | x_{1:n}) p(x_{1:n}) / p(y_{1:n}),
p(y_{1:n}) = ∫ p(y_{1:n} | x_{1:n}) p(x_{1:n}) dx_{1:n}.
Generally speaking, for a so-called importance distribution q(x_{1:n} | y_{1:n}) selected such that
p(x_{1:n} | y_{1:n}) > 0 ⇒ q(x_{1:n} | y_{1:n}) > 0,
we have
p(x_{1:n} | y_{1:n}) = w(x_{1:n}, y_{1:n}) q(x_{1:n} | y_{1:n}) / p(y_{1:n}),
p(y_{1:n}) = ∫ w(x_{1:n}, y_{1:n}) q(x_{1:n} | y_{1:n}) dx_{1:n},
where the unnormalized importance weight is
w(x_{1:n}, y_{1:n}) = p(x_{1:n}, y_{1:n}) / q(x_{1:n} | y_{1:n}) ∝ p(x_{1:n} | y_{1:n}) / q(x_{1:n} | y_{1:n}).
20 Monte Carlo IS Estimates
It is easy to sample from p(x_{1:n}), thus we can build the standard MC approximation
p̂(x_{1:n}) = (1/N) ∑_{i=1}^N δ_{X_{1:n}^(i)}(x_{1:n}), where X_{1:n}^(i) ~ i.i.d. p(x_{1:n}).
We plug these approximations into the IS identities to obtain
p(y_{1:n}) = ∫ p(y_{1:n} | x_{1:n}) p(x_{1:n}) dx_{1:n}  ⇒  p̂(y_{1:n}) = (1/N) ∑_{i=1}^N p(y_{1:n} | X_{1:n}^(i)).
p̂(y_{1:n}) is an unbiased estimate of p(y_{1:n}) with relative variance
(1/N) ( ∫ p²(y_{1:n} | x_{1:n}) p(x_{1:n}) dx_{1:n} / p²(y_{1:n}) − 1 ).
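The estimator p̂(y_{1:n}) can be sketched on a one-step toy model where the marginal likelihood is known exactly: X ~ N(0, 1) and Y | X = x ~ N(x, 1), so that p(y) = N(y; 0, 2). This toy model is an illustrative assumption, not from the slides.

```python
import math
import random

def normal_pdf(x, mean, var):
    """Density of N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def is_marginal_likelihood(y, N=200_000, seed=0):
    """Estimate p(y) = int p(y | x) p(x) dx by sampling from the prior
    p(x) = N(0, 1) and averaging the likelihood p(y | x) = N(y; x, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(N):
        x = rng.gauss(0.0, 1.0)          # X^(i) ~ p(x)
        total += normal_pdf(y, x, 1.0)   # p(y | X^(i))
    return total / N

# Exact answer for this toy model: p(y) = N(y; 0, 2).
est = is_marginal_likelihood(0.5)
exact = normal_pdf(0.5, 0.0, 2.0)
```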
21 We also get an approximation of the posterior using
p(x_{1:n} | y_{1:n}) = p(y_{1:n} | x_{1:n}) p(x_{1:n}) / ∫ p(y_{1:n} | x_{1:n}) p(x_{1:n}) dx_{1:n},
p̂(x_{1:n} | y_{1:n}) = p(y_{1:n} | x_{1:n}) p̂(x_{1:n}) / ∫ p(y_{1:n} | x_{1:n}) p̂(x_{1:n}) dx_{1:n}
= ∑_{i=1}^N W_n^(i) δ_{X_{1:n}^(i)}(x_{1:n}),
where the normalized importance weights are
W_n^(i) = p(y_{1:n} | X_{1:n}^(i)) / ∑_{j=1}^N p(y_{1:n} | X_{1:n}^(j)).
22 Assume we are interested in computing E_{p(x_{1:n} | y_{1:n})}(ϕ); then we can use the estimate
E_{p̂(x_{1:n} | y_{1:n})}(ϕ) = ∑_{i=1}^N W_n^(i) ϕ(X_{1:n}^(i)).
This estimate is biased for finite N but is asymptotically consistent, with
lim_{N→∞} N ( E[E_{p̂(x_{1:n} | y_{1:n})}(ϕ)] − E_{p(x_{1:n} | y_{1:n})}(ϕ) ) = − ∫ (p²(x_{1:n} | y_{1:n}) / p(x_{1:n})) ( ϕ(x_{1:n}) − E_{p(x_{1:n} | y_{1:n})}(ϕ) ) dx_{1:n}
and
√N ( E_{p̂(x_{1:n} | y_{1:n})}(ϕ) − E_{p(x_{1:n} | y_{1:n})}(ϕ) ) ⇒ N( 0, ∫ (p²(x_{1:n} | y_{1:n}) / p(x_{1:n})) ( ϕ(x_{1:n}) − E_{p(x_{1:n} | y_{1:n})}(ϕ) )² dx_{1:n} ).
MSE = bias² [O(N⁻²)] + variance [O(N⁻¹)], so the asymptotic bias is irrelevant.
23 Summary of Our Progress
Problem 1: For most problems of interest, we cannot sample from p(x_{1:n} | y_{1:n}).
Problem 1 solved: We use an IS approximation of p(x_{1:n} | y_{1:n}) that relies on the IS prior distribution p(x_{1:n}).
Problem 2: Even if we could sample exactly from p(x_{1:n} | y_{1:n}), the computational complexity of the algorithm would most likely increase with n; we want here an algorithm of fixed computational complexity at each time step.
Problem 2 not solved yet: If at each time step n we need to obtain new samples from p(x_{1:n}), then the computational complexity of the algorithm will increase at each time step.
24 Sequential Importance Sampling (SIS)
To avoid having computational efforts increasing over time, we use the fact that
p(x_{1:n}) [IS at time n] = p(x_{1:n-1}) [IS at time n-1] × f(x_n | x_{n-1}) [new sampled component] = µ(x_1) ∏_{k=2}^n f(x_k | x_{k-1}).
In practical terms, this means that at time n-1 we have already sampled X_{1:n-1}^(i) ~ p(x_{1:n-1}) and that, to obtain at time n samples/particles X_{1:n}^(i) ~ p(x_{1:n}), we just need to sample
X_n^(i) ~ f(x_n | X_{n-1}^(i))
and set
X_{1:n}^(i) = ( X_{1:n-1}^(i) [previously sampled path], X_n^(i) [new sampled component] ).
25 Now, whatever n is, we have only one component X_n to sample! However, can we compute our IS estimates of p(y_{1:n}) and the target p(x_{1:n} | y_{1:n}) recursively? Remember that
p̂(y_{1:n}) = (1/N) ∑_{i=1}^N p(y_{1:n} | X_{1:n}^(i)),
p̂(x_{1:n} | y_{1:n}) = ∑_{i=1}^N W_n^(i) δ_{X_{1:n}^(i)}(x_{1:n}),
where W_n^(i) ∝ p(y_{1:n} | X_{1:n}^(i)) and ∑_{i=1}^N W_n^(i) = 1. We have
p(y_{1:n} | x_{1:n}) = p(y_{1:n-1} | x_{1:n-1}) g(y_n | x_n).
26 Sequential Importance Sampling Algorithm
At time 1:
Sample N particles X_1^(i) ~ µ(x_1) and compute W_1^(i) ∝ g(y_1 | X_1^(i)).
At time n, n ≥ 2:
Sample N particles X_n^(i) ~ f(x_n | X_{n-1}^(i)) and compute W_n^(i) ∝ W_{n-1}^(i) · g(y_n | X_n^(i)).
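The two steps above can be sketched in Python on an illustrative toy model (an assumption, not from the slides): X_1 ~ N(0, 1), X_k | x_{k-1} ~ N(a x_{k-1}, 1), Y_k | x_k ~ N(x_k, 1), with the prior as proposal.

```python
import math
import random

def normal_pdf(x, mean, var):
    """Density of N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def sis(ys, N=500, a=0.9, seed=0):
    """Sequential Importance Sampling with the prior as proposal, for the toy
    model X_1 ~ N(0,1), X_k | x_{k-1} ~ N(a x_{k-1}, 1), Y_k | x_k ~ N(x_k, 1).
    Returns the particles X_n^(i) and normalized weights W_n^(i) at the final
    time (paths are not stored: the weights only depend on X_n^(i))."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(N)]         # X_1^(i) ~ mu
    ws = [normal_pdf(ys[0], x, 1.0) for x in xs]         # W_1 ∝ g(y_1 | X_1)
    for y in ys[1:]:
        xs = [rng.gauss(a * x, 1.0) for x in xs]         # X_n^(i) ~ f(. | X_{n-1}^(i))
        ws = [w * normal_pdf(y, x, 1.0)                  # W_n ∝ W_{n-1} g(y_n | X_n)
              for w, x in zip(ws, xs)]
    total = sum(ws)
    return xs, [w / total for w in ws]
```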
27 Practical Issues
The algorithm can be easily parallelized. The computational complexity does not increase over time. It is not necessary to store the paths {X_{1:n}^(i)} if we are only interested in approximating p(x_n | y_{1:n}), as the weights only depend on {X_n^(i)}!
28 Example of Applications
Consider the following model
X_k = 0.5 X_{k-1} + 25 X_{k-1} / (1 + X²_{k-1}) + 8 cos(1.2k) + V_k = ϕ(X_{k-1}) + V_k,
Y_k = X²_k / 20 + W_k,
where X_1 ~ N(0, 1), V_k ~ i.i.d. N(0, 10) and W_k ~ i.i.d. N(0, 1).
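A sketch of simulating this nonlinear benchmark. The noise variance of V_k is taken to be 10, the value commonly used for this benchmark; treat it as an assumption here.

```python
import math
import random

def simulate_benchmark(n, sigma_v=math.sqrt(10.0), seed=0):
    """Simulate X_k = 0.5 X_{k-1} + 25 X_{k-1}/(1 + X_{k-1}^2)
    + 8 cos(1.2 k) + V_k and Y_k = X_k^2 / 20 + W_k,
    with X_1 ~ N(0, 1), V_k ~ N(0, sigma_v^2), W_k ~ N(0, 1)."""
    rng = random.Random(seed)
    x = rng.gauss(0.0, 1.0)  # X_1 ~ N(0, 1)
    xs, ys = [], []
    for k in range(1, n + 1):
        if k > 1:
            x = (0.5 * x + 25 * x / (1 + x * x)
                 + 8 * math.cos(1.2 * k) + rng.gauss(0.0, sigma_v))
        xs.append(x)
        ys.append(x * x / 20 + rng.gauss(0.0, 1.0))
    return xs, ys
```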
29 Figure: Histogram of log p(y_{1:100} | X_{1:100}^(i)). The approximation is dominated by one single particle.
30 Summary
SIS is an attractive idea: it is sequential and parallelizable, and it reduces the design of a high-dimensional proposal to the design of a sequence of low-dimensional proposals. However, SIS can only work for moderate-size problems. Is there a way to partially fix this problem?
31 Resampling
Problem: As n increases, the variance of {p(y_{1:n} | X_{1:n}^(i))} increases and all the mass is concentrated on a few random samples/particles, as W_n^(i₀) ≈ 1 and W_n^(i) ≈ 0 for i ≠ i₀:
p̂(x_{1:n} | y_{1:n}) = ∑_{i=1}^N W_n^(i) δ_{X_{1:n}^(i)}(x_{1:n}) ≈ δ_{X_{1:n}^(i₀)}(x_{1:n}).
Intuitive KEY idea: Kill in a principled way the particles with low weights W_n^(i) (relative to 1/N) and multiply the particles with high weights W_n^(i) (relative to 1/N).
Rationale: If a particle has a low weight at time n, then typically it will still have a low weight at time n+1 (though one can easily construct a counterexample), and you want to focus your computational efforts on the promising parts of the space.
32 At time n, IS provides the following approximation of p(x_{1:n} | y_{1:n}):
p̂(x_{1:n} | y_{1:n}) = ∑_{i=1}^N W_n^(i) δ_{X_{1:n}^(i)}(x_{1:n}).
The simplest resampling scheme consists of sampling N times X̃_{1:n}^(i) ~ p̂(x_{1:n} | y_{1:n}) to build the new approximation
p̃(x_{1:n} | y_{1:n}) = (1/N) ∑_{i=1}^N δ_{X̃_{1:n}^(i)}(x_{1:n}).
The new resampled particles {X̃_{1:n}^(i)} are approximately distributed according to p(x_{1:n} | y_{1:n}) but statistically dependent: theoretically much more difficult to study.
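This simplest scheme, multinomial resampling, fits in a few lines; `random.choices` performs exactly the required multinomial draw from the weighted empirical measure.

```python
import random

def multinomial_resample(particles, weights, rng=None):
    """Draw len(particles) new particles i.i.d. from the weighted empirical
    measure sum_i W^(i) delta_{X^(i)}; the output particles are equally
    weighted. Weights need not be normalized: choices renormalizes them."""
    rng = rng or random.Random(0)
    return rng.choices(particles, weights=weights, k=len(particles))
```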
33 Sequential Importance Sampling Resampling Algorithm
At time 1:
Sample N particles X_1^(i) ~ µ(x_1) and compute W_1^(i) ∝ g(y_1 | X_1^(i)).
Resample {X_1^(i), W_1^(i)} to obtain new particles, also denoted {X_1^(i)}.
At time n, n ≥ 2:
Sample N particles X_n^(i) ~ f(x_n | X_{n-1}^(i)) and compute W_n^(i) ∝ g(y_n | X_n^(i)).
Resample {X_{1:n}^(i), W_n^(i)} to obtain new particles, also denoted {X_{1:n}^(i)}.
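Putting the pieces together gives a compact bootstrap-filter sketch. The linear Gaussian toy model below (X_1 ~ N(0,1), X_k | x_{k-1} ~ N(a x_{k-1}, 1), Y_k | x_k ~ N(x_k, 1)) and its parameters are illustrative assumptions, chosen so the one-step output can be checked against the exact answer p(y_1) = N(y_1; 0, 2).

```python
import math
import random

def normal_pdf(x, mean, var):
    """Density of N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def bootstrap_filter(ys, N=5000, a=0.9, seed=0):
    """Bootstrap particle filter (SIS + multinomial resampling at every step)
    for the toy model X_1 ~ N(0,1), X_k | x_{k-1} ~ N(a x_{k-1}, 1),
    Y_k | x_k ~ N(x_k, 1). Returns the filtering means and the log of the
    marginal-likelihood estimate hat p(y_{1:n})."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(N)]
    log_lik, means = 0.0, []
    for k, y in enumerate(ys):
        if k > 0:
            xs = [rng.gauss(a * x, 1.0) for x in xs]   # propagate: X_n ~ f
        ws = [normal_pdf(y, x, 1.0) for x in xs]       # weight: W_n ∝ g(y_n | X_n)
        total = sum(ws)
        log_lik += math.log(total / N)                 # hat p(y_n | y_{1:n-1})
        ws = [w / total for w in ws]
        means.append(sum(w * x for w, x in zip(ws, xs)))
        xs = rng.choices(xs, weights=ws, k=N)          # multinomial resampling
    return means, log_lik
```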
34 We also have
p(y_n | y_{1:n-1}) = ∫∫ g(y_n | x_n) f(x_n | x_{n-1}) p(x_{n-1} | y_{1:n-1}) dx_{n-1:n},
so
p̂(y_n | y_{1:n-1}) = (1/N) ∑_{i=1}^N g(y_n | X_n^(i)).
Perhaps surprisingly, it can be shown that if we define
p̂(y_{1:n}) = p̂(y_1) ∏_{k=2}^n p̂(y_k | y_{1:k-1}),
then E[p̂(y_{1:n})] = p(y_{1:n}).
35 Example (cont.)
Consider again the following model
X_k = 0.5 X_{k-1} + 25 X_{k-1} / (1 + X²_{k-1}) + 8 cos(1.2k) + V_k,
Y_k = X²_k / 20 + W_k,
where X_1 ~ N(0, 1), V_k ~ i.i.d. N(0, 10) and W_k ~ i.i.d. N(0, 1).
36 Advanced SMC Methods
I have presented the most basic algorithm. In practice, practitioners often select an IS distribution q(x_n | y_n, x_{n-1}) ≠ f(x_n | x_{n-1}). In such cases, we have
W_n^(i) ∝ f(X_n^(i) | X_{n-1}^(i)) g(y_n | X_n^(i)) / q(X_n^(i) | y_n, X_{n-1}^(i)).
Better resampling steps have been developed. Variance reduction techniques can also be developed. SMC methods can be used to sample from virtually any sequence of distributions.
Faculty and Institute of Actuaries Claims Reserving Manual v.2 (09/1997) Section D7 [D7] PROBABILITY DISTRIBUTION OF OUTSTANDING LIABILITY FROM INDIVIDUAL PAYMENTS DATA Contributed by T S Wright 1. Introduction
More informationVolume 37, Issue 2. Handling Endogeneity in Stochastic Frontier Analysis
Volume 37, Issue 2 Handling Endogeneity in Stochastic Frontier Analysis Mustafa U. Karakaplan Georgetown University Levent Kutlu Georgia Institute of Technology Abstract We present a general maximum likelihood
More informationHomework 1 posted, due Friday, September 30, 2 PM. Independence of random variables: We say that a collection of random variables
Generating Functions Tuesday, September 20, 2011 2:00 PM Homework 1 posted, due Friday, September 30, 2 PM. Independence of random variables: We say that a collection of random variables Is independent
More informationBusiness Statistics 41000: Probability 3
Business Statistics 41000: Probability 3 Drew D. Creal University of Chicago, Booth School of Business February 7 and 8, 2014 1 Class information Drew D. Creal Email: dcreal@chicagobooth.edu Office: 404
More informationAsymptotic Methods in Financial Mathematics
Asymptotic Methods in Financial Mathematics José E. Figueroa-López 1 1 Department of Mathematics Washington University in St. Louis Statistics Seminar Washington University in St. Louis February 17, 2017
More informationThe Values of Information and Solution in Stochastic Programming
The Values of Information and Solution in Stochastic Programming John R. Birge The University of Chicago Booth School of Business JRBirge ICSP, Bergamo, July 2013 1 Themes The values of information and
More informationEstimating a Dynamic Oligopolistic Game with Serially Correlated Unobserved Production Costs. SS223B-Empirical IO
Estimating a Dynamic Oligopolistic Game with Serially Correlated Unobserved Production Costs SS223B-Empirical IO Motivation There have been substantial recent developments in the empirical literature on
More informationStrategies for Improving the Efficiency of Monte-Carlo Methods
Strategies for Improving the Efficiency of Monte-Carlo Methods Paul J. Atzberger General comments or corrections should be sent to: paulatz@cims.nyu.edu Introduction The Monte-Carlo method is a useful
More informationStochastic Grid Bundling Method
Stochastic Grid Bundling Method GPU Acceleration Delft University of Technology - Centrum Wiskunde & Informatica Álvaro Leitao Rodríguez and Cornelis W. Oosterlee London - December 17, 2015 A. Leitao &
More informationA New Hybrid Estimation Method for the Generalized Pareto Distribution
A New Hybrid Estimation Method for the Generalized Pareto Distribution Chunlin Wang Department of Mathematics and Statistics University of Calgary May 18, 2011 A New Hybrid Estimation Method for the GPD
More informationApproximate Revenue Maximization with Multiple Items
Approximate Revenue Maximization with Multiple Items Nir Shabbat - 05305311 December 5, 2012 Introduction The paper I read is called Approximate Revenue Maximization with Multiple Items by Sergiu Hart
More informationA potentially useful approach to model nonlinearities in time series is to assume different behavior (structural break) in different subsamples
1.3 Regime switching models A potentially useful approach to model nonlinearities in time series is to assume different behavior (structural break) in different subsamples (or regimes). If the dates, the
More informationLecture Notes 1
4.45 Lecture Notes Guido Lorenzoni Fall 2009 A portfolio problem To set the stage, consider a simple nite horizon problem. A risk averse agent can invest in two assets: riskless asset (bond) pays gross
More informationChapter 8: Sampling distributions of estimators Sections
Chapter 8 continued Chapter 8: Sampling distributions of estimators Sections 8.1 Sampling distribution of a statistic 8.2 The Chi-square distributions 8.3 Joint Distribution of the sample mean and sample
More informationChapter 7: Estimation Sections
1 / 31 : Estimation Sections 7.1 Statistical Inference Bayesian Methods: 7.2 Prior and Posterior Distributions 7.3 Conjugate Prior Distributions 7.4 Bayes Estimators Frequentist Methods: 7.5 Maximum Likelihood
More informationMachine Learning for Quantitative Finance
Machine Learning for Quantitative Finance Fast derivative pricing Sofie Reyners Joint work with Jan De Spiegeleer, Dilip Madan and Wim Schoutens Derivative pricing is time-consuming... Vanilla option pricing
More informationSemiparametric Modeling, Penalized Splines, and Mixed Models
Semi 1 Semiparametric Modeling, Penalized Splines, and Mixed Models David Ruppert Cornell University http://wwworiecornelledu/~davidr January 24 Joint work with Babette Brumback, Ray Carroll, Brent Coull,
More informationChapter 4: Commonly Used Distributions. Statistics for Engineers and Scientists Fourth Edition William Navidi
Chapter 4: Commonly Used Distributions Statistics for Engineers and Scientists Fourth Edition William Navidi 2014 by Education. This is proprietary material solely for authorized instructor use. Not authorized
More informationOptimally Thresholded Realized Power Variations for Lévy Jump Diffusion Models
Optimally Thresholded Realized Power Variations for Lévy Jump Diffusion Models José E. Figueroa-López 1 1 Department of Statistics Purdue University University of Missouri-Kansas City Department of Mathematics
More informationIEOR E4703: Monte-Carlo Simulation
IEOR E4703: Monte-Carlo Simulation Other Miscellaneous Topics and Applications of Monte-Carlo Martin Haugh Department of Industrial Engineering and Operations Research Columbia University Email: martin.b.haugh@gmail.com
More informationAsymptotic Theory for Renewal Based High-Frequency Volatility Estimation
Asymptotic Theory for Renewal Based High-Frequency Volatility Estimation Yifan Li 1,2 Ingmar Nolte 1 Sandra Nolte 1 1 Lancaster University 2 University of Manchester 4th Konstanz - Lancaster Workshop on
More informationDependence Structure and Extreme Comovements in International Equity and Bond Markets
Dependence Structure and Extreme Comovements in International Equity and Bond Markets René Garcia Edhec Business School, Université de Montréal, CIRANO and CIREQ Georges Tsafack Suffolk University Measuring
More informationWeek 1 Quantitative Analysis of Financial Markets Basic Statistics A
Week 1 Quantitative Analysis of Financial Markets Basic Statistics A Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 October
More informationThe Binomial Lattice Model for Stocks: Introduction to Option Pricing
1/33 The Binomial Lattice Model for Stocks: Introduction to Option Pricing Professor Karl Sigman Columbia University Dept. IEOR New York City USA 2/33 Outline The Binomial Lattice Model (BLM) as a Model
More information(5) Multi-parameter models - Summarizing the posterior
(5) Multi-parameter models - Summarizing the posterior Spring, 2017 Models with more than one parameter Thus far we have studied single-parameter models, but most analyses have several parameters For example,
More informationEstimating Mixed Logit Models with Large Choice Sets. Roger H. von Haefen, NC State & NBER Adam Domanski, NOAA July 2013
Estimating Mixed Logit Models with Large Choice Sets Roger H. von Haefen, NC State & NBER Adam Domanski, NOAA July 2013 Motivation Bayer et al. (JPE, 2007) Sorting modeling / housing choice 250,000 individuals
More informationSTA 532: Theory of Statistical Inference
STA 532: Theory of Statistical Inference Robert L. Wolpert Department of Statistical Science Duke University, Durham, NC, USA 2 Estimating CDFs and Statistical Functionals Empirical CDFs Let {X i : i n}
More informationImplementing Models in Quantitative Finance: Methods and Cases
Gianluca Fusai Andrea Roncoroni Implementing Models in Quantitative Finance: Methods and Cases vl Springer Contents Introduction xv Parti Methods 1 Static Monte Carlo 3 1.1 Motivation and Issues 3 1.1.1
More informationLecture 22: Dynamic Filtering
ECE 830 Fall 2011 Statistical Signal Processing instructor: R. Nowak Lecture 22: Dynamic Filtering 1 Dynamic Filtering In many applications we want to track a time-varying (dynamic) phenomenon. Example
More informationUQ, STAT2201, 2017, Lectures 3 and 4 Unit 3 Probability Distributions.
UQ, STAT2201, 2017, Lectures 3 and 4 Unit 3 Probability Distributions. Random Variables 2 A random variable X is a numerical (integer, real, complex, vector etc.) summary of the outcome of the random experiment.
More informationQuarterly Storage Model of U.S. Cotton Market: Estimation of the Basis under Rational Expectations. Oleksiy Tokovenko 1 Lewell F.
Quarterly Storage Model of U.S. Cotton Market: Estimation of the Basis under Rational Expectations Oleksiy Tokovenko 1 Lewell F. Gunter Selected Paper prepared for presentation at the American Agricultural
More informationForward Monte-Carlo Scheme for PDEs: Multi-Type Marked Branching Diffusions
Forward Monte-Carlo Scheme for PDEs: Multi-Type Marked Branching Diffusions Pierre Henry-Labordère 1 1 Global markets Quantitative Research, SOCIÉTÉ GÉNÉRALE Outline 1 Introduction 2 Semi-linear PDEs 3
More informationPoint Estimators. STATISTICS Lecture no. 10. Department of Econometrics FEM UO Brno office 69a, tel
STATISTICS Lecture no. 10 Department of Econometrics FEM UO Brno office 69a, tel. 973 442029 email:jiri.neubauer@unob.cz 8. 12. 2009 Introduction Suppose that we manufacture lightbulbs and we want to state
More informationCalibration of Interest Rates
WDS'12 Proceedings of Contributed Papers, Part I, 25 30, 2012. ISBN 978-80-7378-224-5 MATFYZPRESS Calibration of Interest Rates J. Černý Charles University, Faculty of Mathematics and Physics, Prague,
More informationRisk Estimation via Regression
Risk Estimation via Regression Mark Broadie Graduate School of Business Columbia University email: mnb2@columbiaedu Yiping Du Industrial Engineering and Operations Research Columbia University email: yd2166@columbiaedu
More information
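The constant-velocity tracking model above specifies the full HMM: the transition density f(x_k | x_{k-1}) = N(x_k; A x_{k-1}, Σ) with the block structure A = diag(A_CV, A_CV), Σ = σ² diag(Σ_CV, Σ_CV). A minimal sketch of simulating this model and running a bootstrap particle filter on it is given below. The observation model (noisy position measurements with covariance R), the time step T, the noise scale σ², and the particle count N are illustrative assumptions, not values from the slides.

```python
import numpy as np

# Constant-velocity model from the slides, state x_k = (x_{k,1}, v_{k,1}, x_{k,2}, v_{k,2}).
# T and sigma2 are illustrative choices (assumptions, not from the slides).
T, sigma2 = 1.0, 0.1
A_cv = np.array([[1.0, T],
                 [0.0, 1.0]])
S_cv = sigma2 * np.array([[T**3 / 3, T**2 / 2],
                          [T**2 / 2, T]])
A = np.kron(np.eye(2), A_cv)        # block-diagonal: diag(A_CV, A_CV)
Sigma = np.kron(np.eye(2), S_cv)    # block-diagonal: diag(Sigma_CV, Sigma_CV)

# Hypothetical observation model g(y | x): noisy measurements of the two positions.
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
R = 0.5 * np.eye(2)
Rinv = np.linalg.inv(R)

rng = np.random.default_rng(0)

def simulate(n):
    """Draw a trajectory x_{1:n} and observations y_{1:n} from the HMM."""
    x = np.zeros(4)
    xs, ys = [], []
    for _ in range(n):
        x = A @ x + rng.multivariate_normal(np.zeros(4), Sigma)
        ys.append(H @ x + rng.multivariate_normal(np.zeros(2), R))
        xs.append(x)
    return np.array(xs), np.array(ys)

def bootstrap_filter(ys, N=500):
    """Bootstrap particle filter: propagate through f, weight by g, resample."""
    particles = np.zeros((N, 4))
    means = []
    for y in ys:
        # Propagate each particle through the transition density f(. | x_{k-1}).
        particles = particles @ A.T + rng.multivariate_normal(np.zeros(4), Sigma, size=N)
        # Weight by the Gaussian observation likelihood g(y_k | x_k).
        resid = y - particles @ H.T
        logw = -0.5 * np.einsum('ij,jk,ik->i', resid, Rinv, resid)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(w @ particles)          # weighted posterior-mean estimate
        # Multinomial resampling to avoid weight degeneracy.
        particles = particles[rng.choice(N, size=N, p=w)]
    return np.array(means)

xs, ys = simulate(50)
est = bootstrap_filter(ys)
rmse = np.sqrt(np.mean((est[:, [0, 2]] - xs[:, [0, 2]]) ** 2))
```

The filter tracks the two position components; with these settings the position RMSE comes out well below the raw measurement noise, which is the point of filtering rather than using observations directly.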