The Monte Carlo Method in High Performance Computing


The Monte Carlo Method in High Performance Computing
Dieter W. Heermann, Monte Carlo Methods, 2015

Outline

Revisiting the Poor Man's Monte Carlo

Poor Man's Monte Carlo: statistics. Let X_1, ..., X_n be iid random variables with E(X_i) = µ for all i. Then the sample mean (1/n) Σ_{i=1}^n X_i is an unbiased estimator for µ with variance v/n, where v = Var(X_i).

Assume that κ processors are available and that each processor j produces n samples independently. Then

    E_j := (1/n) Σ_{i=1}^n X_i^(j)                (1)

    E := (1/κ) Σ_{j=1}^κ E_j                      (2)

    Var(E) = v/(κn)                               (3)

The variance is reduced in linear proportion to the number of processors κ.
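The variance reduction in (1)-(3) can be checked numerically with a small sketch (names and parameter values are illustrative; a real HPC code would distribute the work with MPI or similar rather than simulate all processors in one process):

```python
import numpy as np

# "Poor man's" parallel Monte Carlo: kappa independent workers each
# average n iid samples, and the master averages the per-worker means.

rng = np.random.default_rng(0)
kappa, n = 8, 10_000          # processors, samples per processor
mu, v = 2.0, 4.0              # true mean and variance of X_i

# Row j holds the n samples drawn by processor j.
samples = rng.normal(mu, np.sqrt(v), size=(kappa, n))

E_j = samples.mean(axis=1)    # per-processor estimates, Var(E_j) = v/n
E = E_j.mean()                # combined estimate, Var(E) = v/(kappa*n)

print(E)                      # close to mu = 2.0
```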

Revisiting the Poor Man's Monte Carlo

Assume now that the processors run at different speeds, so that processor j obtains n_j results. Then

    E = ( Σ_{j=1}^κ n_j E_j ) / ( Σ_{j=1}^κ n_j )

Assume that the n_j are known a priori. Hence, if processor j has not reached n_j results after a prescribed time, there is a problem with that processor.

Revisiting the Poor Man's Monte Carlo

Let

    Z_j = 0  if processor j does not report n_j results,
    Z_j = 1  otherwise.

Then

    E = ( Σ_{j=1}^κ Z_j n_j E_j ) / ( Σ_{j=1}^κ n_j Z_j )

This introduces no bias. One possibility: the master process assigns the starting values (random number seeds), and lost results are re-computed.
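The indicator-weighted combination can be sketched as follows (the sample counts, failure pattern and all names here are illustrative, not part of the lecture):

```python
import numpy as np

# Fault-tolerant combination: processors that fail to report (Z_j = 0)
# are dropped, and the remaining per-processor means are weighted by
# their sample counts n_j.

rng = np.random.default_rng(1)
mu, v = 2.0, 4.0
n = np.array([5000, 8000, 12000, 7000, 9000, 6000])   # n_j per processor

E_j = np.array([rng.normal(mu, np.sqrt(v), nj).mean() for nj in n])

Z = np.array([1, 1, 0, 1, 1, 1])   # third processor failed to report

# E = sum_j Z_j n_j E_j / sum_j n_j Z_j -- still unbiased despite the loss
E = np.sum(Z * n * E_j) / np.sum(n * Z)
print(E)
```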

Revisiting the Poor Man's Monte Carlo

Dropping the initial non-stationary state values (a burn-in of B_j steps on processor j):

    E_j := ( 1 / (n_j - B_j + 1) ) Σ_{i=B_j}^{n_j} X_i^(j)

Then, as before,

    E = ( Σ_{j=1}^κ Z_j n_j E_j ) / ( Σ_{j=1}^κ n_j Z_j )

This will not lead to an exact linear speed-up.

Parallel Tempering

Basic idea [1, 2, 3]: Run several individual Markov chains, possibly as many as there are processors. Supplement the local configurational Metropolis moves with global swap moves that update an entire set of configurations. The replica exchanges are accepted with the Metropolis acceptance probability and rejected with the remaining probability. The swaps induce mixing between the Markov chains; detailed balance is enforced, and the sampled distributions remain stationary. This can be used, for example, as a means to accelerate the convergence of Monte Carlo simulations. Parallel tempering is widely employed in chemistry, physics, biology, engineering and materials science [4].

Parallel Tempering

[Diagram: Markov chains with starting states x_0^(1), ..., x_0^(10) and target distributions P^(1), ..., P^(10).]

Swap the elements of the Markov chains. For a given swap element, the swap moves induce a random walk across the chains.

Parallel Tempering

Let r ~ Uniform(0, 1) be a uniformly distributed random number. A swap between Markov chains i and j is accepted if

    r < ( P_i(X_j) P_j(X_i) ) / ( P_i(X_i) P_j(X_j) )

and rejected otherwise. The exchange step is not capable of destroying or creating new configurations; new configurations are generated only within one Markov chain.
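A minimal parallel-tempering sketch on a bimodal one-dimensional target may make the scheme concrete. The double-well potential, the temperature ladder and all parameter values below are assumptions chosen for illustration, not part of the lecture; the swap step implements the acceptance rule above (in log form):

```python
import numpy as np

# Chain k targets P_k(x) proportional to exp(-U(x)/T_k); hotter chains
# cross the barrier easily and feed mixing to the cold chain via swaps.

rng = np.random.default_rng(42)

def U(x):                                  # double-well, modes at +-1
    return 4.0 * (x**2 - 1.0)**2

temps = np.array([1.0, 2.0, 4.0, 8.0])     # temperature ladder
x = rng.normal(0, 1, size=temps.size)      # one state per chain

def log_P(x, T):
    return -U(x) / T

for step in range(5000):
    # local Metropolis move in every chain
    prop = x + rng.normal(0, 0.5, size=x.size)
    accept = np.log(rng.random(x.size)) < (U(x) - U(prop)) / temps
    x[accept] = prop[accept]

    # attempt one swap between a random pair of neighbouring chains,
    # accepted if log r < log[P_i(X_j) P_j(X_i) / (P_i(X_i) P_j(X_j))]
    i = rng.integers(0, temps.size - 1)
    j = i + 1
    log_ratio = (log_P(x[j], temps[i]) + log_P(x[i], temps[j])
                 - log_P(x[i], temps[i]) - log_P(x[j], temps[j]))
    if np.log(rng.random()) < log_ratio:
        x[i], x[j] = x[j], x[i]

print(x[0])   # cold-chain state, near one of the modes at +-1
```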

Parallel Tempering

[Figure 1: Schematic of a Bayesian nonlinear model fitting program that employs a hybrid parallel-tempering MCMC algorithm to carry out the Bayesian integrals. Inputs: data D, model M, prior information I; the target is the posterior p(X_α | D, M, I). An adaptive two-stage control system (1) automates the selection of an efficient set of Gaussian proposal distributions (σ's) using an annealing operation and (2) monitors the MCMC for the emergence of a significantly improved parameter set and resets the MCMC, including a gene-crossover algorithm to breed higher-probability chains. Outputs: best-fit model and residuals, marginals and 68.3% credible regions for X_α, and the marginal likelihood p(D | M, I) for model comparison. Taken from: P. C. Gregory, "Detecting Extra-solar Planets with a Bayesian hybrid MCMC Kepler periodogram", in JSM Proceedings 2008, Planets Around Other Suns Section, Denver, CO: American Statistical Association.]

Simulated Annealing

[Figure 2.8: Flow diagram of the Localized Random Search (LRS) algorithm. Initialize i = 0, x_opt = x_0, F_opt = F_obj(x_0). Iterate: propose x_{i+1} = x_i + Δx and evaluate F_obj(x_{i+1}); if x_{i+1} is feasible and F_obj(x_{i+1}) < F_opt, set x_opt = x_{i+1} and F_opt = F_obj(x_{i+1}); repeat until the stopping criteria are satisfied. Figure taken from: Silvia Mercedes Ochoa Cáceres, Plantwide Optimizing Control for the Continuous Bio-Ethanol Production Process, PhD thesis.]
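The flow above describes a pure downhill Localized Random Search; simulated annealing additionally accepts uphill moves with probability exp(-ΔF/T) under a cooling schedule, which lets the search escape local minima. A minimal sketch (the objective, step size and schedule are illustrative assumptions):

```python
import math
import random

# Simulated annealing on a one-dimensional objective with minimum at x = 3.
# Downhill moves are always accepted; uphill moves with prob exp(-dF/T),
# where the temperature T follows a geometric cooling schedule.

random.seed(0)

def F_obj(x):
    return (x - 3.0)**2 + 2.0

x = x_opt = 10.0
F = F_opt = F_obj(x)
T = 1.0                            # initial temperature

for i in range(20_000):
    x_new = x + random.gauss(0, 0.5)
    F_new = F_obj(x_new)
    if F_new < F or random.random() < math.exp(-(F_new - F) / T):
        x, F = x_new, F_new
        if F < F_opt:              # track the best point seen so far
            x_opt, F_opt = x, F
    T *= 0.9995                    # geometric cooling

print(x_opt, F_opt)                # near (3.0, 2.0)
```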

Simulated Annealing

[Figure 3.11: Flow diagram of a two-distillation-column system for ethanol purification; the beer column has no condenser, the rectification column a partial condenser. Figure taken from: Silvia Mercedes Ochoa Cáceres, Plantwide Optimizing Control for the Continuous Bio-Ethanol Production Process, PhD thesis.]

Literature

[1] R. H. Swendsen and J.-S. Wang, "Replica Monte Carlo Simulation of Spin-Glasses", Phys. Rev. Lett. 57(21), 2607-2609 (1986).
[2] C. J. Geyer, "Markov Chain Monte Carlo Maximum Likelihood", in Computing Science and Statistics: Proceedings of the 23rd Symposium on the Interface, E. M. Keramidas, ed., American Statistical Association, New York (1991).
[3] K. Hukushima and K. Nemoto, "Exchange Monte Carlo method and application to spin glass simulations", J. Phys. Soc. Japan 65, 1604 (1996).
[4] D. J. Earl and M. W. Deem, Phys. Chem. Chem. Phys. 7, 3910-3916 (2005).