Tutorial 11: Limit Theorems. Baoxiang Wang & Yihan Zhang (bxwang, yhzhang@cse.cuhk.edu.hk), April 10, 2017


Outline: The Central Limit Theorem (CLT); Normal Approximation Based on CLT; De Moivre-Laplace Approximation to the Binomial; Problems and solutions.

Formally: Let $S_n = X_1 + \cdots + X_n$, where the $X_i$ are i.i.d. random variables with mean $\mu$ and variance $\sigma^2$. Define
$$Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}} = \frac{X_1 + \cdots + X_n - n\mu}{\sigma\sqrt{n}}.$$

Zero mean and unit variance: An easy calculation yields
$$E[Z_n] = \frac{E[X_1 + \cdots + X_n] - n\mu}{\sigma\sqrt{n}} = 0.$$
For the variance, we have
$$\operatorname{var}(Z_n) = \frac{\operatorname{var}(X_1 + \cdots + X_n)}{\sigma^2 n} = \frac{n\sigma^2}{n\sigma^2} = 1.$$

The Central Limit Theorem (CLT): The CDF of $Z_n = \frac{X_1 + \cdots + X_n - n\mu}{\sigma\sqrt{n}}$ converges to the standard normal CDF
$$\Phi(z) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-x^2/2}\,dx$$
in the sense that $\lim_{n\to\infty} P(Z_n \le z) = \Phi(z)$. That is, the distribution of the r.v. $Z_n$ approaches the standard normal distribution.
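As a sanity check (not part of the original tutorial), the convergence can be illustrated by simulation. The helper `z_n` and the choice of uniform $X_i$ on $[0, 1]$ (so $\mu = 1/2$, $\sigma = \sqrt{1/12}$) are our own:

```python
import random
from math import sqrt

random.seed(0)

def z_n(n, trials=20000):
    """Simulate Z_n = (S_n - n*mu) / (sigma*sqrt(n)) for i.i.d. uniform
    X_i on [0, 1], where mu = 1/2 and sigma = sqrt(1/12)."""
    mu, sigma = 0.5, sqrt(1 / 12)
    out = []
    for _ in range(trials):
        s = sum(random.random() for _ in range(n))
        out.append((s - n * mu) / (sigma * sqrt(n)))
    return out

# Empirical P(Z_n <= 1) should be close to Phi(1) ~ 0.8413 already for n = 30.
samples = z_n(30)
frac = sum(1 for z in samples if z <= 1) / len(samples)
print(round(frac, 3))
```

The empirical fraction lands within Monte Carlo noise of $\Phi(1) \approx 0.8413$, reflecting how quickly the sum of uniforms normalizes.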

Normal Approximation Based on CLT: Given $S_n = X_1 + \cdots + X_n$, where the $X_i$ are i.i.d. random variables with mean $\mu$ and variance $\sigma^2$, if $n$ is large, the probability $P(S_n \le c)$ can be approximated by treating $S_n$ as if it were normal, according to the following procedure.
I. Calculate the mean $n\mu$ and variance $n\sigma^2$ of $S_n$.
II. Calculate the normalized value $z = (c - n\mu)/(\sigma\sqrt{n})$.
III. Use the approximation $P(S_n \le c) \approx \Phi(z)$.
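The three-step procedure can be sketched with Python's standard library. The helper names `phi` and `clt_cdf` are ours; `phi` uses the identity $\Phi(z) = \tfrac{1}{2}(1 + \operatorname{erf}(z/\sqrt{2}))$:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def clt_cdf(c, n, mu, sigma):
    """Approximate P(S_n <= c) for S_n = X_1 + ... + X_n, with the X_i
    i.i.d. of mean mu and standard deviation sigma, treating S_n as normal."""
    # Step I: the mean and variance of S_n are n*mu and n*sigma^2.
    # Step II: normalize c.
    z = (c - n * mu) / (sigma * sqrt(n))
    # Step III: use the standard normal CDF.
    return phi(z)

# Example: sum of 100 fair-coin indicators (mu = 0.5, sigma = 0.5);
# c = 50 is exactly the mean, so the approximation gives 1/2.
print(round(clt_cdf(50, 100, 0.5, 0.5), 4))  # -> 0.5
```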

De Moivre-Laplace Approximation to the Binomial: Plugging in $\mu = p$ and $\sigma = \sqrt{p(1-p)}$, we get the following de Moivre-Laplace approximation to the binomial. If $S_n$ is a binomial random variable with parameters $n$ and $p$, $n$ is large, and $k, l$ are nonnegative integers, then
$$P(k \le S_n \le l) \approx \Phi\!\left(\frac{l + 1/2 - np}{\sqrt{np(1-p)}}\right) - \Phi\!\left(\frac{k - 1/2 - np}{\sqrt{np(1-p)}}\right).$$
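The formula with the $1/2$ continuity correction can be compared directly against the exact binomial probability; the helper names `dml_approx` and `binom_exact` below are our own:

```python
from math import comb, erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def dml_approx(k, l, n, p):
    """De Moivre-Laplace approximation to P(k <= S_n <= l),
    S_n ~ Binomial(n, p), with the 1/2 continuity correction."""
    s = sqrt(n * p * (1 - p))
    return phi((l + 0.5 - n * p) / s) - phi((k - 0.5 - n * p) / s)

def binom_exact(k, l, n, p):
    """Exact P(k <= S_n <= l) by summing the binomial pmf."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, l + 1))

# P(45 <= S <= 55) for n = 100 fair-coin tosses: the two values agree
# to about three decimal places.
print(round(dml_approx(45, 55, 100, 0.5), 4))
print(round(binom_exact(45, 55, 100, 0.5), 4))
```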

Problem 8: Before starting to play roulette in a casino, you want to look for biases that you can exploit. You therefore watch 100 rounds that each result in a number between 1 and 36, and count the number of rounds for which the result is odd. If the count exceeds 55, you decide that the roulette is not fair.

Question: Assuming that the roulette is fair, find an approximation for the probability that you will make the wrong decision.

Solution: Let $S$ be the number of times that the result was odd, which is a binomial random variable with $n = 100$ and $p = 0.5$, so that $E[S] = 100 \cdot 0.5 = 50$ and $\sigma_S = \sqrt{100 \cdot 0.5 \cdot 0.5} = 5$.

Using the normal approximation to the binomial, we find
$$P(S > 55) = P\!\left(\frac{S - 50}{5} > \frac{55 - 50}{5}\right) = 1 - P(Z \le 1) \approx 1 - \Phi(1) = 1 - 0.8413 = 0.1587.$$

Alternative solution: A better approximation can be obtained by using the de Moivre-Laplace approximation, which yields
$$P(S > 55) = P(S \ge 55.5) = P\!\left(\frac{S - 50}{5} \ge \frac{55.5 - 50}{5}\right) \approx 1 - \Phi(1.1) = 1 - 0.8643 = 0.1357.$$
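Both variants, together with the exact binomial tail, can be checked numerically; this sketch uses only the standard library, and the variable names are our own:

```python
from math import comb, erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

n, p = 100, 0.5
mean, sd = n * p, sqrt(n * p * (1 - p))      # 50 and 5

plain = 1 - phi((55 - mean) / sd)            # no continuity correction
corrected = 1 - phi((55.5 - mean) / sd)      # de Moivre-Laplace
# Exact P(S > 55) = P(S >= 56); with p = 0.5 every outcome has weight 0.5**n.
exact = sum(comb(n, k) for k in range(56, n + 1)) * 0.5**n

print(round(plain, 4), round(corrected, 4), round(exact, 4))
```

The continuity-corrected value sits noticeably closer to the exact tail probability than the plain normal approximation.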

Problem 9: During each day, the probability that your computer's operating system crashes at least once is 5%, independent of every other day. You are interested in the probability of at least 45 crash-free days out of the next 50 days.

Question: (a) Find the probability of interest by using the normal approximation to the binomial. (b) Repeat part (a), this time using the Poisson approximation to the binomial.

Solution (a): Let $S$ be the number of crash-free days, which is a binomial random variable with $n = 50$ and $p = 0.95$, so that $E[S] = 50 \cdot 0.95 = 47.5$ and $\sigma_S = \sqrt{50 \cdot 0.95 \cdot 0.05} \approx 1.54$.

Using the normal approximation to the binomial, we find
$$P(S \ge 45) = P\!\left(\frac{S - 47.5}{1.54} \ge \frac{45 - 47.5}{1.54}\right) \approx 1 - \Phi(-1.62) = \Phi(1.62) = 0.9474.$$

Using the de Moivre-Laplace approximation, we obtain
$$P(S \ge 45) = P(S > 44.5) = P\!\left(\frac{S - 47.5}{1.54} > \frac{44.5 - 47.5}{1.54}\right) = 1 - P(Z \le -1.95) \approx 1 - \Phi(-1.95) = \Phi(1.95) = 0.9744.$$
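Both normal-approximation values for part (a) can be reproduced in a few lines; this is a sketch with our own variable names, using only the standard library:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

n, p = 50, 0.95
mean, sd = n * p, sqrt(n * p * (1 - p))   # 47.5 and about 1.54

plain = 1 - phi((45 - mean) / sd)          # P(S >= 45), no correction
corrected = 1 - phi((44.5 - mean) / sd)    # with continuity correction
print(round(plain, 4), round(corrected, 4))
```

The small discrepancies from the slide values (0.9474 and 0.9744) come from the slides rounding $z$ to two decimals before looking up the table.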

Solution (b): The random variable $S$ is binomial with parameters $n = 50$ and $p = 0.95$. However, the random variable $50 - S$ (the number of crashes) is also binomial, with parameters $n = 50$ and $p = 0.05$. Since the Poisson approximation is exact in the limit of small $p$ and large $n$, it will give more accurate results if applied to $50 - S$.

We will therefore approximate $50 - S$ by a Poisson random variable with parameter $\lambda = 50 \cdot 0.05 = 2.5$. Thus,
$$P(S \ge 45) = P(50 - S \le 5) = \sum_{k=0}^{5} P(50 - S = k) \approx \sum_{k=0}^{5} e^{-\lambda} \frac{\lambda^k}{k!} = 0.958.$$

It is instructive to compare with the exact probability, which is
$$\sum_{k=0}^{5} \binom{50}{k} 0.05^k\, 0.95^{50-k} = 0.962.$$
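The Poisson sum and the exact binomial sum above can be evaluated directly; the variable names in this sketch are our own:

```python
from math import comb, exp, factorial

n, p = 50, 0.95
lam = n * (1 - p)  # Poisson parameter 2.5 for the number of crashes

# Poisson approximation: P(S >= 45) = P(number of crashes <= 5)
poisson = sum(exp(-lam) * lam**k / factorial(k) for k in range(6))

# Exact binomial probability of at most 5 crashes
exact = sum(comb(n, k) * 0.05**k * 0.95**(n - k) for k in range(6))

print(round(poisson, 3), round(exact, 3))
```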

Interpretation: The Poisson approximation is closer. This is consistent with the intuition that the normal approximation to the binomial works well when $p$ is close to 0.5 or $n$ is very large, which is not the case here. On the other hand, the calculations based on the normal approximation are generally less tedious.

Problem 11: Let $X_1, Y_1, X_2, Y_2, \ldots$ be independent random variables, uniformly distributed in the unit interval $[0, 1]$, and let
$$W = \frac{X_1 + \cdots + X_{16} - (Y_1 + \cdots + Y_{16})}{16}.$$

Question: Find a numerical approximation to the quantity $P(|W - E[W]| < 0.001)$.

Solution: Note that $W$ is the sample mean of 16 independent identically distributed random variables of the form $X_i - Y_i$, and a normal approximation is appropriate. The random variables $X_i - Y_i$ have zero mean, and variance equal to $2/12$ (each uniform variable contributes variance $1/12$). Therefore, the mean of $W$ is zero, and its variance is $(2/12)/16 = 1/96$.

Thus,
$$P(|W| < 0.001) = P\!\left(\frac{|W|}{1/\sqrt{96}} < \frac{0.001}{1/\sqrt{96}}\right) \approx 2\Phi(0.001\sqrt{96}) - 1 = 2\Phi(0.0098) - 1 \approx 2 \cdot 0.504 - 1 = 0.008.$$

Alternative solution: Let us also point out a different approach that bypasses the need for the normal table. Let $Z$ be a normal random variable with zero mean and standard deviation equal to $1/\sqrt{96}$. The standard deviation of $Z$, which is about 0.1, is much larger than 0.001. Thus, within the interval $[-0.001, 0.001]$, the PDF of $Z$ is approximately constant.

Using $P(z - \delta \le Z \le z + \delta) \approx f_Z(z) \cdot 2\delta$ with $z = 0$ and $\delta = 0.001$, we obtain
$$P(|W| < 0.001) = P(-0.001 \le Z \le 0.001) \approx f_Z(0) \cdot 0.002 = \frac{0.002}{\sqrt{2\pi \cdot (1/96)}} \approx 0.0078.$$
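Both approaches to Problem 11 can be checked numerically; in this sketch (variable names ours), `table` follows the normal-table route and `flat` the constant-PDF shortcut:

```python
from math import erf, pi, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

var_w = (2 / 12) / 16          # variance of W is 1/96
sd_w = sqrt(var_w)

# Approach 1: P(|W| < 0.001) via the normal CDF at +/- 0.001.
table = 2 * phi(0.001 / sd_w) - 1

# Approach 2: the PDF is roughly constant near 0, so use f_Z(0) * 2*delta.
pdf_at_0 = 1 / (sd_w * sqrt(2 * pi))
flat = pdf_at_0 * 0.002

print(round(table, 4), round(flat, 4))
```

The two values agree to about four decimal places, confirming that the constant-PDF shortcut is essentially the first-order expansion of the CDF at 0.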