STAT/MATH 395 PROBABILITY II

STAT/MATH 395 PROBABILITY II Distribution of Random Samples & Limit Theorems Néhémy Lim University of Washington Winter 2017

Outline
- Distribution of i.i.d. Samples
- Convergence of random variables
- The Laws of Large Numbers
- The Central Limit Theorem

Definition (i.i.d. Sample)
Let X_1, ..., X_n be a collection of n ∈ ℕ random variables on a probability space (Ω, A, P). (X_1, ..., X_n) is a random sample of size n if and only if:
(1) X_1, ..., X_n are mutually independent;
(2) X_1, ..., X_n are identically distributed.
We say that X_1, ..., X_n are independent and identically distributed (abbreviated i.i.d.).

Definition (Sample mean)
Let (X_1, ..., X_n) be a random sample. Then the random variable

X̄_n = (1/n) ∑_{i=1}^n X_i    (1)

is called the sample mean.
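As a minimal sketch of these two definitions, the following draws an i.i.d. sample and computes its sample mean; the Uniform(0, 1) distribution is an illustrative choice, not one fixed by the slides.

```python
import random

# Draw an i.i.d. sample of size n from a chosen distribution
# (here Uniform(0, 1), an illustrative choice) and compute the
# sample mean X̄_n = (1/n) * sum(X_i), as in equation (1).
random.seed(0)
n = 1000
sample = [random.random() for _ in range(n)]   # X_1, ..., X_n i.i.d.
sample_mean = sum(sample) / n                  # X̄_n

# For Uniform(0, 1), E[X_1] = 0.5, so X̄_n should be close to 0.5.
```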

Definition (Independence)
Let X_1, ..., X_n be n ∈ ℕ real-valued random variables (rrvs) on a probability space (Ω, A, P). Then X_1, ..., X_n are said to be independent if and only if:
(Discrete case) For all (x_1, ..., x_n) ∈ X_1(Ω) × ... × X_n(Ω),

p_{X_1...X_n}(x_1, ..., x_n) = ∏_{i=1}^n p_{X_i}(x_i)    (2)

(Continuous case) For all (x_1, ..., x_n) ∈ ℝ^n,

f_{X_1...X_n}(x_1, ..., x_n) = ∏_{i=1}^n f_{X_i}(x_i)    (3)

Property (Distribution of an i.i.d. sample)
Let (X_1, ..., X_n) be a random sample of size n ∈ ℕ on a probability space (Ω, A, P). Then:
(Discrete case) For all (x_1, ..., x_n) ∈ X_1(Ω) × ... × X_n(Ω),

p_{X_1...X_n}(x_1, ..., x_n) = ∏_{i=1}^n p_{X_1}(x_i)    (4)

(Continuous case) For all (x_1, ..., x_n) ∈ ℝ^n,

f_{X_1...X_n}(x_1, ..., x_n) = ∏_{i=1}^n f_{X_1}(x_i)    (5)

Example. Let (X_1, ..., X_n) be a random sample of size n ∈ ℕ from an exponential distribution with parameter λ. Give the joint pdf of the sample.
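A sketch of the example's answer: by the factorization in (5), the joint pdf of an i.i.d. Exponential(λ) sample is λ^n e^{−λ(x_1 + ... + x_n)} whenever every x_i > 0, and 0 otherwise. The helper names below are hypothetical, chosen only for this check.

```python
import math
import random

# For an i.i.d. Exponential(lam) sample, property (5) gives the joint pdf
# as a product of identical marginals, which collapses to the closed form
# lam**n * exp(-lam * sum(x_i)) when every x_i > 0. We check both agree.
def exp_pdf(x, lam):
    return lam * math.exp(-lam * x) if x > 0 else 0.0

def joint_pdf(xs, lam):
    p = 1.0
    for x in xs:
        p *= exp_pdf(x, lam)
    return p

random.seed(1)
lam = 2.0
xs = [random.expovariate(lam) for _ in range(5)]
closed_form = lam ** len(xs) * math.exp(-lam * sum(xs))
```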

Definition (Expected Value)
Let X_1, ..., X_n be n ∈ ℕ rrvs on a probability space (Ω, A, P) and let g : ℝ^n → ℝ. Then the mathematical expectation of g(X_1, ..., X_n), if it exists, is:
(Discrete case)

E[g(X_1, ..., X_n)] = ∑_{x_1 ∈ X_1(Ω)} ... ∑_{x_n ∈ X_n(Ω)} g(x_1, ..., x_n) p_{X_1...X_n}(x_1, ..., x_n)    (6)

(Continuous case)

E[g(X_1, ..., X_n)] = ∫_{ℝ^n} g(x_1, ..., x_n) f_{X_1...X_n}(x_1, ..., x_n) dx_1 ... dx_n    (7)

Theorem
Let X_1, ..., X_n be n ∈ ℕ independent rrvs on a probability space (Ω, A, P) and let g_1, ..., g_n be n real-valued functions on ℝ. Then,

E[g_1(X_1) ⋯ g_n(X_n)] = E[g_1(X_1)] ⋯ E[g_n(X_n)]    (8)

provided that the expectations exist.

Theorem (Variance of a sum of independent rrvs)
Let X_1, ..., X_n be n ∈ ℕ independent rrvs on a probability space (Ω, A, P). Then,

Var(∑_{i=1}^n X_i) = ∑_{i=1}^n Var(X_i)    (9)

provided that the variances exist.
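A quick Monte Carlo check of (9), with two independent variables whose distributions are illustrative choices rather than anything fixed by the slides:

```python
import random
import statistics

# Numerical check of (9): for independent X_1, X_2 the variance of the sum
# equals the sum of the variances. Here X_1 ~ Uniform(0, 1) (variance 1/12)
# and X_2 ~ Exponential(1) (variance 1), both illustrative choices.
random.seed(7)
reps = 50000
sums = [random.random() + random.expovariate(1.0) for _ in range(reps)]
var_sum = statistics.variance(sums)   # estimates Var(X_1 + X_2)
expected = 1 / 12 + 1.0               # Var(X_1) + Var(X_2)
```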

Property
Let (X_1, ..., X_n) be a random sample of size n ∈ ℕ on a probability space (Ω, A, P) with mean µ = E[X_1] and variance σ² = Var(X_1) < ∞. Then

E[X̄_n] = µ    (10)

and

Var(X̄_n) = σ²/n    (11)
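A simulation sketch of (10) and (11), using Exponential(1) as an illustrative sampling distribution (so µ = 1 and σ² = 1):

```python
import random
import statistics

# Monte Carlo check of (10) and (11): for Exponential(1), mu = 1 and
# sigma^2 = 1, so the sample mean of n = 20 draws should itself have
# mean 1 and variance 1/20 = 0.05 across many replications.
random.seed(2)
n, reps = 20, 20000
means = [sum(random.expovariate(1.0) for _ in range(n)) / n for _ in range(reps)]

mean_of_means = statistics.fmean(means)     # estimates E[X̄_n] = mu
var_of_means = statistics.variance(means)   # estimates Var(X̄_n) = sigma^2 / n
```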

Outline
- Distribution of i.i.d. Samples
- Convergence of random variables
- The Laws of Large Numbers
- The Central Limit Theorem

Definition
Let (X_n)_{n ∈ ℕ} be a sequence of rrvs on a probability space (Ω, A, P) and let X be a rrv on the same probability space. The sequence (X_n) is said to converge in probability towards X if, for all ε > 0:

lim_{n→∞} P(|X_n − X| > ε) = 0    (12)

Convergence in probability is denoted as follows: X_n →^P X.

Example
Let X be a discrete rrv with pmf p_X defined by:

p_X(x) = 1/3 if x = 1
         2/3 if x = 0
         0   otherwise

and let X_n = (1 + 1/n) X.
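In this example the deviation |X_n − X| = X/n is either 0 or 1/n, so for any ε > 0 the probability P(|X_n − X| > ε) equals 1/3 while 1/n > ε and drops to 0 for all larger n, which is exactly (12). A sketch of that exact computation (the helper name is hypothetical):

```python
# In the example, X is 1 w.p. 1/3 and 0 w.p. 2/3, and X_n = (1 + 1/n) X,
# so |X_n - X| = X / n. Hence for eps > 0:
#   P(|X_n - X| > eps) = 1/3  while 1/n > eps,  and 0 once n >= 1/eps,
# which tends to 0 as n grows -- convergence in probability (12).
def prob_deviation(n, eps):
    # |X_n - X| equals 1/n with probability 1/3, and 0 with probability 2/3
    return 1 / 3 if 1 / n > eps else 0.0

eps = 0.01
probs = [prob_deviation(n, eps) for n in (10, 100, 1000)]
```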

Definition
Let (X_n)_{n ∈ ℕ} be a sequence of rrvs on a probability space (Ω, A, P) and let X be a rrv on the same probability space. The sequence (X_n) is said to converge almost surely (or almost everywhere, or with probability 1, or strongly) towards X if:

P(lim_{n→∞} X_n = X) = 1    (13)

Almost sure convergence is denoted as follows: X_n →^{a.s.} X.

Definition
Let (X_n)_{n ∈ ℕ} be a sequence of rrvs on a probability space (Ω, A, P). For any n, the distribution function of X_n is denoted by F_n. Let X be a rrv with distribution function F_X. The sequence (X_n) is said to converge in distribution (or converge weakly) towards X if:

lim_{n→∞} F_n(x) = F_X(x)    (14)

for all x ∈ ℝ at which F_X is continuous. Convergence in distribution is denoted as follows: X_n →^D X.

Example. Let (X_n)_{n ∈ ℕ} be a sequence of rrvs with cdf F_n given by

F_n(x) = (1 − (1 − 1/n)^{nx}) 1_{(0,∞)}(x)
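For this example, (1 − 1/n)^{nx} → e^{−x} as n → ∞, so F_n(x) → 1 − e^{−x} for x > 0: the sequence converges in distribution to an Exponential(1) variable. A quick numerical check of that limit:

```python
import math

# For the example cdf F_n(x) = (1 - (1 - 1/n)**(n*x)) for x > 0,
# (1 - 1/n)**(n*x) -> exp(-x) as n -> infinity, so F_n(x) -> 1 - exp(-x):
# the cdf of an Exponential(1) random variable.
def F_n(x, n):
    return 1 - (1 - 1 / n) ** (n * x) if x > 0 else 0.0

x = 1.5
limit = 1 - math.exp(-x)     # Exponential(1) cdf at x
approx = F_n(x, 10**6)       # F_n at a large n
```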

Theorem
Let (X_n)_{n ∈ ℕ} be a sequence of rrvs on a probability space (Ω, A, P) with respective mgfs M_n. Let X be a rrv with mgf M_X. If

lim_{n→∞} M_n(x) = M_X(x)    (15)

for all x ∈ ℝ where M_n(x) and M_X(x) exist, then the sequence (X_n) converges in distribution to X.

Outline
- Distribution of i.i.d. Samples
- Convergence of random variables
- The Laws of Large Numbers
- The Central Limit Theorem

Property (Markov's Inequality)
Let X be a rrv that takes only nonnegative values. Then, for any a > 0, we have:

P(X ≥ a) ≤ E[X]/a    (16)

Property (Bienaymé-Chebyshev's Inequality)
Let X be a rrv whose expectation and variance exist. Then, for any α > 0, we have:

P(|X − E[X]| ≥ α) ≤ Var(X)/α²    (17)
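An empirical sanity check of both inequalities, with Exponential(1) (so E[X] = 1 and Var(X) = 1) as an illustrative choice of distribution:

```python
import random
import statistics

# Markov (16): for nonnegative X, P(X >= a) <= E[X]/a.
# Simulate X ~ Exponential(1) and compare the tail frequency with the bound.
random.seed(3)
xs = [random.expovariate(1.0) for _ in range(50000)]
a = 3.0
p_tail = sum(x >= a for x in xs) / len(xs)   # estimates P(X >= a)
markov_bound = statistics.fmean(xs) / a      # E[X]/a

# Chebyshev (17): P(|X - E[X]| >= alpha) <= Var(X)/alpha**2, with Var(X) = 1.
alpha = 2.0
p_dev = sum(abs(x - 1.0) >= alpha for x in xs) / len(xs)
cheby_bound = 1.0 / alpha**2
```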

Theorem (Weak law of large numbers)
Let (X_n)_{n ∈ ℕ} be a sequence of i.i.d. rrvs, each having finite expectation. The weak law of large numbers (also called Khintchine's law) states that the sample mean X̄_n converges in probability towards E[X_1], that is, for all ε > 0:

lim_{n→∞} P(|X̄_n − E[X_1]| > ε) = 0    (18)

Theorem (Strong law of large numbers)
Let (X_n)_{n ∈ ℕ} be a sequence of i.i.d. rrvs, each having finite expectation. The strong law of large numbers (also called Kolmogorov's strong law) states that the sample mean X̄_n converges almost surely towards E[X_1], that is:

P(lim_{n→∞} X̄_n = E[X_1]) = 1    (19)
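A simulation sketch of the laws of large numbers, using fair coin flips (an illustrative choice) so that E[X_1] = 0.5:

```python
import random

# The running sample mean of i.i.d. Bernoulli(0.5) coin flips should
# settle near E[X_1] = 0.5 as n grows, illustrating (18) and (19).
random.seed(4)
flips = [random.random() < 0.5 for _ in range(100000)]   # X_i in {0, 1}
partial = [sum(flips[:n]) / n for n in (100, 10000, 100000)]  # X̄_n at a few n
```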

Outline
- Distribution of i.i.d. Samples
- Convergence of random variables
- The Laws of Large Numbers
- The Central Limit Theorem

Property
Let X_1, ..., X_n be n ∈ ℕ independent rrvs on a probability space (Ω, A, P) with respective moment generating functions M_1, ..., M_n. Then the moment generating function of

S_n = ∑_{i=1}^n X_i

is:

M_{S_n}(x) = ∏_{i=1}^n M_i(x)    (20)

for all x ∈ ℝ where the M_i(x) exist.

Corollary (Mgf of an i.i.d. sample)
Let (X_1, ..., X_n) be a random sample of size n ∈ ℕ on a probability space (Ω, A, P) with moment generating function M = M_{X_1}. Then the moment generating function of

S_n = ∑_{i=1}^n X_i

is:

M_{S_n}(x) = (M(x))^n    (21)

for all x ∈ ℝ where M(x) exists.
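As a sketch of (21), take Bernoulli(p) as an illustrative choice: M(x) = 1 − p + p e^x, so S_n (a Binomial(n, p) variable) has mgf (1 − p + p e^x)^n. Below, the closed form is compared with a Monte Carlo estimate of E[e^{x S_n}].

```python
import math
import random
import statistics

# Corollary (21) for Bernoulli(p): M(x) = 1 - p + p*e^x, so the sum S_n of
# n i.i.d. Bernoulli(p) draws has mgf M(x)**n. We compare the closed form
# with a Monte Carlo estimate of E[exp(x * S_n)].
random.seed(5)
n, p, x = 5, 0.3, 0.2
M = 1 - p + p * math.exp(x)    # mgf of one Bernoulli(p) at x
closed_form = M ** n           # mgf of S_n at x, by (21)

def draw_S():
    return sum(random.random() < p for _ in range(n))

mc_estimate = statistics.fmean(math.exp(x * draw_S()) for _ in range(50000))
```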

Theorem (Central Limit Theorem)
Let (X_n)_{n ∈ ℕ} be a sequence of i.i.d. rrvs, each having expectation E[X_1] = µ and finite variance Var(X_1) = σ² < ∞. The Central Limit Theorem states that the sequence of variables (Z_n)_{n ∈ ℕ} defined by:

Z_n = (X̄_n − µ) / √(σ²/n)

converges in distribution towards Z following a standard normal distribution N(0, 1), that is:

lim_{n→∞} F_{Z_n}(x) = Φ(x), for all x ∈ ℝ    (22)
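A simulation sketch of the theorem: standardize the sample mean of Exponential(1) draws (an illustrative choice, with µ = 1 and σ² = 1) and check that Z_n behaves like a standard normal at one point of its cdf.

```python
import math
import random

# CLT illustration: for n = 50 i.i.d. Exponential(1) draws (mu = 1,
# sigma^2 = 1), the standardized mean Z_n should be approximately
# N(0, 1), so P(Z_n <= 0) should be close to Phi(0) = 0.5.
random.seed(6)
n, reps = 50, 10000

def z_n():
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    return (xbar - 1.0) / math.sqrt(1.0 / n)   # Z_n = (X̄_n - mu)/sqrt(sigma^2/n)

zs = [z_n() for _ in range(reps)]
p_le_0 = sum(z <= 0 for z in zs) / reps        # should approach Phi(0) = 0.5
```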

Example
Let (X_1, ..., X_15) be a random sample with probability density function:

f(x) = (3/2) x² 1_{(−1,1)}(x)

What is the approximate probability that the sample mean X̄_15 falls between −2/5 and 1/5?
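A sketch of how the CLT approximation goes here: by symmetry µ = 0, and σ² = E[X²] = ∫_{−1}^{1} (3/2) x⁴ dx = 3/5, so Var(X̄_15) = (3/5)/15 = 1/25 and sd(X̄_15) = 1/5. The requested probability is then approximately Φ(1) − Φ(−2). The helper below uses the error function for Φ.

```python
import math

# CLT approximation for the example: mu = 0, sigma^2 = 3/5, so
# sd(X̄_15) = sqrt((3/5)/15) = 1/5 and
# P(-2/5 < X̄_15 < 1/5) ~ Phi(1) - Phi(-2).
def Phi(z):
    # standard normal cdf via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

sigma2 = 3 / 5
sd_mean = math.sqrt(sigma2 / 15)                              # = 1/5
approx_prob = Phi((1/5) / sd_mean) - Phi((-2/5) / sd_mean)    # Phi(1) - Phi(-2)
```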

Example
Let (X_n) be a sequence of i.i.d. Poisson random variables with mean 3. Estimate approximately how large n must be such that:

P(|(1/n) ∑_{i=1}^n X_i − 3| > 0.1) = 0.1
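A sketch of the CLT computation for this example: for Poisson with mean 3, µ = σ² = 3, so X̄_n is approximately N(3, 3/n), and the condition P(|X̄_n − 3| > 0.1) = 0.1 requires 0.1/√(3/n) to equal z_{0.95} ≈ 1.6449, the standard normal 95th percentile (a standard-table value). Solving gives n = 3 (z_{0.95}/0.1)².

```python
import math

# By the CLT, X̄_n ~ N(3, 3/n) approximately, so the two-sided tail
# condition P(|X̄_n - 3| > 0.1) = 0.1 means 0.1 / sqrt(3/n) = z_{0.95}.
z_95 = 1.6449                           # standard normal 95th percentile (table value)
n_required = 3 * (z_95 / 0.1) ** 2      # solve for n
n_required = math.ceil(n_required)      # round up to an integer sample size
```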