IEOR 165 Lecture 1 Probability Review

1 Definitions in Probability and Their Consequences

1.1 Defining Probability

A probability space (Ω, F, P) consists of three elements:

- A sample space Ω, which is the set of all possible outcomes.
- A σ-algebra F, which is a set of events, where an event is a set of outcomes.
- A measure P, which is a function that gives the probability of an event.

The function P satisfies certain properties, including: P(A) ≥ 0 for every event A; P(Ω) = 1; and P(A_1 ∪ A_2 ∪ ...) = P(A_1) + P(A_2) + ... for any countable collection A_1, A_2, ... of mutually exclusive events.

Some useful consequences of this definition are:

- For a sample space Ω = {o_1, ..., o_n} in which each outcome o_i is equally likely, it holds that P(o_i) = 1/n for all i = 1, ..., n.
- P(A^c) = 1 − P(A), where A^c denotes the complement of event A.
- For any two events A and B, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
- If A ⊆ B, then P(A) ≤ P(B).
- Consider a finite collection of mutually exclusive events B_1, ..., B_m such that B_1 ∪ ... ∪ B_m = Ω and P(B_i) > 0 for all i. For any event A, we have P(A) = Σ_{k=1}^m P(A ∩ B_k).

1.2 Conditional Probability

The conditional probability of A given B is defined as

    P[A | B] = P(A ∩ B) / P(B).

Some useful consequences of this definition are:
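For a finite sample space with equally likely outcomes, the inclusion-exclusion identity and the definition of conditional probability can be checked by direct enumeration. The following is a minimal sketch (the two-dice events A and B are illustrative choices, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered rolls of two fair dice, each outcome equally likely.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    # P(A) = |A| / |Omega| when all outcomes are equally likely.
    return Fraction(len([o for o in omega if o in event]), len(omega))

A = {o for o in omega if o[0] + o[1] == 7}   # event: the sum is 7
B = {o for o in omega if o[0] == 3}          # event: the first die shows 3

# Inclusion-exclusion: P(A u B) = P(A) + P(B) - P(A n B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)

# Conditional probability: P[A | B] = P(A n B) / P(B)
p_a_given_b = prob(A & B) / prob(B)
print(p_a_given_b)  # 1/6
```

Here A ∩ B = {(3, 4)}, so P[A | B] = (1/36)/(1/6) = 1/6, which also shows A and B are independent in this particular example.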

Law of Total Probability: Consider a finite collection of mutually exclusive events B_1, ..., B_m such that B_1 ∪ ... ∪ B_m = Ω and P(B_i) > 0 for all i. For any event A, we have

    P(A) = Σ_{k=1}^m P[A | B_k] P(B_k).

Bayes' Theorem: It holds that

    P[B | A] = P[A | B] P(B) / P(A).

1.3 Independence

Two events A_1 and A_2 are defined to be independent if and only if P(A_1 ∩ A_2) = P(A_1) P(A_2). Multiple events A_1, A_2, ..., A_m are mutually independent if and only if for every subset of events {A_{i_1}, ..., A_{i_n}} ⊆ {A_1, ..., A_m}, the following holds:

    P(∩_{k=1}^n A_{i_k}) = Π_{k=1}^n P(A_{i_k}).

Multiple events A_1, A_2, ..., A_m are pairwise independent if and only if every pair of events is independent, meaning P(A_n ∩ A_k) = P(A_n) P(A_k) for all distinct pairs of indices n, k. Note that pairwise independence does not always imply mutual independence! Lastly, an important property is that if A and B are independent and P(B) > 0, then P[A | B] = P(A).

1.4 Random Variables

A random variable is a function X(ω) : Ω → B that maps the sample space Ω to a subset of the real numbers B ⊆ R, with the property that the set {ω : X(ω) = b} = X^{-1}(b) is an event for every b ∈ B. The cumulative distribution function (cdf) of a random variable X is defined by

    F_X(u) = P({ω : X(ω) ≤ u}).

The probability density function (pdf) of a random variable X is any function f_X(u) such that

    P(X ∈ A) = ∫_A f_X(u) du,

for any well-behaved set A.

1.5 Expectation

The expectation of g(X), where X is a random variable and g(·) is a function, is given by

    E(g(X)) = ∫ g(u) f_X(u) du.
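The warning that pairwise independence does not imply mutual independence can be verified with the classic two-coin example (a sketch constructed for illustration, not taken from the notes): flip two fair coins and let C be the event that the flips agree.

```python
from fractions import Fraction
from itertools import product

# Sample space: two independent fair coin flips, all four outcomes equally likely.
omega = list(product('HT', repeat=2))

def prob(event):
    # Each event is built as a subset of omega, so P(A) = |A| / |Omega|.
    return Fraction(len(event), len(omega))

A = {o for o in omega if o[0] == 'H'}    # first flip is heads
B = {o for o in omega if o[1] == 'H'}    # second flip is heads
C = {o for o in omega if o[0] == o[1]}   # the two flips agree

# Every pair of events is independent ...
assert prob(A & B) == prob(A) * prob(B)
assert prob(A & C) == prob(A) * prob(C)
assert prob(B & C) == prob(B) * prob(C)

# ... but the three events are not mutually independent:
# P(A n B n C) = 1/4, while P(A) P(B) P(C) = 1/8.
assert prob(A & B & C) != prob(A) * prob(B) * prob(C)
```

Knowing any one of A, B, C tells you nothing about any other one alone, yet any two of them together determine the third, which is exactly what mutual independence rules out.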

Two important cases are the mean

    µ(X) = E(X) = ∫ u f_X(u) du,

and the variance

    σ²(X) = E((X − µ)²) = ∫ (u − µ)² f_X(u) du.

Two useful properties are that if λ is a constant, then

    E(λX) = λ E(X) and σ²(λX) = λ² σ²(X).

2 Common Distributions

2.1 Uniform Distribution

A random variable X with uniform distribution over support [a, b] is denoted by X ~ U(a, b), and it is the distribution with pdf

    f_X(u) = 1/(b − a), if u ∈ [a, b]; 0, otherwise.

The mean is µ = (a + b)/2, and the variance is σ² = (b − a)²/12.

2.2 Bernoulli Distribution

A random variable X with a Bernoulli distribution with parameter p has the pmf P(X = 1) = p and P(X = 0) = 1 − p. The mean is µ = p, and the variance is σ² = p(1 − p).

2.3 Binomial Distribution

A random variable X with a binomial distribution with n trials and success probability p has the pmf

    P(X = k) = (n choose k) p^k (1 − p)^{n−k}, for k ∈ {0, 1, ..., n}.

This distribution gives the probability of having k successes (choosing the value 1) after running n independent Bernoulli trials. The mean is µ = np, and the variance is σ² = np(1 − p).
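The binomial mean and variance formulas can be checked by summing the pmf directly, since the support is finite. A minimal sketch (the values n = 10, p = 0.3 are arbitrary):

```python
from math import comb

def binomial_pmf(k, n, p):
    # P(X = k) = C(n, k) p^k (1 - p)^(n - k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.3

# Compute E(X) and sigma^2(X) from the definition of expectation.
mean = sum(k * binomial_pmf(k, n, p) for k in range(n + 1))
var = sum((k - mean)**2 * binomial_pmf(k, n, p) for k in range(n + 1))

assert abs(mean - n * p) < 1e-12            # mu = np = 3.0
assert abs(var - n * p * (1 - p)) < 1e-12   # sigma^2 = np(1 - p) = 2.1
```

The same direct-summation check works for any distribution with finite support.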

2.4 Gaussian/Normal Distribution

A random variable X with Gaussian/normal distribution with mean µ and variance σ² is denoted by X ~ N(µ, σ²), and it is the distribution with pdf

    f_X(u) = (1/√(2πσ²)) exp(−(u − µ)²/(2σ²)).

For a set of iid (mutually independent and identically distributed) Gaussian random variables X_1, X_2, ..., X_n ~ N(µ, σ²), consider any linear combination of the random variables

    S = λ_1 X_1 + λ_2 X_2 + ... + λ_n X_n.

The mean of the linear combination is

    E(S) = µ Σ_{i=1}^n λ_i,

and the variance of the linear combination is

    σ²(S) = σ² Σ_{i=1}^n λ_i².

Note that in the special case where λ_i = 1/n (which is also called a sample average)

    X̄ = (1/n) Σ_{i=1}^n X_i,

we have that E(X̄) = µ and σ²(X̄) = σ²/n (which also implies that lim_{n→∞} σ²(X̄) = 0).

2.5 Chi-Squared Distribution

A random variable X with chi-squared distribution and k degrees of freedom is denoted by X ~ χ²(k), and it is the distribution of the random variable defined by Σ_{i=1}^k Z_i², where the Z_i ~ N(0, 1) are iid. The mean is E(X) = k, and the variance is σ²(X) = 2k.

2.6 Exponential Distribution

A random variable X with exponential distribution is denoted by X ~ E(λ), where λ > 0 is the rate, and it is the distribution with pdf

    f_X(u) = λ exp(−λu), if u ≥ 0; 0, otherwise.
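The formulas E(S) = µ Σ λ_i and σ²(S) = σ² Σ λ_i² for a linear combination of iid Gaussians can be checked by Monte Carlo simulation. A sketch, with arbitrary choices of µ, σ, and the coefficients λ_i (the seed fixes the result):

```python
import random
from statistics import fmean, pvariance

random.seed(0)

mu, sigma, trials = 2.0, 3.0, 200_000
lam = [0.5, -1.0, 2.0, 0.25, 1.0]  # illustrative coefficients lambda_i

# Draw many realizations of S = sum_i lambda_i X_i with X_i iid N(mu, sigma^2).
samples = []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in lam]
    samples.append(sum(l * x for l, x in zip(lam, xs)))

# Theory: E(S) = mu * sum(lambda_i) = 5.5,
#         sigma^2(S) = sigma^2 * sum(lambda_i^2) = 56.8125
print(fmean(samples))      # close to 5.5
print(pvariance(samples))  # close to 56.8125
```

Setting lam = [1/n] * n in the same sketch recovers the sample-average case, with the empirical variance shrinking like σ²/n as n grows.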

The cdf is given by

    F_X(u) = 1 − exp(−λu), if u ≥ 0; 0, otherwise,

and so P(X > u) = exp(−λu) for u ≥ 0. The mean is µ = 1/λ, and the variance is σ² = 1/λ². One of the most important aspects of the exponential distribution is that it satisfies the memoryless property:

    P[X > s + t | X > t] = P(X > s), for all s, t ≥ 0.

2.7 Poisson Distribution

A random variable X with a Poisson distribution with parameter λ has the pmf

    P(X = k) = (λ^k / k!) exp(−λ), for k ∈ {0, 1, 2, ...}.

The mean is µ = λ, and the variance is σ² = λ.
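Both the memoryless property and the Poisson mean follow directly from the formulas above and can be verified numerically. A sketch, with arbitrary values of λ, s, and t (the Poisson sums are truncated at k = 50, where the remaining tail is negligible for this λ):

```python
from math import exp, factorial

lam = 1.5  # rate of the exponential / parameter of the Poisson

def tail(u):
    # P(X > u) = exp(-lam * u) for X ~ Exponential(lam)
    return exp(-lam * u)

# Memoryless property: P[X > s + t | X > t] = P(X > s + t) / P(X > t) = P(X > s)
s, t = 0.7, 2.3
assert abs(tail(s + t) / tail(t) - tail(s)) < 1e-12

# Poisson(lam): the pmf sums to 1 and the mean is lam.
pmf = [lam**k * exp(-lam) / factorial(k) for k in range(51)]
assert abs(sum(pmf) - 1.0) < 1e-10
assert abs(sum(k * p for k, p in enumerate(pmf)) - lam) < 1e-10
```

The memoryless check is just the algebraic identity exp(−λ(s + t))/exp(−λt) = exp(−λs); the exponential is the only continuous distribution with this property.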