Probability Distributions for Discrete RV


Probability Distributions for Discrete RV Definition The probability distribution or probability mass function (pmf) of a discrete rv X is defined for every number x by p(x) = P(X = x) = P(all s ∈ S : X(s) = x). In words, for every possible value x of the random variable, the pmf specifies the probability of observing that value when the experiment is performed. (The conditions p(x) ≥ 0 and Σ_{all possible x} p(x) = 1 are required for any pmf.)

Probability Distributions for Discrete RV Definition The cumulative distribution function (cdf) F(x) of a discrete rv X with pmf p(x) is defined for every number x by F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y). For any number x, F(x) is the probability that the observed value of X will be at most x. Note the contrast: F(x) = P(X ≤ x) = P(X is less than or equal to x), while p(x) = P(X = x) = P(X is exactly equal to x).

Probability Distributions for Discrete RV pmf ⇒ cdf: F(x) = P(X ≤ x) = Σ_{y: y ≤ x} p(y). It is also possible to go the other way, cdf ⇒ pmf: p(x) = F(x) − F(x−), where x− represents the largest possible X value that is strictly less than x.

Probability Distributions for Discrete RV Proposition For any two numbers a and b with a ≤ b, P(a ≤ X ≤ b) = F(b) − F(a−), where a− represents the largest possible X value that is strictly less than a. In particular, if the only possible values are integers and if a and b are integers, then P(a ≤ X ≤ b) = P(X = a or a + 1 or ... or b) = F(b) − F(a − 1). Taking a = b yields P(X = a) = F(a) − F(a − 1) in this case.
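These cdf identities are easy to check numerically. A minimal sketch in Python (the pmf values below are hypothetical, chosen only for illustration):

```python
# Hypothetical pmf of an integer-valued discrete rv with values 0..3.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

def F(x, pmf=pmf):
    """cdf: F(x) = P(X <= x) = sum of p(y) over all y <= x."""
    return sum(p for y, p in pmf.items() if y <= x)

# For integer-valued X: P(a <= X <= b) = F(b) - F(a - 1).
a, b = 1, 2
prob = F(b) - F(a - 1)   # p(1) + p(2) = 0.5

# Taking a = b recovers the pmf: P(X = 2) = F(2) - F(1).
single = F(2) - F(1)     # p(2) = 0.3
```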

Expectations Definition Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X) or µ_X, is E(X) = µ_X = Σ_{x ∈ D} x · p(x).

e.g. (Problem 30) A group of individuals who have automobile insurance from a certain company is randomly selected. Let Y be the number of moving violations for which the individual was cited during the last 3 years. The pmf of Y is

y      0     1     2     3
p(y)  0.60  0.25  0.10  0.05

Then the expected number of moving violations for that group is µ_Y = E(Y) = 0 · 0.60 + 1 · 0.25 + 2 · 0.10 + 3 · 0.05 = 0.60.

Expectations

y      0     1     2     3
p(y)  0.60  0.25  0.10  0.05

Assume the total number of individuals in that group is 100; then there are 60 individuals with no moving violations, 25 with 1 moving violation, 10 with 2 moving violations, and 5 with 3 moving violations. The population mean is calculated as

µ = (0 · 60 + 1 · 25 + 2 · 10 + 3 · 5) / 100 = 0.60

µ = 0 · (60/100) + 1 · (25/100) + 2 · (10/100) + 3 · (5/100) = 0 · 0.60 + 1 · 0.25 + 2 · 0.10 + 3 · 0.05 = 0.60

The population size is irrelevant if we know the pmf!

Expectations Examples: Let X be a Bernoulli rv with pmf

p(x) = 1 − p if x = 0; p if x = 1; 0 if x ≠ 0 or 1.

Then the expected value of X is E(X) = 0 · p(0) + 1 · p(1) = p. We see that the expected value of a Bernoulli rv X is just the probability that X takes on the value 1.

Expectations Examples: Consider the card-drawing example again and assume we have infinitely many cards this time. Let X = the number of drawings until we get a card of the chosen suit. If the probability of getting that suit is α, then the pmf of X is

p(x) = α(1 − α)^(x−1) for x = 1, 2, 3, ...; 0 otherwise.

The expected value of X is

E(X) = Σ_{x ∈ D} x · p(x) = Σ_{x=1}^∞ x α(1 − α)^(x−1) = α Σ_{x=1}^∞ [−d/dα (1 − α)^x]

E(X) = α {−d/dα [Σ_{x=1}^∞ (1 − α)^x]} = α {−d/dα ((1 − α)/α)} = 1/α
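The identity E(X) = 1/α can be verified by summing the series directly. A small sketch (α = 0.25, matching one suit out of four, is an illustrative choice):

```python
# Geometric pmf: p(x) = alpha * (1 - alpha)**(x - 1) for x = 1, 2, 3, ...
alpha = 0.25

# Truncate the infinite sum where the tail is numerically negligible.
expected = sum(x * alpha * (1 - alpha) ** (x - 1) for x in range(1, 500))

# expected should be very close to 1 / alpha = 4.
```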

Expectations Example 3.20 Let X be the number of interviews a student has prior to getting a job. The pmf of X is

p(x) = k/x² for x = 1, 2, 3, ...; 0 otherwise,

where k is chosen so that Σ_{x=1}^∞ (k/x²) = 1. (It can be shown that Σ_{x=1}^∞ (1/x²) < ∞, which implies that such a k exists.) The expected value of X is

µ = E(X) = Σ_{x=1}^∞ x · (k/x²) = k Σ_{x=1}^∞ (1/x) = ∞!

The expected value is NOT finite! Heavy tail: a distribution with a large amount of probability far from µ.

Expectations Example (Problem 38) Let X = the outcome when a fair die is rolled once. If before the die is rolled you are offered either 1/3.5 dollars or 1/X dollars, would you accept the guaranteed amount or would you gamble?

x      1    2    3    4    5    6
p(x)  1/6  1/6  1/6  1/6  1/6  1/6
1/x    1   1/2  1/3  1/4  1/5  1/6

Then the expected dollars from gambling is

E(1/X) = Σ_{x=1}^6 (1/x) · p(x) = 1 · (1/6) + (1/2) · (1/6) + ... + (1/6) · (1/6) = 49/120 > 1/3.5

Expectations Proposition If the rv X has a set of possible values D and pmf p(x), then the expected value of any function h(X), denoted by E[h(X)] or µ_{h(X)}, is computed by

E[h(X)] = Σ_{x ∈ D} h(x) · p(x)
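The die-gambling example fits this proposition with h(x) = 1/x; a quick check with exact fractions (sketch):

```python
from fractions import Fraction

# Fair die: p(x) = 1/6 for x = 1..6, and h(x) = 1/x.
E_h = sum(Fraction(1, x) * Fraction(1, 6) for x in range(1, 7))

# E(1/X) = 49/120, which exceeds the guaranteed 1/3.5 = 2/7.
gamble_better = E_h > Fraction(2, 7)
```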

Expectations Example 3.23 A computer store has purchased three computers of a certain type at $500 apiece. It will sell them for $1000 apiece. The manufacturer has agreed to repurchase any computers still unsold after a specified period at $200 apiece. Let X denote the number of computers sold, and suppose that p(0) = 0.1, p(1) = 0.2, p(2) = 0.3, p(3) = 0.4. Let h(X) denote the profit associated with selling X units; then

h(X) = revenue − cost = 1000X + 200(3 − X) − 1500 = 800X − 900.

The expected profit is

E[h(X)] = h(0) · p(0) + h(1) · p(1) + h(2) · p(2) + h(3) · p(3) = (−900)(0.1) + (−100)(0.2) + (700)(0.3) + (1500)(0.4) = 700

Expectations Proposition E(aX + b) = a · E(X) + b. (Or, using alternative notation, µ_{aX+b} = a · µ_X + b.)

e.g. for the previous example, E[h(X)] = E(800X − 900) = 800 · E(X) − 900 = 700.

Corollary 1. For any constant a, E(aX) = a · E(X). 2. For any constant b, E(X + b) = E(X) + b.

Expectations Definition Let X have pmf p(x) and expected value µ. Then the variance of X, denoted by V(X) or σ²_X, or just σ², is

V(X) = Σ_{x ∈ D} (x − µ)² · p(x) = E[(X − µ)²]

The standard deviation (SD) of X is σ_X = √(σ²_X).

Expectations Example: For the previous example, the pmf is given as

x      0    1    2    3
p(x)  0.1  0.2  0.3  0.4

then the variance of X is

V(X) = σ² = Σ_{x=0}^3 (x − 2)² · p(x) = (0 − 2)²(0.1) + (1 − 2)²(0.2) + (2 − 2)²(0.3) + (3 − 2)²(0.4) = 1

Expectations Recall that for the sample variance s², we have

s² = S_xx/(n − 1) = [Σ x_i² − (Σ x_i)²/n] / (n − 1)

Proposition V(X) = σ² = [Σ_{x ∈ D} x² · p(x)] − µ² = E(X²) − [E(X)]²

e.g. for the previous example, the pmf is given as

x      0    1    2    3
p(x)  0.1  0.2  0.3  0.4

Then V(X) = E(X²) − [E(X)]² = 1² · 0.2 + 2² · 0.3 + 3² · 0.4 − (2)² = 1
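Both forms of V(X) give the same answer, as a short check on the computer-store pmf confirms (sketch):

```python
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

mu = sum(x * p for x, p in pmf.items())  # E(X) = 2.0

# Definition: V(X) = sum of (x - mu)^2 * p(x).
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())

# Shortcut: V(X) = E(X^2) - [E(X)]^2.
var_short = sum(x ** 2 * p for x, p in pmf.items()) - mu ** 2
```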

Expectations Proposition If h(X) is a function of a rv X, then

V[h(X)] = σ²_{h(X)} = Σ_{x ∈ D} {h(x) − E[h(X)]}² · p(x) = E[h(X)²] − {E[h(X)]}²

If h(X) is linear, i.e. h(X) = aX + b for some nonrandom constants a and b, then

V(aX + b) = σ²_{aX+b} = a² σ²_X and σ_{aX+b} = |a| σ_X

In particular, σ_{aX} = |a| σ_X and σ_{X+b} = σ_X.

Expectations Example 3.23 continued A computer store has purchased three computers of a certain type at $500 apiece. It will sell them for $1000 apiece. The manufacturer has agreed to repurchase any computers still unsold after a specified period at $200 apiece. Let X denote the number of computers sold, and suppose that p(0) = 0.1, p(1) = 0.2, p(2) = 0.3, p(3) = 0.4. Let h(X) denote the profit associated with selling X units; then h(X) = revenue − cost = 1000X + 200(3 − X) − 1500 = 800X − 900.

The variance of h(X) is

V[h(X)] = V[800X − 900] = 800² · V[X] = 640,000

And the SD is σ_{h(X)} = √V[h(X)] = 800.
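The linear-transform rule can be verified directly against the definition of V[h(X)] (sketch using the same pmf):

```python
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

def h(x):
    # Profit function from Example 3.23: h(x) = 800x - 900.
    return 800 * x - 900

E_h = sum(h(x) * p for x, p in pmf.items())                      # 700
var_direct = sum((h(x) - E_h) ** 2 * p for x, p in pmf.items())  # 640000

# Rule: V(aX + b) = a^2 * V(X); here a = 800 and V(X) = 1.
var_rule = 800 ** 2 * 1
```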

Binomial Distribution
1. The experiment consists of a sequence of n smaller experiments called trials, where n is fixed in advance of the experiment;
2. Each trial can result in one of the same two possible outcomes (dichotomous trials), which we denote by success (S) and failure (F);
3. The trials are independent, so that the outcome on any particular trial does not influence the outcome on any other trial;
4. The probability of success is constant from trial to trial; we denote this probability by p.

Definition An experiment for which Conditions 1–4 are satisfied is called a binomial experiment.

Binomial Distribution Examples:
1. If we toss a coin 10 times, then this is a binomial experiment with n = 10, S = Head, and F = Tail.
2. If we draw a card from a deck of well-shuffled cards with replacement, do this 5 times, and record whether the outcome is a card of the chosen suit or not, then this is also a binomial experiment. In this case, n = 5, S = that suit, and F = not that suit.
3. Again we draw a card from a deck of well-shuffled cards but without replacement, do this 5 times, and record whether the outcome is of the chosen suit or not. However, this time it is NO LONGER a binomial experiment:

P(suit on second | suit on first) = 12/51 ≈ 0.235 ≠ 0.25 = P(suit on second)

We do not have independence here!

Binomial Distribution Examples: 4. This time we draw a card from 100 decks of well-shuffled cards without replacement, do this 5 times, and record whether the outcome is of the chosen suit or not. Is it a binomial experiment?

P(suit on second draw | suit on first draw) = 1299/5199 ≈ 0.2499 ≈ 0.25
P(suit on sixth draw | suit on first five draws) = 1295/5195 ≈ 0.2492 ≈ 0.25
P(suit on tenth draw | no suit on first nine draws) = 1300/5191 ≈ 0.2504 ≈ 0.25
...

Although we still do not have independence, the conditional probabilities differ so slightly that we can regard these trials as independent with P(suit) = 0.25.

Binomial Distribution Rule Consider sampling without replacement from a dichotomous population of size N. If the sample size (number of trials) n is at most 5% of the population size, the experiment can be analyzed as though it were exactly a binomial experiment.

e.g. for the previous example, the population size is N = 5200 and the sample size is n = 5. We have n/N ≈ 0.1%, so we can apply the above rule.

Binomial Distribution Definition The binomial random variable X associated with a binomial experiment consisting of n trials is defined as

X = the number of S's among the n trials

Possible values for X in an n-trial experiment are x = 0, 1, 2, ..., n.

Notation We use X ~ Bin(n, p) to indicate that X is a binomial rv based on n trials with success probability p. We use b(x; n, p) to denote the pmf of X, and B(x; n, p) to denote the cdf of X, where

B(x; n, p) = P(X ≤ x) = Σ_{y=0}^x b(y; n, p)

Binomial Distribution Example: Assume we toss a coin 3 times and the probability of getting a head on each toss is p. Let X be the binomial random variable associated with this experiment. We tabulate all the possible outcomes, the corresponding X values, and probabilities in the following table:

Outcome  X  Probability    Outcome  X  Probability
HHH      3  p³             TTT      0  (1−p)³
HHT      2  p²(1−p)        TTH      1  p(1−p)²
HTH      2  p²(1−p)        THT      1  p(1−p)²
HTT      1  p(1−p)²        THH      2  p²(1−p)

e.g. b(2; 3, p) = P(HHT) + P(HTH) + P(THH) = 3p²(1 − p).

Binomial Distribution More generally, for the binomial pmf b(x; n, p), we have

b(x; n, p) = {number of sequences of length n consisting of x S's} × {probability of any particular such sequence}

where

{number of sequences of length n consisting of x S's} = (n choose x)

and

{probability of any particular such sequence} = p^x (1 − p)^(n−x)

Theorem

b(x; n, p) = (n choose x) p^x (1 − p)^(n−x) for x = 0, 1, 2, ..., n; 0 otherwise
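The theorem translates directly into code. A minimal sketch using Python's math.comb for the binomial coefficient:

```python
from math import comb

def b(x, n, p):
    """Binomial pmf: (n choose x) * p^x * (1-p)^(n-x) for x = 0..n."""
    if not 0 <= x <= n:
        return 0.0
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Matches the 3-toss table: b(2; 3, p) = 3 p^2 (1 - p).
p = 0.3
check = b(2, 3, p)

# As with any pmf, the probabilities sum to 1 over the possible values.
total = sum(b(x, 10, 0.4) for x in range(11))
```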

Binomial Distribution Example: (Problem 55) Twenty percent of all telephones of a certain type are submitted for service while under warranty. Of these, 75% can be repaired, whereas the other 25% must be replaced with new units. If a company purchases ten of these telephones, what is the probability that exactly two will end up being replaced under warranty?

Let X = the number of telephones that need replacement. Then

p = P(service and replace) = P(replace | service) · P(service) = 0.25 × 0.2 = 0.05

Now, P(X = 2) = b(2; 10, 0.05) = (10 choose 2) (0.05)²(1 − 0.05)^(10−2) = 0.0746
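A quick numerical check of Problem 55 (sketch):

```python
from math import comb

# P(replaced under warranty) = P(replace | service) * P(service).
p = 0.25 * 0.2                              # 0.05

# P(X = 2) for X ~ Bin(10, 0.05).
prob = comb(10, 2) * p ** 2 * (1 - p) ** 8  # about 0.0746
```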

Binomial Distribution Binomial Tables Table A.1 Cumulative Binomial Probabilities (page 664) tabulates B(x; n, p) = Σ_{y=0}^x b(y; n, p).

b. n = 10
           p
x    0.01   0.05   0.10  ...
0    .904   .599   .349  ...
1    .996   .914   .736  ...
2   1.000   .988   .930  ...
3   1.000   .999   .987  ...
...

Then for b(2; 10, 0.05), we have b(2; 10, 0.05) = B(2; 10, 0.05) − B(1; 10, 0.05) = .988 − .914 = .074
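The table lookup b(x; n, p) = B(x; n, p) − B(x − 1; n, p) can be reproduced numerically (sketch):

```python
from math import comb

def b(x, n, p):
    # Binomial pmf.
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def B(x, n, p):
    """Cumulative binomial probability B(x; n, p) = P(X <= x)."""
    return sum(b(y, n, p) for y in range(x + 1))

# Reproduce the Table A.1 entries for n = 10, p = 0.05.
diff = B(2, 10, 0.05) - B(1, 10, 0.05)  # about .988 - .914 = .074
```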

Binomial Distribution Mean and Variance Theorem If X ~ Bin(n, p), then E(X) = np, V(X) = np(1 − p) = npq, and σ_X = √(npq) (where q = 1 − p).

The idea is that X = Σ_{i=1}^n Y_i = Y_1 + Y_2 + ... + Y_n, where the Y_i's are independent Bernoulli random variables with success probability p, i.e.

Y = 1 with probability p; 0 with probability 1 − p.

E(Y) = p and V(Y) = (1 − p)² · p + (0 − p)² · (1 − p) = p(1 − p).

Therefore E(X) = np and V(X) = np(1 − p) = npq.
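The closed forms np and npq agree with the definitional sums over the pmf, which a short check confirms (n = 10, p = 0.3 are illustrative values):

```python
from math import comb

n, p = 10, 0.3
pmf = [comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)]

mean = sum(x * px for x, px in enumerate(pmf))                  # np = 3.0
var = sum(x ** 2 * px for x, px in enumerate(pmf)) - mean ** 2  # npq = 2.1
```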

Binomial Distribution Example: (Problem 60) A toll bridge charges $1.00 for passenger cars and $2.50 for other vehicles. Suppose that during daytime hours, 60% of all vehicles are passenger cars. If 25 vehicles cross the bridge during a particular daytime period, what is the resulting expected toll revenue? What is the variance?

Let X = the number of passenger cars and Y = the revenue. Then X ~ Bin(25, 0.6) and Y = 1.00X + 2.50(25 − X) = 62.5 − 1.50X.

E(Y) = E(62.5 − 1.5X) = 62.5 − 1.5 · E(X) = 62.5 − 1.5 · (25 × 0.6) = 40

V(Y) = V(62.5 − 1.5X) = (−1.5)² · V(X) = 2.25 · (25 × 0.6 × 0.4) = 13.5
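Problem 60 combines the binomial mean and variance with the linear-transform rules; a closing sketch:

```python
n, p = 25, 0.6

EX = n * p            # E(X) = 15
VX = n * p * (1 - p)  # V(X) = 6

# Revenue: Y = 62.5 - 1.5 X.
EY = 62.5 - 1.5 * EX          # expected revenue, 40.0
VY = (-1.5) ** 2 * VX         # variance of revenue, 13.5
```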