Probability Distributions for Discrete RV
Probability Distributions for Discrete RV

Definition
The probability distribution or probability mass function (pmf) of a discrete rv X is defined for every number x by

    p(x) = P(X = x) = P(all s ∈ S : X(s) = x).

In words, for every possible value x of the random variable, the pmf specifies the probability of observing that value when the experiment is performed. (The conditions p(x) ≥ 0 and Σ over all possible x of p(x) = 1 are required for any pmf.)
Probability Distributions for Discrete RV

Definition
The cumulative distribution function (cdf) F(x) of a discrete rv X with pmf p(x) is defined for every number x by

    F(x) = P(X ≤ x) = Σ_{y : y ≤ x} p(y)

For any number x, F(x) is the probability that the observed value of X will be at most x. Compare:

    F(x) = P(X ≤ x) = P(X is less than or equal to x)
    p(x) = P(X = x) = P(X is exactly equal to x)
Probability Distributions for Discrete RV

pmf ⇒ cdf:

    F(x) = P(X ≤ x) = Σ_{y : y ≤ x} p(y)

It is also possible to go cdf ⇒ pmf:

    p(x) = F(x) − F(x−)

where x− represents the largest possible X value that is strictly less than x.
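As a quick sanity check, the two conversions can be sketched in Python (the pmf below is a made-up example, not one from the text):

```python
# Sketch: converting between the pmf and cdf of a discrete rv.
# The pmf here is a hypothetical illustration, not from the text.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def cdf(x, pmf):
    """F(x) = P(X <= x) = sum of p(y) over all possible y <= x."""
    return sum(p for y, p in pmf.items() if y <= x)

def pmf_from_cdf(x, pmf):
    """p(x) = F(x) - F(x-), where x- is the largest possible value < x."""
    smaller = [y for y in pmf if y < x]
    f_minus = cdf(max(smaller), pmf) if smaller else 0.0
    return cdf(x, pmf) - f_minus

print(cdf(1, pmf))           # F(1) = p(0) + p(1) = 0.7
print(pmf_from_cdf(1, pmf))  # recovers p(1) = 0.5
```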
Probability Distributions for Discrete RV

Proposition
For any two numbers a and b with a ≤ b,

    P(a ≤ X ≤ b) = F(b) − F(a−)

where a− represents the largest possible X value that is strictly less than a. In particular, if the only possible values are integers and if a and b are integers, then

    P(a ≤ X ≤ b) = P(X = a or a + 1 or ... or b) = F(b) − F(a − 1)

Taking a = b yields P(X = a) = F(a) − F(a − 1) in this case.
Expectations
Expectations

Definition
Let X be a discrete rv with set of possible values D and pmf p(x). The expected value or mean value of X, denoted by E(X) or μ_X, is

    E(X) = μ_X = Σ_{x ∈ D} x · p(x)

e.g. (Problem 30) A group of individuals who have automobile insurance from a certain company is randomly selected. Let Y be the number of moving violations for which the individual was cited during the last 3 years. The pmf of Y is

    y     0     1     2     3
    p(y)  0.60  0.25  0.10  0.05

Then the expected value of moving violations for that group is

    μ_Y = E(Y) = 0(0.60) + 1(0.25) + 2(0.10) + 3(0.05) = 0.60
Expectations

    y     0     1     2     3
    p(y)  0.60  0.25  0.10  0.05

Assume the total number of individuals in that group is 100. Then there are 60 individuals without a moving violation, 25 with 1 moving violation, 10 with 2 moving violations and 5 with 3 moving violations. The population mean is calculated as

    μ = (0·60 + 1·25 + 2·10 + 3·5) / 100 = 0.60

Equivalently,

    μ = 0·(60/100) + 1·(25/100) + 2·(10/100) + 3·(5/100)
      = 0(0.60) + 1(0.25) + 2(0.10) + 3(0.05) = 0.60

The population size is irrelevant if we know the pmf!
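A minimal Python sketch of the two equivalent computations, the pmf route and the population-count route:

```python
# Problem 30 pmf and the equivalent population of 100 individuals.
pmf = {0: 0.60, 1: 0.25, 2: 0.10, 3: 0.05}
counts = {0: 60, 1: 25, 2: 10, 3: 5}
n = sum(counts.values())

mu_pmf = sum(y * p for y, p in pmf.items())         # E(Y) from the pmf
mu_pop = sum(y * c for y, c in counts.items()) / n  # population mean

print(mu_pmf, mu_pop)  # both equal 0.60 up to float round-off
```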
Expectations

Examples: Let X be a Bernoulli rv with pmf

    p(x) = 1 − p   if x = 0
           p       if x = 1
           0       if x ≠ 0 or 1

Then the expected value of X is

    E(X) = 0 · p(0) + 1 · p(1) = p

We see that the expected value of a Bernoulli rv X is just the probability that X takes on the value 1.
Expectations

Examples: Consider the card-drawing example again and assume we have infinitely many cards this time. Let X = the number of drawings until we get a ♠. If the probability of getting a ♠ is α, then the pmf of X is

    p(x) = α(1 − α)^(x−1)   for x = 1, 2, 3, ...
           0                otherwise

The expected value of X is

    E(X) = Σ_{x ∈ D} x · p(x) = Σ_{x=1}^∞ x α(1 − α)^(x−1) = α Σ_{x=1}^∞ [− d/dα (1 − α)^x]

    E(X) = −α · d/dα [Σ_{x=1}^∞ (1 − α)^x] = −α · d/dα [(1 − α)/α] = 1/α
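The closed form E(X) = 1/α is easy to check numerically by summing the series until the geometric tail is negligible; a sketch (α = 0.25 here purely as an illustration):

```python
# Numerical check of E(X) = 1/alpha for p(x) = alpha * (1-alpha)**(x-1).
alpha = 0.25  # probability of success on each draw (illustrative value)

# Sum x * p(x) far enough out that the remaining tail is negligible.
expected = sum(x * alpha * (1 - alpha) ** (x - 1) for x in range(1, 500))

print(expected)  # close to 1/alpha = 4
```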
Expectations

Example 3.20 Let X be the number of interviews a student has prior to getting a job. The pmf of X is

    p(x) = k/x²   for x = 1, 2, 3, ...
           0      otherwise

where k is chosen so that Σ_{x=1}^∞ (k/x²) = 1. (It can be shown that Σ_{x=1}^∞ (1/x²) < ∞, which implies that such a k exists.) The expected value of X is

    μ = E(X) = Σ_{x=1}^∞ x · (k/x²) = k Σ_{x=1}^∞ (1/x) = ∞ !

The expected value is NOT finite! This is a heavy tail: a distribution with a large amount of probability far from μ.
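One way to see the divergence concretely: the partial sums of x·p(x) = k/x keep growing like the harmonic series as more terms are added. A sketch, with k approximated numerically:

```python
# Approximate the normalizing constant k (exactly, k = 6/pi**2).
k = 1 / sum(1 / x**2 for x in range(1, 10**6))

def partial_mean(n):
    """Partial sum of x * p(x) = k/x for x = 1..n; grows like k*log(n)."""
    return sum(k / x for x in range(1, n + 1))

print(partial_mean(100), partial_mean(10_000))  # keeps growing, no limit
```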
Expectations

Example (Problem 38) Let X = the outcome when a fair die is rolled once. If before the die is rolled you are offered either 1/3.5 dollars or 1/X dollars, would you accept the guaranteed amount or would you gamble?

    x     1    2    3    4    5    6
    p(x)  1/6  1/6  1/6  1/6  1/6  1/6
    1/x   1    1/2  1/3  1/4  1/5  1/6

Then the expected dollars from gambling is

    E(1/X) = Σ_{x=1}^6 (1/x) · p(x) = 1 · (1/6) + (1/2) · (1/6) + ... + (1/6) · (1/6) = 49/120 > 1/3.5
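The comparison can be done exactly with Python's Fraction type; a sketch:

```python
from fractions import Fraction

# E(1/X) for one roll of a fair die, computed exactly.
e_recip = sum(Fraction(1, x) * Fraction(1, 6) for x in range(1, 7))

print(e_recip)                   # 49/120
print(e_recip > Fraction(2, 7))  # 1/3.5 = 2/7; True, so gambling pays more
```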
Expectations

Proposition
If the rv X has a set of possible values D and pmf p(x), then the expected value of any function h(X), denoted by E[h(X)] or μ_h(X), is computed by

    E[h(X)] = Σ_{x ∈ D} h(x) · p(x)
Expectations

Example 3.23 A computer store has purchased three computers of a certain type at $500 apiece. It will sell them for $1000 apiece. The manufacturer has agreed to repurchase any computers still unsold after a specified period at $200 apiece. Let X denote the number of computers sold, and suppose that p(0) = 0.1, p(1) = 0.2, p(2) = 0.3, p(3) = 0.4. Let h(X) denote the profit associated with selling X units; then

    h(X) = revenue − cost = 1000X + 200(3 − X) − 1500 = 800X − 900.

The expected profit is

    E[h(X)] = h(0) · p(0) + h(1) · p(1) + h(2) · p(2) + h(3) · p(3)
            = (−900)(0.1) + (−100)(0.2) + (700)(0.3) + (1500)(0.4)
            = 700
Expectations

Proposition
E(aX + b) = a · E(X) + b
(Or, using alternative notation, μ_{aX+b} = a · μ_X + b.)

e.g. for the previous example,

    E[h(X)] = E(800X − 900) = 800 · E(X) − 900 = 700

Corollary
1. For any constant a, E(aX) = a · E(X).
2. For any constant b, E(X + b) = E(X) + b.
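Both routes to the expected profit of Example 3.23, the direct sum of h(x)·p(x) and the linearity shortcut, can be checked in a few lines:

```python
# Example 3.23: expected profit two ways.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

def h(x):
    return 800 * x - 900  # profit when x of the 3 computers sell

direct = sum(h(x) * p for x, p in pmf.items())  # E[h(X)] by definition
mean_x = sum(x * p for x, p in pmf.items())     # E(X) = 2
shortcut = 800 * mean_x - 900                   # linearity: a*E(X) + b

print(direct, shortcut)  # both 700 (up to float round-off)
```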
Expectations

Definition
Let X have pmf p(x) and expected value μ. Then the variance of X, denoted by V(X) or σ²_X, or just σ², is

    V(X) = Σ_{x ∈ D} (x − μ)² · p(x) = E[(X − μ)²]

The standard deviation (SD) of X is

    σ_X = √(σ²_X)
Expectations

Example: For the previous example, the pmf is given as

    x     0    1    2    3
    p(x)  0.1  0.2  0.3  0.4

Here μ = E(X) = 2, so the variance of X is

    V(X) = σ² = Σ_{x=0}^3 (x − 2)² · p(x)
         = (0 − 2)²(0.1) + (1 − 2)²(0.2) + (2 − 2)²(0.3) + (3 − 2)²(0.4)
         = 1
Expectations

Recall that for the sample variance s², we have

    s² = S_xx / (n − 1) = [Σ x_i² − (Σ x_i)²/n] / (n − 1)

Proposition

    V(X) = σ² = [Σ_{x ∈ D} x² · p(x)] − μ² = E(X²) − [E(X)]²

e.g. for the previous example, the pmf is given as

    x     0    1    2    3
    p(x)  0.1  0.2  0.3  0.4

Then

    V(X) = E(X²) − [E(X)]² = 0²(0.1) + 1²(0.2) + 2²(0.3) + 3²(0.4) − (2)² = 1
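A quick check that the definition and the shortcut formula agree on this pmf:

```python
# V(X) two ways for the pmf of Example 3.23.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

mu = sum(x * p for x, p in pmf.items())                    # E(X) = 2
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())   # E[(X - mu)^2]
var_short = sum(x**2 * p for x, p in pmf.items()) - mu**2  # E(X^2) - mu^2

print(mu, var_def, var_short)  # 2, 1, 1 up to float round-off
```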
Expectations

Proposition
If h(X) is a function of a rv X, then

    V[h(X)] = σ²_h(X) = Σ_{x ∈ D} {h(x) − E[h(X)]}² · p(x) = E[h(X)²] − {E[h(X)]}²

If h(X) is linear, i.e. h(X) = aX + b for some nonrandom constants a and b, then

    V(aX + b) = σ²_{aX+b} = a² · σ²_X   and   σ_{aX+b} = |a| · σ_X

In particular, σ_aX = |a| · σ_X and σ_{X+b} = σ_X.
Expectations

Example 3.23 continued A computer store has purchased three computers of a certain type at $500 apiece. It will sell them for $1000 apiece. The manufacturer has agreed to repurchase any computers still unsold after a specified period at $200 apiece. Let X denote the number of computers sold, and suppose that p(0) = 0.1, p(1) = 0.2, p(2) = 0.3, p(3) = 0.4. Let h(X) denote the profit associated with selling X units; then

    h(X) = revenue − cost = 1000X + 200(3 − X) − 1500 = 800X − 900.

The variance of h(X) is

    V[h(X)] = V(800X − 900) = 800² · V(X) = 640,000

And the SD is σ_h(X) = √V[h(X)] = 800.
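The rescaling rule can be verified directly against the definition of variance:

```python
import math

# V(800X - 900) = 800^2 * V(X) for the pmf of Example 3.23.
pmf = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

mu = sum(x * p for x, p in pmf.items())
var_x = sum((x - mu) ** 2 * p for x, p in pmf.items())  # V(X) = 1

# Variance of h(X) = 800x - 900, computed from scratch.
mu_h = sum((800 * x - 900) * p for x, p in pmf.items())
var_h = sum((800 * x - 900 - mu_h) ** 2 * p for x, p in pmf.items())

print(var_h, 800**2 * var_x)  # both 640000 (up to round-off)
print(math.sqrt(var_h))       # SD = 800
```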
Binomial Distribution
Binomial Distribution

1. The experiment consists of a sequence of n smaller experiments called trials, where n is fixed in advance of the experiment;

2. Each trial can result in one of the same two possible outcomes (dichotomous trials), which we denote by success (S) and failure (F);

3. The trials are independent, so that the outcome on any particular trial does not influence the outcome on any other trial;

4. The probability of success is constant from trial to trial; we denote this probability by p.

Definition
An experiment for which Conditions 1–4 are satisfied is called a binomial experiment.
Binomial Distribution

Examples:
1. If we toss a coin 10 times, then this is a binomial experiment with n = 10, S = Head, and F = Tail.

2. If we draw a card from a deck of well-shuffled cards with replacement, do this 5 times and record whether the outcome is ♠ or not, then this is also a binomial experiment. In this case, n = 5, S = ♠ and F = not ♠.

3. Again we draw a card from a deck of well-shuffled cards but without replacement, do this 5 times and record whether the outcome is ♠ or not. However, this time it is NO LONGER a binomial experiment:

    P(♠ on second | ♠ on first) = 12/51 ≈ 0.235 ≠ 0.25 = P(♠ on second)

We do not have independence here!
Binomial Distribution

Examples:
4. This time we draw a card from 100 decks of well-shuffled cards without replacement, do this 5 times and record whether the outcome is ♠ or not. Is it a binomial experiment?

    P(♠ on second draw | ♠ on first draw) = 1299/5199 ≈ 0.2499 ≈ 0.25
    P(♠ on sixth draw | ♠ on first five draws) = 1295/5195 ≈ 0.2492 ≈ 0.25
    P(♠ on tenth draw | no ♠ on first nine draws) = 1300/5191 ≈ 0.2504 ≈ 0.25
    ...

Although we still do not have independence, the conditional probabilities differ so slightly that we can regard these trials as independent with P(♠) = 0.25.
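These conditional probabilities are quick to compute exactly; a sketch with Fractions (100 decks = 5200 cards, of which 1300 are spades):

```python
from fractions import Fraction

spades, cards = 1300, 5200  # 100 decks

p2_given_1 = Fraction(spades - 1, cards - 1)  # spade on 2nd | spade on 1st
p6_given_5 = Fraction(spades - 5, cards - 5)  # spade on 6th | spades on 1st-5th
p10_given_none = Fraction(spades, cards - 9)  # spade on 10th | none in 1st-9th

for p in (p2_given_1, p6_given_5, p10_given_none):
    print(float(p))  # each within about 0.001 of 0.25
```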
Binomial Distribution

Rule
Consider sampling without replacement from a dichotomous population of size N. If the sample size (number of trials) n is at most 5% of the population size, the experiment can be analyzed as though it were exactly a binomial experiment.

e.g. for the previous example, the population size is N = 5200 and the sample size is n = 5. We have n/N ≈ 0.1%, so we can apply the above rule.
Binomial Distribution

Definition
The binomial random variable X associated with a binomial experiment consisting of n trials is defined as

    X = the number of S's among the n trials

Possible values for X in an n-trial experiment are x = 0, 1, 2, ..., n.

Notation
We use X ~ Bin(n, p) to indicate that X is a binomial rv based on n trials with success probability p. We use b(x; n, p) to denote the pmf of X, and B(x; n, p) to denote the cdf of X, where

    B(x; n, p) = P(X ≤ x) = Σ_{y=0}^x b(y; n, p)
Binomial Distribution

Example: Assume we toss a coin 3 times and the probability of getting a head on each toss is p. Let X be the binomial random variable associated with this experiment. We tabulate all the possible outcomes, corresponding X values and probabilities in the following table:

    Outcome  X  Probability     Outcome  X  Probability
    HHH      3  p³              TTT      0  (1 − p)³
    HHT      2  p²(1 − p)       TTH      1  (1 − p)²p
    HTH      2  p²(1 − p)       THT      1  (1 − p)²p
    HTT      1  p(1 − p)²       THH      2  (1 − p)p²

e.g. b(2; 3, p) = P(HHT) + P(HTH) + P(THH) = 3p²(1 − p).
Binomial Distribution

More generally, for the binomial pmf b(x; n, p), we have

    b(x; n, p) = {number of sequences of length n consisting of x S's} × {probability of any particular such sequence}

where

    {number of sequences of length n consisting of x S's} = (n choose x)

and

    {probability of any particular such sequence} = p^x (1 − p)^(n−x)

Theorem

    b(x; n, p) = (n choose x) p^x (1 − p)^(n−x)   for x = 0, 1, 2, ..., n
                 0                                otherwise
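The theorem translates directly into code; a sketch, checked against the coin-toss table's b(2; 3, p) = 3p²(1 − p):

```python
from math import comb

def binom_pmf(x, n, p):
    """b(x; n, p) = C(n, x) * p**x * (1-p)**(n-x) for x = 0..n, else 0."""
    if not 0 <= x <= n:
        return 0.0
    return comb(n, x) * p**x * (1 - p) ** (n - x)

p = 0.4  # an arbitrary head probability for illustration
print(binom_pmf(2, 3, p), 3 * p**2 * (1 - p))     # agree
print(sum(binom_pmf(x, 3, p) for x in range(4)))  # pmf sums to 1
```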
Binomial Distribution

Example: (Problem 55) Twenty percent of all telephones of a certain type are submitted for service while under warranty. Of these, 75% can be repaired, whereas the other 25% must be replaced with new units. If a company purchases ten of these telephones, what is the probability that exactly two will end up being replaced under warranty?

Let X = the number of telephones that need to be replaced. Then

    p = P(service and replace) = P(replace | service) · P(service) = 0.25 × 0.2 = 0.05

Now,

    P(X = 2) = b(2; 10, 0.05) = (10 choose 2) 0.05² (1 − 0.05)^(10−2) = 0.0746
Binomial Distribution

Binomial Tables
Table A.1 Cumulative Binomial Probabilities (Page 664) tabulates B(x; n, p) = Σ_{y=0}^x b(y; n, p):

    b. n = 10
                   p
    x     0.01   0.05   0.10  ...
    0     .904   .599   .349  ...
    1     .996   .914   .736  ...
    2    1.000   .988   .930  ...
    3    1.000   .999   .987  ...
    ...

Then for b(2; 10, 0.05), we have

    b(2; 10, 0.05) = B(2; 10, 0.05) − B(1; 10, 0.05) = .988 − .914 = .074
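The table lookup b(x) = B(x) − B(x − 1) can be mimicked by computing the cdf directly; a sketch:

```python
from math import comb

def B(x, n, p):
    """Cumulative binomial probability: sum of b(y; n, p) for y = 0..x."""
    return sum(comb(n, y) * p**y * (1 - p) ** (n - y) for y in range(x + 1))

b2 = B(2, 10, 0.05) - B(1, 10, 0.05)
print(round(b2, 4))  # 0.0746; the 3-decimal table gives .988 - .914 = .074
```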
Binomial Distribution

Mean and Variance

Theorem
If X ~ Bin(n, p), then E(X) = np, V(X) = np(1 − p) = npq, and σ_X = √(npq) (where q = 1 − p).

The idea is that X = Σ_{i=1}^n Y_i = Y_1 + Y_2 + ... + Y_n, where the Y_i's are independent Bernoulli random variables with success probability p, i.e.

    Y = 1, with probability p
        0, with probability 1 − p

Then E(Y) = p and V(Y) = (1 − p)² · p + (0 − p)² · (1 − p) = p(1 − p).

Therefore E(X) = np and V(X) = np(1 − p) = npq.
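The closed forms np and np(1 − p) can be checked against direct sums over the pmf (the n and p below are arbitrary illustrative values):

```python
from math import comb

n, p = 10, 0.3
pmf = [comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]

mean = sum(x * q for x, q in enumerate(pmf))               # should be np = 3
var = sum(x**2 * q for x, q in enumerate(pmf)) - mean**2   # np(1-p) = 2.1

print(mean, var)
```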
Binomial Distribution

Example: (Problem 60) A toll bridge charges $1.00 for passenger cars and $2.50 for other vehicles. Suppose that during daytime hours, 60% of all vehicles are passenger cars. If 25 vehicles cross the bridge during a particular daytime period, what is the resulting expected toll revenue? What is the variance?

Let X = the number of passenger cars and Y = revenue. Then

    Y = 1.00X + 2.50(25 − X) = 62.5 − 1.50X.

Here X ~ Bin(25, 0.6), so

    E(Y) = E(62.5 − 1.5X) = 62.5 − 1.5 E(X) = 62.5 − 1.5 (25 × 0.6) = 40

    V(Y) = V(62.5 − 1.5X) = (−1.5)² V(X) = 2.25 (25 × 0.6 × 0.4) = 13.5
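The whole calculation fits in a few lines:

```python
# Problem 60: revenue Y = 62.5 - 1.5X with X ~ Bin(25, 0.6).
n, p = 25, 0.6

e_x = n * p              # E(X) = np = 15
v_x = n * p * (1 - p)    # V(X) = np(1-p) = 6
e_y = 62.5 - 1.5 * e_x   # linearity: E(Y) = 40
v_y = (-1.5) ** 2 * v_x  # rescaling: V(Y) = 13.5

print(e_y, v_y)
```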