Bernoulli and Binomial Distributions
Bernoulli Distribution

A Bernoulli trial has exactly two possible outcomes, for example:
- a flipped coin turns up either heads or tails
- an item on an assembly line is either defective or not defective
- a piece of fruit is either damaged or not damaged
- a cow is either pregnant or not pregnant
- a child is either female or male

We can write the random variable as

    X = 1, if the outcome of the trial is a success
    X = 0, if the outcome of the trial is a failure

or equivalently

    X = 1 w.p. p
    X = 0 w.p. q = 1 - p

and the pmf as

    p(x) = p^x (1 - p)^(1 - x),  x = 0, 1

Arthur Berg, Bernoulli and Binomial Distributions
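The Bernoulli pmf and a single trial can be sketched in a few lines of Python (the function names here are my own, for illustration):

```python
import random

def bernoulli_pmf(x, p):
    """p(x) = p^x * (1 - p)^(1 - x) for x in {0, 1}."""
    if x not in (0, 1):
        raise ValueError("x must be 0 or 1")
    return p**x * (1 - p)**(1 - x)

def bernoulli_trial(p, rng=random):
    """Simulate one trial: 1 (success) w.p. p, 0 (failure) w.p. q = 1 - p."""
    return 1 if rng.random() < p else 0

p = 0.3
print(bernoulli_pmf(1, p))  # p(1) = p
print(bernoulli_pmf(0, p))  # p(0) = 1 - p = q
```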
Expectation and Variance of Bernoulli(p)

    E(X) = sum_{x=0}^{1} x p(x) = 0·p(0) + 1·p(1) = 0(1 - p) + 1(p) = p

Noting that when X ~ Bernoulli(p), X^2 ~ X, i.e. X^2 has the same distribution as X, we have E(X^2) = E(X) = p, so

    var(X) = E(X^2) - [E(X)]^2 = p - p^2 = p(1 - p) = pq
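The same mean and variance computation can be checked directly from the pmf; a minimal sketch (the function name is mine):

```python
def bernoulli_mean_var(p):
    """Mean and variance of Bernoulli(p) computed directly from the pmf."""
    q = 1 - p
    mean = 0 * q + 1 * p                  # E(X) = sum of x * p(x) = p
    second_moment = 0**2 * q + 1**2 * p   # E(X^2) = p, since X^2 ~ X
    return mean, second_moment - mean**2  # var(X) = E(X^2) - [E(X)]^2 = pq

mean, var = bernoulli_mean_var(0.25)
# mean = 0.25 = p, var = 0.25 * 0.75 = 0.1875 = pq
```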
Binomial Distribution

We are typically interested in n independent Bernoulli trials, each with probability p of success. Let Y_1, Y_2, ..., Y_n denote independent and identically distributed (iid) Bernoulli(p) random variables. The sum X = sum_{i=1}^{n} Y_i denotes the number of successes among the n sampled items. X is said to have the binomial distribution with n trials and success probability p, i.e. X ~ binomial(n, p). The pmf of the binomial(n, p) is

    p(x) = C(n, x) p^x (1 - p)^(n - x) = C(n, x) p^x q^(n - x),  x = 0, 1, ..., n

where C(n, x) = n! / (x! (n - x)!) is the binomial coefficient.
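Both views of the binomial — the closed-form pmf and the sum of n Bernoulli trials — can be sketched as follows (function names are mine):

```python
import random
from math import comb

def binomial_pmf(x, n, p):
    """p(x) = C(n, x) * p^x * (1 - p)^(n - x), for x = 0, 1, ..., n."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binomial_draw(n, p, rng=random):
    """X = Y_1 + ... + Y_n: the number of successes in n Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if rng.random() < p)

print(binomial_pmf(2, 4, 0.5))  # C(4, 2) * 0.5^4 = 6/16 = 0.375
```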
Checking the Probability Mass Function Sums to One

The pmf of any discrete random variable should sum to one. Recall the binomial theorem:

    (x + y)^n = sum_{i=0}^{n} C(n, i) x^i y^(n - i)

Therefore

    sum_{x=0}^{n} p(x) = sum_{x=0}^{n} C(n, x) p^x (1 - p)^(n - x) = [p + (1 - p)]^n = 1^n = 1
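This sum-to-one identity is easy to verify numerically for any particular n and p; a quick sketch:

```python
from math import comb, isclose

def binomial_pmf(x, n, p):
    """p(x) = C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# By the binomial theorem, summing the pmf over x = 0, ..., n
# gives [p + (1 - p)]^n = 1^n = 1.
n, p = 10, 0.3
total = sum(binomial_pmf(x, n, p) for x in range(n + 1))
assert isclose(total, 1.0)
```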
Mean and Variance of Binomial(n, p)

Recall X ~ binomial(n, p) can be written as X = sum_{i=1}^{n} Y_i where Y_1, ..., Y_n are iid Bernoulli(p). Therefore

    E(X) = E( sum_{i=1}^{n} Y_i ) = sum_{i=1}^{n} E(Y_i) = sum_{i=1}^{n} p = np

And because of the independence, we similarly calculate the variance to be

    var(X) = var( sum_{i=1}^{n} Y_i ) = sum_{i=1}^{n} var(Y_i) = sum_{i=1}^{n} pq = npq
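The formulas E(X) = np and var(X) = npq can also be recovered by summing directly over the pmf, which makes a nice consistency check; a sketch:

```python
from math import comb, isclose

def binomial_pmf(x, n, p):
    """p(x) = C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 12, 0.4
q = 1 - p
mean = sum(x * binomial_pmf(x, n, p) for x in range(n + 1))
second = sum(x**2 * binomial_pmf(x, n, p) for x in range(n + 1))
var = second - mean**2
assert isclose(mean, n * p)     # E(X) = np = 4.8
assert isclose(var, n * p * q)  # var(X) = npq = 2.88
```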
Problem From Last Time: Discrete r.v.'s with Mean 0 and Variance 1

Consider the discrete random variable

    X = a w.p. p
    X = b w.p. 1 - p

In class, we saw that with a = 1, b = -1, and p = 1/2 we have E(X) = 0 and var(X) = 1. Are there other choices of a, b, and p that will give E(X) = 0 and var(X) = 1?
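One way to explore the question numerically: for any fixed p in (0, 1), the candidate choice a = sqrt(q/p), b = -sqrt(p/q) (my own construction, not given on the slide) appears to yield mean 0 and variance 1, which the sketch below checks:

```python
from math import sqrt, isclose

def candidate(p):
    """Candidate a, b for X in {a, b} with P(X = a) = p, E(X) = 0, var(X) = 1."""
    q = 1 - p
    a, b = sqrt(q / p), -sqrt(p / q)
    mean = a * p + b * q                 # E(X) = sqrt(pq) - sqrt(pq) = 0
    var = a**2 * p + b**2 * q - mean**2  # E(X^2) - [E(X)]^2 = q + p - 0 = 1
    return a, b, mean, var

for p in (0.1, 0.5, 0.9):
    a, b, mean, var = candidate(p)
    assert isclose(mean, 0.0, abs_tol=1e-12) and isclose(var, 1.0)
# p = 1/2 recovers the in-class solution a = 1, b = -1.
```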
Exercise 4.45 (p. 147)
Exercise 4.55 (p. 149)