STAT Chapter 4/6: Random Variables and Probability Distributions


We use random variables (RVs) to represent the numerical features of a random experiment. In Chapter 3, we defined a random experiment as one where the set of possible outcomes (S) is known before the experiment is performed, but the actual outcome is not. A random variable is usually denoted by a capital letter, e.g. X, Y, Z, and its possible realized values by the same letter in lowercase, e.g. x, y, z. So X is a random quantity before the experiment is performed, and x is the value of that quantity after the experiment has been performed.

Examples:
X = # of defective items in a sample
Y = time until failure of a mechanical part
Z = % yield of a chemical process
X2 = # rolled on a die

Notation: In this course, we will write things like P(X = x) to mean the probability that the RV X takes on the value x.

The Range of a RV is the set of all possible values it can take on, e.g.
X ∈ {0, 1, ..., n}
Y ∈ [0, ∞)
Z ∈ [0, 100]
X2 ∈ {1, 2, 3, 4, 5, 6}

First we will talk about discrete RVs and then continuous RVs. A discrete RV has a finite or countable range, while a continuous RV is defined on a continuous range (which may be bounded).

Discrete Random Variables

A discrete RV is a RV that has a finite or countable range, e.g. # of defective items, # rolled with a die, # of sales for a store, # of heads on n coin tosses.

Example: Let X = the number of 1's rolled when rolling a die 4 times...

Probability Mass Function (f(x)): The probability mass function of a discrete RV, denoted by a lowercase f(x), is a function that gives the probability of occurrence for each possible realized value x of the RV X:

f(x) = P(X = x), for all possible values x of X. (For a discrete RV only!)

Properties of f(x):
1. 0 ≤ f(x) ≤ 1, for all x
2. Σ (over all x) f(x) = 1 (the probabilities sum to 100%)

Probability mass functions can be represented in a frequency table, a histogram, or as a mathematical function. Examples below...

Combinatorial and Factorial Notation: Suppose we have n distinct objects and we would like to choose k of them. If we can only choose each object once (no replacement) and choosing the objects (a, b) is the same as choosing (b, a) (the order of selection does not matter), then how many distinct sets of k objects can we select from the n of them? E.g. select k = 10 students from this class of n = 110, or a poker hand: the deck has n = 52 cards, we select k = 5 of them, we cannot get the same card twice, and drawing the same five cards in a different order gives the same hand.

How many ways can we choose k objects from n distinct objects?

C(n, k) = n! / ((n−k)! k!) = [n(n−1)···(n−k+1)] / k!, where k! = k(k−1)(k−2)···(2)(1)

E.g. poker hand: C(52, 5) = 52! / ((52−5)! 5!) = 2,598,960

Examples:
1. If we have {1, 2, 3, 4} and we choose 2 of them without replacement, how many different combinations can we get?
2. What is the probability of winning the jackpot in Lotto 6/49?

Note: We define 0! = 1. E.g. C(n, n) = n! / ((n−n)! n!) = n! / (0! n!) = 1, as it should be!

Some Important Discrete Random Variables: First, we will talk about the Bernoulli process, which gives rise to the binomial, the geometric and the negative binomial random variables. Later in the course (in Chapter 6), we will talk about the Poisson process, which gives rise to the Poisson random variable (as well as the exponential RV).

Bernoulli Process: A Bernoulli process is a random experiment with the following features:
1) The experiment consists of n independent steps called trials.
2) Each trial results in either a success or a failure, i.e. S = {S, F}
3) The probability of a success is constant across trials: P(S) = p

You should be able to recognize when an experiment is a Bernoulli process:
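The counting formula is easy to check numerically. The helper below is just an illustration mirroring the factorial definition; Python's built-in math.comb computes the same thing.

```python
from math import comb, factorial

def choose(n, k):
    # C(n, k) = n! / ((n-k)! k!): # of ways to pick k of n objects, order ignored
    return factorial(n) // (factorial(n - k) * factorial(k))

poker_hands = choose(52, 5)    # 5-card hands from a 52-card deck
lotto_combos = choose(49, 6)   # Lotto 6/49: pick 6 numbers out of 49
jackpot_prob = 1 / lotto_combos
```

This gives 2,598,960 poker hands, and a Lotto 6/49 jackpot probability of 1 in 13,983,816.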

There is an event that will/will not occur (success/failure).
Successes/failures are counted over a # of trials.
The probability of a success is the same for every trial.

The terms "success" and "failure" are used because the theory has its roots in gambling. It can sometimes be odd to think of a "success" as finding a defective item, but this is the terminology we use.

Examples:
(1) Testing randomly selected items for defects:
event: find a defective item
trial: inspection of each selected item
outcomes: defective (success) or not defective (failure)
(2) The number of heads on repeated coin tosses:
event: get an H
trial: each coin toss
outcomes: get an H (success) or a T (failure)

As we mentioned, the Bernoulli process gives rise to:
Binomial RV: the # of successes in a fixed # of trials
Geometric RV: the # of trials completed until the first success is observed

Examples:
(1) Testing randomly selected items for defects:
Binomial RV: the # of defects out of n = 50 sampled and inspected items
Geometric RV: the # of items inspected until the first defective item is found
(2) The number of heads on repeated coin tosses:
Binomial RV: the # of H out of n = 100 coin tosses
Geometric RV: the # of tosses until you get the first head

Binomial Random Variables:
Experiment: counting the # of occurrences of an event (success) over n independent trials (fixed number of trials).
Random Variable: X = # of successes

Range: {0, 1, 2, ..., n}
Parameters: n = # of trials, p = probability of success
Notation: If X is a binomial RV, we write X ~ BIN(n, p), which indicates that X follows a binomial distribution with n trials that each have probability of success p.
Prob. Mass Function: f(x) = P(X = x) = C(n, x) p^x (1−p)^(n−x), for x = 0, 1, ..., n

Example: If we toss a coin 10 times, what is the probability of getting exactly 2 heads?
Solution: We have n = 10, p = 0.5. Let X = the # of heads in 10 tosses. Then X ~ BIN(10, 0.5) and
f(2) = P(X = 2) = C(10, 2) (0.5)^2 (1−0.5)^(10−2) = 45/1024 ≈ 0.044,
or a 4.4% chance of tossing exactly 2 heads on 10 tosses.

What is the probability of getting two or more heads on 10 tosses? What is the probability of getting 40 or more heads on 100 tosses? ...In Chapter 7, we will see how to get an approximate answer for a question like this.

Let's think intuitively about why the probability mass function of a binomial RV makes sense, using the last example. Firstly, how many ways can we get 2H on 10 tosses? There are C(10, 2) = 45, since the 2H can occur on any 2 of the 10 tosses. I.e. a head can occur on the 1st or 2nd or ... or 10th toss, and we want to choose 2 of the 10 possible locations for the heads.

Recall: We are assuming that the probability of success is the same for each trial and that the trials are independent. Let's take one of the 45 sequences with 2H, say (H, H, T, T, T, T, T, T, T, T), or (S, S, F, F, F, F, F, F, F, F). Because of independence, P(S S F F F F F F F F)
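The binomial pmf translates directly into code. A small sketch working the coin example above:

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) p^x (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p) ** (n - x)

# exactly 2 heads in 10 tosses of a fair coin
p_exactly_2 = binom_pmf(2, 10, 0.5)
# "two or more heads" via the complement: 1 - f(0) - f(1)
p_two_or_more = 1 - binom_pmf(0, 10, 0.5) - binom_pmf(1, 10, 0.5)
```

Summing the pmf over x = 0, ..., 10 returns 1, as the properties of f(x) require.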

= P(S)P(S)P(F)P(F)P(F)P(F)P(F)P(F)P(F)P(F)
= p·p·(1−p)·(1−p)·(1−p)·(1−p)·(1−p)·(1−p)·(1−p)·(1−p)
= p^2 (1−p)^8

The above is the probability of getting 2H and then 8T. All sequences of 2H and 8T have the same probability, so the probability of getting exactly 2H in some order is...
(# of possible 2H sequences) × (probability of each such sequence) = C(10, 2) p^2 (1−p)^8

Example: Suppose that 5% of computer chips made by a company are defective. If you randomly select and inspect 8 chips, what is the probability that you find at least one defective one? What assumption have you made when solving this problem?

Example: Suppose you go to the casino and make 5 bets of $1 each on the colour of the number coming up in the casino game roulette. What is the probability that you leave the casino with more money than you came with?

Geometric Random Variables:
Experiment: counting the # of trials until the first success.
Random Variable: Y = # of trials until the first success
Range: {1, 2, ...} (infinite but countable)
Parameters: p = P(S) = probability of success
Notation: If Y is a geometric RV, we write Y ~ GEO(p), which indicates that Y follows a geometric distribution with probability of success p.
Prob. Mass Function: f(y) = P(Y = y) = p(1−p)^(y−1), for y = 1, 2, ...

Example: If we toss a coin, what is the probability that we have to toss it exactly 5 times to see the first H show up?
Solution: We have p = 0.5. Let Y = the # of tosses until the first H. Then Y ~ GEO(0.5) and
f(5) = P(Y = 5) = 0.5 (1−0.5)^(5−1) = 0.03125
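The geometric pmf is a one-liner; the sketch below reproduces the coin example and the upcoming chip example (p = 0.05, first defective on the 8th inspection):

```python
def geom_pmf(y, p):
    # P(Y = y) = p (1 - p)^(y - 1): failures on the first y-1 trials, then a success
    return p * (1 - p) ** (y - 1)

first_head_on_5th = geom_pmf(5, 0.5)       # coin example: 0.5 * 0.5^4
eighth_chip_defective = geom_pmf(8, 0.05)  # chip example: 0.05 * 0.95^7
```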

Let's think intuitively about why the prob. mass function of a geometric RV makes sense, using the last example. To have the first success on the 5th trial, we must have the first 4 trials be failures: (F, F, F, F, S).

Recall: We assume a constant probability of success and independence of trials. So,
P(first success on 5th trial) = P(F F F F S)
= (by independence) = P(F)P(F)P(F)P(F)P(S)
= (1−p)(1−p)(1−p)(1−p)p
= p(1−p)^4

Examples:
1. If 5% of the computer chips manufactured by a company are defective, what is the probability that we have to check exactly 8 chips before we find one defective one?
2. You go to the casino and play roulette, betting on the colour of the number. What is the probability that the first bet you win is the fourth bet you make?

Sum of Independent Binomial Random Variables:
If X1, X2, ..., Xk are k independent binomial RVs, where Xi ~ BIN(ni, p) (same p but possibly different n), then
Y = X1 + X2 + ... + Xk ~ BIN(n1 + n2 + ... + nk, p)
E.g. if X1 ~ BIN(10, 0.25) and X2 ~ BIN(25, 0.25), and X1 and X2 are independent, then Y = X1 + X2 ~ BIN(35, 0.25).
I.e. a BIN(n, p) is the sum of n independent BIN(1, p) trials (a binomial RV is the sum of Bernoulli RVs).

Sum of Independent Geometric Random Variables:
Summing more than one geometric RV results in a new type of random variable: the negative binomial RV!
If X1, X2, ..., Xk are k independent geometric RVs, where Xi ~ GEO(p) (same p),

then Y = X1 + X2 + ... + Xk ~ NEGBIN(k, p)
Y = # of trials until the k-th success

Negative Binomial Prob. Mass Function: f(y) = P(Y = y) = C(y−1, k−1) p^k (1−p)^(y−k),
where y = the # of trials and k = the # of successes.

Note: We now have the binomial, geometric and negative binomial random variables (for discrete RVs):
binomial: # of successes in a fixed # of trials
geometric: # of trials until the 1st success
negative binomial: # of trials until the k-th success

Example: We will toss a coin until we get 2 heads. Suppose that the coin is biased and flips heads 40% of the time. What is the probability that the coin will be tossed exactly 3 times? What is the probability that it will be tossed at least 4 times?
Solution: Let Y = # of tosses until you get 2 heads, so Y ~ NEGBIN(2, 0.4).
For the first question, we have y = 3 and k = 2:
P(Y = 3) = f(3) = C(2, 1) (0.4)^2 (1−0.4)^(3−2) = 0.192
For the second question:
P(Y ≥ 4) = 1 − P(Y ≤ 3) = 1 − P(Y = 3) − P(Y = 2) − P(Y = 1)
= 1 − C(2, 1)(0.4)^2(1−0.4)^1 − C(1, 1)(0.4)^2(1−0.4)^0 − 0
= 1 − 0.192 − 0.16 = 0.648
(Note that P(Y = 1) = 0, since we cannot get 2 heads in 1 toss.)

So far, we have introduced 3 different types of discrete RVs. In Chapter 6, we will introduce one more (the Poisson RV). We will now introduce what we call a cumulative distribution function (CDF, F(x)) and then begin talking about the mean and standard deviation for these discrete RVs. Then we will do the same things for a few continuous RVs.
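The biased-coin example can be checked in a few lines. With k = 1 the negative binomial pmf reduces to the geometric pmf, which is a handy consistency check:

```python
from math import comb

def negbin_pmf(y, k, p):
    # P(Y = y) = C(y-1, k-1) p^k (1-p)^(y-k): the k-th success lands on trial y
    return comb(y - 1, k - 1) * p**k * (1 - p) ** (y - k)

# biased coin, heads 40% of the time, toss until 2 heads
p_exactly_3 = negbin_pmf(3, 2, 0.4)
p_at_least_4 = 1 - negbin_pmf(2, 2, 0.4) - p_exactly_3  # P(Y = 1) = 0 here
```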

Cumulative Distribution Functions (CDF):
The cumulative distribution function F(x) is defined as
F(x) = P(X ≤ x) = Σ (over k ≤ x) f(k)

Let's consider the example of rolling a die 4 times, with X = the # of 1's rolled, so X ~ BIN(4, 1/6):

x    f(x) = P(X = x)    F(x) = P(X ≤ x)
0    0.4823             0.4823
1    0.3858             0.8681
2    0.1157             0.9838
3    0.0154             0.9992
4    0.0008             1.0000

Properties of f(x) and F(x): In principle, a discrete RV can take its value on any countable or finite set of real #'s. In this course, we will only deal with integer-valued discrete RVs.
(1) 0 ≤ f(x) ≤ 1, and 0 ≤ F(x) ≤ 1
(2) F(x) is non-decreasing and right-continuous
(3) Σ (over all x) f(x) = 1, F(−∞) = 0 and F(∞) = 1
(4) f(x) = F(x) − F(x−1), for all integer-valued x
( = P(X ≤ x) − P(X ≤ x−1) = P(x−1 < X ≤ x) )
(5) P(a < X ≤ b) = F(b) − F(a), for all integers a and b
(6) P(X < a) = P(X ≤ a−1) = F(a−1), for all integers a
Note: In the discrete case, P(X < x) ≠ P(X ≤ x).

The Geometric Distribution Function (F(y)):
Example: Derive the cumulative distribution function F(y) of a geometric RV.
Fact: For a geometric series, 1 + r + r^2 + ... + r^m = (1 − r^(m+1)) / (1 − r)
Recall: For Y ~ GEO(p), f(y) = P(Y = y) = p(1−p)^(y−1). Therefore,
F(y) = Σ (k = 1 to y) p(1−p)^(k−1)
= p(1−p)^0 + p(1−p)^1 + ... + p(1−p)^(y−1)
= p( 1 + (1−p)^1 + (1−p)^2 + ... + (1−p)^(y−1) )
which is a geometric series with r = (1−p) and m = (y−1), so...
= p [1 − (1−p)^((y−1)+1)] / [1 − (1−p)]
= p [1 − (1−p)^y] / p
= 1 − (1−p)^y
Therefore, if Y ~ GEO(p), then F(y) = P(Y ≤ y) = 1 − (1−p)^y

The Binomial Distribution Function (F(x)):
There is no closed-form expression for F(x) when X ~ BIN(n, p). Instead, one can
(a) use the binomial table in the course notes... we won't do this!
(b) calculate F(x) = P(X ≤ x) = f(0) + f(1) + ... + f(x)
(c) in Chapter 7, we will learn some approximation methods we can use when the number of trials (n) is large.

Examples: Let X ~ BIN(18, 0.4). Find...
(a) P(5 < X ≤ 10)
(b) P(5 ≤ X < 10)
(c) P(X > 4), and P(X ≤ 4)

Expected Values: In this section, we will talk about how to find the expected value of a random variable. By "expected", we mean the most representative value: the value that would be expected in the long run. Expected values make sense in the long run, but often not in the short run... let's discuss this...

Let X be a discrete RV with prob. mass function f(x) and range R. Let g(x) be a function of x, e.g. g(x) = x, or g(x) = x^2, or g(x) = e^x.
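The closed form just derived is easy to verify numerically against the brute-force sum of the pmf:

```python
def geom_cdf(y, p):
    # closed form derived above: F(y) = 1 - (1 - p)^y
    return 1 - (1 - p) ** y

def geom_cdf_by_sum(y, p):
    # brute-force version: add the pmf terms p(1-p)^(k-1) for k = 1, ..., y
    return sum(p * (1 - p) ** (k - 1) for k in range(1, y + 1))
```

Both versions agree for every y (up to floating-point error), which is a quick numerical check on the derivation.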

Then the expected value of g(X) is...
E(g(X)) = Σ (x ∈ R) g(x) f(x)

Note: Often we are interested in the expected value of g(x) = x itself, although we leave it general, as there are instances where we want to find the expected value of things such as g(x) = x^2 and so on...

Note: If g(x) = x, then the expected value E(X) is the long-run average value of the random variable X. E.g. if X = # of defects and we use g(x) = x, then E(g(X)) = E(X) = the expected # of defects.

Examples:
1. Consider a discrete RV X with a given prob. mass function table of x and f(x) values. (a) Find the expected value of X. (b) Find the expected value of X^2.
2. Consider rolling a standard die one time. Let X be the number rolled. Find E(X) and E(X^2).
3. Suppose you will make a $1 bet on the colour of the number in roulette. What is the expected gain/loss for the bet?

Note: In general, E(X^2) ≠ E(X)^2 ... check this for the previous example! I.e. the expected value of the square ≠ the square of the expected value!

Mean of X = µx = E(X) = Σ (x ∈ R) x f(x) = Σ (all x ∈ R) x P(X = x)

Note: The expected value/mean of X is simply a weighted average of the possible values X may take on, weighted by how often each value will be observed.
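Example 2 (the single die roll) worked numerically, using exact fractions so that E(X) = 7/2 and E(X²) = 91/6 come out exactly:

```python
from fractions import Fraction

# fair die: f(x) = 1/6 for x = 1, ..., 6
f = {x: Fraction(1, 6) for x in range(1, 7)}

EX = sum(x * f[x] for x in f)        # E(X)   = 7/2  = 3.5
EX2 = sum(x * x * f[x] for x in f)   # E(X^2) = 91/6 = 15.17 (approx.)
```

Note that E(X²) = 91/6 ≈ 15.17 while E(X)² = 12.25, illustrating that E(X²) ≠ E(X)² in general.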

We use the mean as our measure of center.

Variance of X = σx² = E( (X − µx)² ) = Σ (x ∈ R) (x − µx)² f(x)

Here we are using g(x) = (x − µx)²: we are finding the expected squared deviation from the mean, which is our measure of spread. YES, this is the same as the variance we saw in Chapters 1/2... the only difference is that there it was for a SAMPLE of data, while here it is the POPULATION/true variance.

The above definition of the variance is in a form that helps us understand what it measures. Below is an equivalent form that is more useful when actually calculating a variance:
σx² = E(X²) − E(X)² = E(X²) − µx²

Standard Deviation of X = σx = √(σx²) = the square root of the variance (we work with this one most often)

Properties of the Mean, Variance and Standard Deviation:
If you recall, in our review of Chapters 1/2 we discussed a few rules for what happens to the mean/variance/SD when you add or multiply a variable by a constant. These rules are exactly the same, only stated in notational form here.
Let X be a random variable with mean µx and variance σx², and let a and b be any real-valued constants.
(1) If Y = aX + b, then
µy = E(Y) = E(aX + b) = E(aX) + E(b) = aE(X) + b = aµx + b
σy² = VAR(Y) = VAR(aX + b) = VAR(aX) + VAR(b) = a² VAR(X) = a² σx²
σy = |a| σx
Recall: Adding a constant changes the mean by that constant, but the SD remains the same. Multiplying by a constant multiplies the mean by that constant and the SD by its absolute value.

Let X and Y be random variables with means µx, µy and variances σx², σy².
(2) If we add/subtract the two random variables X and Y, then...
Mean of X + Y = µ(x+y) = E(X + Y) = E(X) + E(Y) = µx + µy
Mean of X − Y = µ(x−y) = E(X − Y) = E(X) − E(Y) = µx − µy
In general, E(aX + bY) = aµx + bµy
(3) If X and Y are independent, then...
VAR(aX + bY) = a²σx² + b²σy²
VAR(aX − bY) = a²σx² + b²σy²
When X and Y are independent, the variance of the sum/difference is equal to the sum of the variances, i.e. VAR(X ± Y) = VAR(X) + VAR(Y)
(4) If X and Y are independent, then E(XY) = E(X)E(Y)

Examples:
1. You own 4 machines and you want to have them inspected. It costs $50 for all 4 inspections, and a defective machine is repaired at a cost of $25 each. The probability of finding 0, 1, 2, 3, 4 defective machines is summarized in a table (given in class) with columns x, f(x), x f(x) and x² f(x), along with their sums.
(a) What is the expected # of defective machines?
(b) What is the expected total cost?
(c) What is the standard deviation of the total cost?
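A sketch of the machines example using the rule E(aX + b) = aE(X) + b. The pmf values below are placeholders I made up for illustration (the actual table is given in class), so take the method from this, not the numbers:

```python
# hypothetical pmf for the # of defective machines (the real table is given in class)
f = {0: 0.5, 1: 0.3, 2: 0.1, 3: 0.07, 4: 0.03}
assert abs(sum(f.values()) - 1) < 1e-12  # a valid pmf must sum to 1

EX = sum(x * p for x, p in f.items())        # expected # of defectives
EX2 = sum(x * x * p for x, p in f.items())
var_x = EX2 - EX**2                          # computational form of the variance

# total cost = 50 + 25 X, so apply the aX + b rules with a = 25, b = 50
mean_cost = 50 + 25 * EX
sd_cost = 25 * var_x**0.5                    # the +50 shifts the mean, not the SD
```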

2. Suppose that in the month of July in Vancouver, the temperature has µ = 25 °C with σ² = 4. What are the mean and SD of temperatures in degrees Fahrenheit? Hint: Fahrenheit = Celsius × 9/5 + 32

Now that we know how to find means and variances, let's go through and find the means and variances for some distributions we already know.

Mean and Variance of Binomial Random Variables:
If X ~ BIN(n, p), then...
Mean of X = E(X) = µx = np
Variance of X = VAR(X) = σx² = np(1−p)
Proof: Let X = X1 + X2 + ... + Xn, where the Xi are iid BIN(1, p); then X ~ BIN(n, p).
E(Xi) = Σ x f(x) = Σ x P(X = x) = 1·p + 0·(1−p) = p
E(Xi²) = Σ x² f(x) = Σ x² P(X = x) = 1²·p + 0²·(1−p) = p
Therefore, σxi² = VAR(Xi) = E(Xi²) − E(Xi)² = p − p² = p(1−p)
So,
E(X) = E(X1 + X2 + ... + Xn) = E(X1) + E(X2) + ... + E(Xn) = p + p + ... + p = np
σx² = VAR(X) = VAR(X1 + X2 + ... + Xn) = (because of independence) = VAR(X1) + VAR(X2) + ... + VAR(Xn) = p(1−p) + p(1−p) + ... + p(1−p) = np(1−p)
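The np and np(1−p) formulas applied to the two examples that follow. For the roulette bets, p = 18/38 is my assumption of an American wheel (18 of the 38 slots show your colour); a European wheel would use 18/37.

```python
def binom_mean_var(n, p):
    # X ~ BIN(n, p): E(X) = np, VAR(X) = np(1 - p)
    return n * p, n * p * (1 - p)

mean_heads, var_heads = binom_mean_var(75, 0.5)     # 75 coin tosses
mean_wins, var_wins = binom_mean_var(80, 18 / 38)   # 80 colour bets in roulette
```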

Example: You toss a coin 75 times. If we let X be the # of heads in the 75 tosses, what is the expected # of heads and what is its variance?

Example: You make 80 bets on the colour of the number in roulette. What are the mean and variance of the number of bets won?

Mean and Variance of Geometric Random Variables:
If Y ~ GEO(p), then...
Mean of Y = E(Y) = µy = 1/p
Variance of Y = VAR(Y) = σy² = (1−p)/p²
** See course notes for the proof!

Example: A telemarketer makes successive independent phone calls and gets a sale with probability 5% on each call. Each phone call costs 25 cents. Find...
(a) What is the expected cost of making one sale?
(b) What is the corresponding standard deviation?

Mean and Variance of Negative Binomial RVs:
If Y ~ NEGBIN(k, p), then... (recall: k = # of successes, p = prob. of success)
Mean of Y = E(Y) = µy = k/p
Variance of Y = VAR(Y) = σy² = k(1−p)/p²
Proof: Recall that if Y = Y1 + Y2 + ... + Yk where the Yi are iid GEO(p), then Y ~ NEGBIN(k, p).
E(Y) = E(Y1 + Y2 + ... + Yk) = E(Y1) + E(Y2) + ... + E(Yk) =
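A sketch of the telemarketer example: the # of calls to the first sale is GEO(0.05), and the cost is 25 cents per call, so the aX rules from earlier scale both the mean and the SD.

```python
p = 0.05               # chance of a sale on each call
cost_per_call = 0.25   # dollars

mean_calls = 1 / p                       # E(Y) = 1/p = 20 calls per sale
var_calls = (1 - p) / p**2               # VAR(Y) = (1-p)/p^2 = 380
mean_cost = cost_per_call * mean_calls   # $5.00 per sale, on average
sd_cost = cost_per_call * var_calls**0.5
```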

= 1/p + 1/p + ... + 1/p = k/p
VAR(Y) = VAR(Y1 + Y2 + ... + Yk) = (by independence) = VAR(Y1) + VAR(Y2) + ... + VAR(Yk)
= (1−p)/p² + (1−p)/p² + ... + (1−p)/p² = k(1−p)/p²

Example continued: (c) What is the expected cost of making 3 sales? (d) What is the SD of this?

Example: A hockey team needs to sign two free agents before the season starts. Suppose that each player they speak with will join the team with 20% probability, and assume that the players' decisions to join are independent of each other.
(a) What is the expected # of players they talk to before signing 2 players?
(b) What is the SD of this?
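The k/p and k(1−p)/p² formulas applied to the hockey example (k = 2 signings, p = 0.2 per player):

```python
def negbin_mean_var(k, p):
    # Y ~ NEGBIN(k, p): E(Y) = k/p, VAR(Y) = k(1-p)/p^2
    return k / p, k * (1 - p) / p**2

mean_players, var_players = negbin_mean_var(2, 0.2)  # talks until 2 players sign
sd_players = var_players**0.5                        # sqrt(40), about 6.3 players
```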

Examples:
1. Suppose that it is known that the probability of being able to log on to a computer from a remote terminal is 0.7 at any given time.
(a) What is the probability that out of 10 attempts, you are able to log on exactly 6 times?
(b) What is the probability that out of 8 attempts, you are able to log on at least 6 times?
(c) How many times do you expect to have to try before successfully logging on?
(d) What is the probability that you must attempt to log on at least 3 times before successfully logging on? *Note that we can solve this by defining a BIN or GEO RV
(e) What is the probability that it takes you more than 4 attempts to successfully log on twice? *Note that we can solve this by defining a BIN or NEGBIN RV
2. Let's play a game. You have to pay me $60 to play. You then roll 2 dice and look at the sum of those 2 dice. I will give you back the square of the sum of the 2 dice.
(a) What is the expected gain/loss for playing this game?
(b) What is the variance of the gain/loss when playing this game?
(c) What is the probability that you make money on a single play of the game?
3. The probability that a wildcat well will be productive is 1/13, regardless of the location being drilled. We will assume that the locations being drilled are far enough from each other that productivity is independent across the wells drilled.
(a) How many wells do they expect to have to drill before finding 1 that is productive?
(b) How many wells do they expect to have to drill before finding 3 that are productive?
(c) What is the probability that the first productive well they find is the 13th well they drill?
(d) If they drill 20 wells, what is the probability that at least 2 are found to be productive?
(e) If it costs the company $1000 in start-up for drilling plus $100 for each well they drill, what are the expected cost and the variance of the cost of finding 3 productive wells?

Continuous Random Variables

Things in this section will look very similar to the section on discrete RVs. Conceptually, all the ideas are the same.

A continuous random variable is one that has an infinite, uncountable range, e.g. the weight of an item, the time until failure of a mechanical component, the length of an object, ...

(Probability) Density Function (f(x)): The density function is a function that allows us to work out the probability of occurrence over a range of x-values. It does not have the same definition as in the discrete case. (That is, f(x) ≠ P(X = x) for a continuous RV.)

P(a ≤ X ≤ b) = ∫ (a to b) f(x) dx

Note: In the discrete case, P(a ≤ X ≤ b) = Σ (a ≤ x ≤ b) f(x). Now that X is continuous, we must integrate instead of sum.
Note: In the continuous case, P(X = a) = 0 for any a; i.e. the probability of X taking on any particular value is 0.
Note: ∫ (−∞ to ∞) f(x) dx = 1. This is similar to what we saw in the discrete case, except there we summed over all values X could take on.

(Cumulative) Distribution Function (F(x)): The c.d.f. gives the probability of being less than or equal to a particular value:
F(x) = P(X ≤ x) = ∫ (−∞ to x) f(t) dt
Note: Again, this is what we saw in the discrete case, except that now we have an integration instead of a summation.

Properties of F(x):
1. 0 ≤ F(x) ≤ 1

2. F(−∞) = 0 and F(∞) = 1
3. F(x) is continuous and non-decreasing
4. P(a < X < b) = P(a ≤ X < b) = P(a < X ≤ b) = P(a ≤ X ≤ b) = F(b) − F(a)
Including or excluding the endpoints does not matter for continuous RVs, as the probability of X taking on any given value is 0.
F′(x) = f(x): the derivative of the distribution function gives the density.
f(x) → F(x) by integration, and F(x) → f(x) by differentiation.

Now we will introduce two continuous RVs: the uniform RV and the exponential RV. The next chapter will be devoted to the normal RV (bell curve), which is the most useful and most common continuous distribution in statistics.

Uniform Random Variables:
Notation: If X is a uniform RV, then we write X ~ UNI(a, b), indicating that X is uniformly (or evenly) distributed over the interval [a, b].
Density Function: f(x) = 1/(b−a), for a ≤ x ≤ b
Distribution Function: F(x) = ∫ (a to x) f(u) du = (x−a)/(b−a), for a ≤ x ≤ b
Mean of X = E(X) = µx = (a+b)/2
Variance of X = VAR(X) = σx² = (b−a)²/12
Exercise: On your own, show that the above are true... you will be able to do this after the section on expected values (on p. 21)!

Example: Let X ~ UNI(0, 10). Find...
(a) P(X = 5)
(b) P(X ≤ 2)
(c) P(3 ≤ X ≤ 7)
(d) P(X < 2)
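A sketch of the UNI(0, 10) example using the CDF. Note that P(X = 5) = 0, and that < versus ≤ makes no difference for a continuous RV:

```python
def uni_cdf(x, a, b):
    # F(x) = (x - a)/(b - a) on [a, b]; 0 below a, 1 above b
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

a, b = 0, 10
p_eq_5 = 0.0                                    # a single point has probability 0
p_le_2 = uni_cdf(2, a, b)                       # P(X <= 2) = 0.2
p_3_to_7 = uni_cdf(7, a, b) - uni_cdf(3, a, b)  # P(3 <= X <= 7) = 0.4
p_lt_2 = p_le_2                                 # P(X < 2) = P(X <= 2)
```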

Exponential Random Variables:
The exponential RV is often used to model the time until an event occurs. As a consequence, it takes on values ≥ 0. It arises from something called the Poisson process; we will come back to this and discuss it in more depth in Chapter 6.
Notation: If X is an exponential RV with failure rate λ, then we write X ~ EXP(λ).
λ is a positive constant and is equal to the reciprocal of the mean lifetime. I.e. if a lightbulb has a mean lifetime of 5 years, then λ = 1/5.
Density Function: f(x) = λ e^(−λx), for x ≥ 0
Distribution Function: F(x) = 1 − e^(−λx), for x ≥ 0
Mean of X = E(X) = µx = 1/λ
Variance of X = VAR(X) = σx² = 1/λ²

Example: Suppose that a lightbulb has an expected lifetime of 10 years. Find...
(a) The failure rate of the lightbulb
(b) The probability that a bulb lasts more than 15 years
(c) The probability that a bulb lasts 4-6 years
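The lightbulb example worked with λ = 1/10 (the reciprocal of the 10-year mean lifetime):

```python
from math import exp

lam = 1 / 10   # failure rate: reciprocal of the mean lifetime (10 years)

def exp_cdf(x, lam):
    # F(x) = 1 - e^(-lambda x), for x >= 0
    return 1 - exp(-lam * x)

p_over_15 = 1 - exp_cdf(15, lam)              # P(X > 15) = e^(-1.5)
p_4_to_6 = exp_cdf(6, lam) - exp_cdf(4, lam)  # P(4 <= X <= 6)
```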

Expected Values:
Again, finding expected values is very similar to the discrete case, except that now we use integrals instead of summations.
Let X be a continuous RV with density function f(x). Let g(x) be a given function of x, e.g. g(x) = x, or g(x) = x², or ...
Then the expected value is
E(g(X)) = ∫ (−∞ to ∞) g(x) f(x) dx
We integrate over the range of X. Here we write (−∞, ∞) to be general... what we mean is the range of the RV.
Note: If we use g(x) = x, then we are finding the expected value of X itself.
Mean of X: µx = E(X) = ∫ x f(x) dx
Variance of X: σx² = VAR(X) = E(X²) − E(X)²
Note: The properties of µx, σx² and σx are the same as in the discrete case (outlined on pages 12/13).

Example: Suppose that X ~ UNI(0, 5). Find...
(a) The expected value of X
(b) The expected value of X²
(c) The SD of X

Example: The density function for X, the lead concentration in gasoline in grams per liter, is given by f(x) = 12.5x − 1.25, for 0.1 ≤ x ≤ 0.5.
(a) What is the expected concentration of lead?
(b) What is the variance of the concentration of lead?
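E(g(X)) = ∫ g(x) f(x) dx can be approximated numerically when an exact integral is inconvenient. The midpoint-rule sketch below handles the UNI(0, 5) example, where the exact answers are E(X) = 2.5, E(X²) = 25/3, and SD = √(25/12):

```python
def expect_uniform(g, a=0.0, b=5.0, steps=100_000):
    # midpoint-rule approximation of the integral of g(x) f(x) dx, f(x) = 1/(b - a)
    f = 1 / (b - a)
    dx = (b - a) / steps
    return sum(g(a + (i + 0.5) * dx) * f * dx for i in range(steps))

EX = expect_uniform(lambda x: x)       # should be (a + b)/2 = 2.5
EX2 = expect_uniform(lambda x: x**2)   # should be 25/3
sd = (EX2 - EX**2) ** 0.5              # should be sqrt((b - a)^2 / 12)
```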

The Median/Half-Life of a Continuous RV:
The median m of a continuous RV is defined by F(m) = 1/2, i.e. the probability that X is above or below the median m is 1/2:
P(X < m) = P(X > m) = 1/2
When X is the measure of a lifetime, we often refer to the median m as the half-life.

Example: Suppose that the lifetime of your TV (in years) follows an exponential distribution with a given failure rate λ. Find...
(a) The expected lifetime of the TV.
(b) The SD of the lifetime of the TV.
(c) The half-life of your TV.
(d) What % of TVs like yours will exceed their expected lifetime?
(e) What % of TVs like yours will exceed their half-life?
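For the exponential, solving F(m) = 1/2 gives m = ln(2)/λ. The sketch below uses λ = 0.1 purely as an assumed value for illustration (the rate in the TV exercise may differ); note that the half-life is shorter than the mean, and that e^(−1) ≈ 36.8% of lifetimes exceed the mean while exactly 50% exceed the half-life.

```python
from math import exp, log

lam = 0.1                 # assumed failure rate, for illustration only
mean_life = 1 / lam       # 10 years
half_life = log(2) / lam  # solves 1 - e^(-lam * m) = 1/2; about 6.93 years

frac_exceed_mean = exp(-lam * mean_life)  # P(X > 1/lam) = e^(-1)
frac_exceed_half = exp(-lam * half_life)  # exactly 1/2, by definition of the median
```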

Sum and Average of Independent RVs:
Note: We will use many of the ideas in this section when we come to Chapter 7.
Random experiments are often independently repeated, creating a sequence X1, X2, ..., Xn of n independent RVs. Typically these n RVs Xi, i = 1, 2, ..., n, will have a common mean (µ) and a common variance (σ²). In this case, {X1, X2, ..., Xn} is called an independent random sample. E.g. roll a die repeatedly, measure the lifetime of a type of component repeatedly, crash test a type of car repeatedly, ...
So, we have independent RVs X1, X2, ..., Xn, where µxi = µ and σxi² = σ² (common mean/variance). Often, we are interested in the sum or mean of the sample values:
S = Σ (i = 1 to n) Xi = X1 + X2 + ... + Xn
X̄ = S/n = (X1 + X2 + ... + Xn)/n

Example: Suppose we have a series of 9 independent measurements of the compressive strength of steel fibre reinforced concrete. Let Xi ~ UNI(500, 1500), where Xi is the compressive strength of block i, i = 1, 2, ..., 9. Question: do you think the breaking strength would really follow a uniform distribution?
Recall: E(Xi) = (a+b)/2 = (500+1500)/2 = 1000
VAR(Xi) = (b−a)²/12 = (1500−500)²/12 ≈ 83,333.3
BUT, a more accurate measure of the true compressive strength of the steel fibre reinforced concrete would be the average/mean strength of the 9 blocks, X̄.
So, when we have independent RVs X1, X2, ..., Xn with common mean and variance, then...

Sum: µS = E(S) = nµ, σS² = VAR(S) = nσ², σS = SD(S) = √n · σ
Mean: µX̄ = E(X̄) = µ, σX̄² = VAR(X̄) = σ²/n, σX̄ = SD(X̄) = σ/√n

Square Root Rule: The SD of the sum is proportional to the square root of n. The SD of the average is proportional to the inverse of the square root of n.

Example (continued): Recall our example of measurements of the compressive strength of 9 concrete blocks. Find...
(a) The expected value of the mean of the 9 blocks
(b) The SD of the mean of the 9 blocks (SD(X̄))
Note: We can see that the mean is the same, but the SD is 3 times smaller: the square root rule!
Note: As the number of measurements or observations grows large (n → ∞), the variability of the mean of the measurements gets very small (SD(X̄) = σX̄ → 0). More measurements means greater accuracy!

Example: 20 randomly selected UBC students are asked if they smoke; 6 say that they do and the other 14 do not. Find...
(a) The estimated proportion of smokers among UBC students. Is this a valid estimate?
(b) The SD of this estimate
In general, when we are working with proportions (which arise from categorical variables), we estimate SD(p̂) = √( p̂(1−p̂)/n ).
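The square root rule applied to the 9 concrete blocks (Xi ~ UNI(500, 1500), so µ = 1000 and σ² = 1000²/12):

```python
n = 9
mu = 1000
var = 1000**2 / 12          # variance of one UNI(500, 1500) measurement

mean_xbar = mu              # E(Xbar) = mu: averaging does not move the center
sd_single = var**0.5        # SD of one block, about 288.7
sd_xbar = (var / n) ** 0.5  # sigma / sqrt(n): 3 times smaller for n = 9
```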

Max/Min of Independent RVs:
As we just discussed, we often conduct a series of random experiments to get a sequence of independent RVs {X1, X2, ..., Xn}, known as a random sample. Often we are interested in the sum or average of all the observations, but there are also instances where we are interested in the maximum or minimum value in this random sample.

If we let W = max{X1, X2, ..., Xn}, then W can be used to model:
1) The lifetime of a system of n independent parallel components, where Xi = the lifetime of component i.
2) The completion time of a project of n independent sub-projects which can be done simultaneously, where Xi = the completion time of sub-project i.

If we let V = min{X1, X2, ..., Xn}, then V can be used to model:
1) The lifetime of a system of n independent components in series, where Xi = the lifetime of component i.
2) The completion time of a project pursued by n independent competing teams, where Xi = the completion time of team i.

The Maximum:
Suppose that we have an independent random sample X1, X2, ..., Xn, and we wish to find the distribution of the maximum, W = max{X1, X2, ..., Xn}. Suppose that FXi(x) and fXi(x) are the distribution and density functions of the RV Xi, and define FW(ν) and fW(ν) to be the distribution and density functions of the maximum W. Then,
FW(ν) = P(W ≤ ν) = P( (X1 ≤ ν) ∩ (X2 ≤ ν) ∩ ... ∩ (Xn ≤ ν) )
= (by independence) = P(X1 ≤ ν) · P(X2 ≤ ν) · ... · P(Xn ≤ ν)
= FX1(ν) FX2(ν) ... FXn(ν)
Here, we will only discuss the case where the Xi are iid (independent and identically distributed), as this is most often the case:
Distribution: FW(ν) = ( FXi(ν) )^n
Density: fW(ν) = FW′(ν) = n ( FXi(ν) )^(n−1) fXi(ν)

Example: A system consists of 3 components arranged in parallel. Components are independent with a mean lifetime of 5 years. The lifetimes are thought to follow an exponential distribution. Find...
(a) The median/half-life of component 1
(b) The probability that the first component fails before 5.5 years have passed
(c) The probability that the system fails before 5.5 years have passed
(d) The median/half-life of the system
Note how we can solve some of these problems using the simple rules of probability that we learned back in Chapter 3.
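For the parallel-system example (3 iid exponential components with mean lifetime 5 years, so λ = 1/5), the maximum's CDF is FW(t) = (FX(t))³:

```python
from math import exp

lam, n = 1 / 5, 3   # failure rate per component, # of parallel components

def max_cdf(t, n, lam):
    # F_W(t) = (F_X(t))^n = (1 - e^(-lam t))^n: the system fails once ALL have failed
    return (1 - exp(-lam * t)) ** n

p_comp_fails = 1 - exp(-lam * 5.5)     # one component fails by 5.5 years
p_system_fails = max_cdf(5.5, n, lam)  # all three fail by 5.5 years
```

A parallel system outlives any single component, so the system failure probability is smaller than the component's.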

The Minimum:
As before, suppose that we have a random sample X1, X2, ..., Xn, and we wish to find the distribution of the minimum, V = min{X1, X2, ..., Xn}. Suppose that FXi(x) and fXi(x) are the distribution and density functions of the RV Xi, and define FV(υ) and fV(υ) to be the distribution and density functions of the minimum V. Then,
FV(υ) = P(V ≤ υ) = 1 − P(V > υ)
= 1 − P( (X1 > υ) ∩ (X2 > υ) ∩ ... ∩ (Xn > υ) )
= (by independence) = 1 − P(X1 > υ) · P(X2 > υ) · ... · P(Xn > υ)
= 1 − (1 − FX1(υ))(1 − FX2(υ)) ... (1 − FXn(υ))
For the iid case:
Distribution: FV(υ) = 1 − ( 1 − FXi(υ) )^n
Density: fV(υ) = FV′(υ) = n ( 1 − FXi(υ) )^(n−1) fXi(υ)

Example: A system consists of 3 components in series. Components are independent with a mean lifetime of 5 years, and the lifetimes are thought to follow an exponential distribution. Find...
(a) The probability of component 1 failing before 5.5 years have passed
(b) The probability that the system fails before 5.5 years have passed
(c) The median/half-life of the system


More information