Variance and Risk


risk

Alice & Bob are gambling (again). X = Alice's gain per flip: +$1 if heads, −$1 if tails (each with probability 1/2), so E[X] = 0.

... Time passes ...

Alice (yawning) says "let's raise the stakes": Alice's gain per flip is now Y = +$1000 if heads, −$1000 if tails. E[Y] = 0, as before. Are you (Bob) equally happy to play the new game?

E[X] measures the average or central tendency of X. What about its variability or spread? If E[X] = μ, then E[|X − μ|] seems like a natural quantity to look at: how much do we expect (on average) X to deviate from its average. Unfortunately, it's a bit inconvenient mathematically; the following is nicer/easier/much more common.
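A minimal simulation sketch (Python; the number of trials and the random seed are arbitrary choices, not from the slides) makes the contrast concrete: both games have average gain near 0, but the average deviation from the mean, an estimate of E[|X − μ|], is roughly $1 for the original game and roughly $1000 after the stakes are raised.

    import random

    random.seed(0)
    trials = 100_000                                        # arbitrary number of simulated flips

    x = [random.choice([+1, -1]) for _ in range(trials)]    # original +/- $1 game
    y = [1000 * v for v in x]                               # raised stakes: Y = 1000 X

    def mean(vals):
        return sum(vals) / len(vals)

    def mean_abs_dev(vals):
        m = mean(vals)                                      # sample estimate of mu
        return sum(abs(v - m) for v in vals) / len(vals)    # sample estimate of E[|X - mu|]

    print(mean(x), mean_abs_dev(x))                         # approximately 0 and 1
    print(mean(y), mean_abs_dev(y))                         # approximately 0 and 1000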

Definitions

The variance of a random variable X with mean E[X] = μ is Var[X] = E[(X − μ)²], often denoted σ². The standard deviation of X is σ = √Var[X].

risk

Alice & Bob are gambling (again). X = Alice's gain per flip: E[X] = 0, Var[X] = 1.

... Time passes ...

Alice (yawning) says "let's raise the stakes". E[Y] = 0, as before, but Var[Y] = 1,000,000. Are you (Bob) equally happy to play the new game?

what does variance tell us?

The variance of a random variable X with mean E[X] = μ is Var[X] = E[(X − μ)²], often denoted σ².

I: The square is always ≥ 0, and exaggerated as X moves away from μ, so Var[X] emphasizes deviation from the mean.

II: Numbers vary a lot depending on the exact distribution of X, but it is common that X is within μ ± σ about 66% of the time, and within μ ± 2σ about 95% of the time. (We'll see the reasons for this soon.)

mean and variance

μ = E[X] is about location; σ = √Var(X) is about spread.

[Figures: histograms of the number of heads in 20 flips (p = .5, σ ≈ 2.2) and in 150 flips (p = .5, σ ≈ 6.1); blue arrows denote the interval μ ± σ. Note that σ is bigger in absolute terms in the second example, but smaller as a proportion of μ or of the maximum.]
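As an illustration of point II, here is a small sketch (Python; exact enumeration of the Binomial(20, ½) pmf for the "heads in 20 flips" distribution shown on the slide, not the author's code) that adds up the probability mass within μ ± σ and μ ± 2σ. For this discrete distribution the first number comes out near 0.74, somewhat above the rough 66% rule of thumb.

    from math import comb, sqrt

    n, p = 20, 0.5
    mu = n * p                        # mean of Binomial(n, p)
    sigma = sqrt(n * p * (1 - p))     # standard deviation of Binomial(n, p)

    def pmf(k):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    within_1 = sum(pmf(k) for k in range(n + 1) if abs(k - mu) <= sigma)
    within_2 = sum(pmf(k) for k in range(n + 1) if abs(k - mu) <= 2 * sigma)
    print(within_1, within_2)         # roughly 0.74 and 0.96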

example

Two games:
a) flip 1 coin, win Y = $100 if heads, −$100 if tails
b) flip 100 coins, win Z = (#heads − #tails) dollars

Same expectation in both: E[Y] = E[Z] = 0.
Same extremes in both: max gain = $100; max loss = $100.
But variability is very different: σ_Y = 100, while σ_Z = 10.

[Figure: the probability distributions of Y and Z plotted over −100 to 100, with horizontal arrows marking μ ± σ.]

Ex: Var[aX + b] = a²·Var[X]. With E[X] = 0 and Var[X] = 1, let Y = 1000X. Then E[Y] = E[1000X] = 1000·E[X] = 0, but Var[Y] = Var[10³X] = 10⁶·Var[X] = 10⁶. Variance is NOT linear: it is insensitive to location (b) and quadratic in scale (a).

Another way to compute variance:

Var(X) = E[(X − µ)²]
       = E[X² − 2µX + µ²]
       = E[X²] − 2µ·E[X] + µ²
       = E[X²] − 2µ² + µ²
       = E[X²] − µ²
       = E[X²] − (E[X])²
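Both numbers above, and the scaling rule, can be double-checked with a short exact computation (Python; a sketch, not the author's code), since Z = 2·(#heads) − 100 is a linear function of a Binomial(100, ½) random variable:

    from math import comb, sqrt

    n, p = 100, 0.5
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

    # H = number of heads; Z = #heads - #tails = 2H - n
    EH  = sum(k * pmf[k] for k in range(n + 1))
    EH2 = sum(k * k * pmf[k] for k in range(n + 1))
    var_H = EH2 - EH**2                # Var[H] = E[H^2] - (E[H])^2 = 25

    var_Z = 4 * var_H                  # Var[aX + b] = a^2 Var[X], with a = 2, b = -n
    print(sqrt(var_Z))                 # 10.0, i.e. sigma_Z = 10 as claimed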

Example: What is Var[X] when X is the outcome of one fair die? E[X] = 7/2, so Var[X] = E[X²] − (E[X])² = (1² + 2² + ... + 6²)/6 − (7/2)² = 91/6 − 49/4 = 35/12 ≈ 2.92.

In general: Var[X + Y] ≠ Var[X] + Var[Y] (variance is NOT linear).

Ex 1: Let X = ±1 based on 1 coin flip. As shown above, E[X] = 0, Var[X] = 1. Let Y = −X; then Var[Y] = (−1)²·Var[X] = 1. But X + Y = 0, always, so Var[X + Y] = 0 ≠ Var[X] + Var[Y].

Ex 2: As another example, is Var[X + X] = 2·Var[X]? No: Var[X + X] = Var[2X] = 4·Var[X].

more examples

X1 = sum of 2 fair dice, minus 7: σ² = 5.83
X2 = fair 11-sided die labeled −5, ..., 5: σ² = 10
X3 = Y − 6·signum(Y), where Y is the difference of 2 fair dice, given no doubles: σ² = 15
X4 = X3 when 3 pairs of dice all give the same X3: σ² = 19.7

[Figures: the probability histograms of X1 through X4.]
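A small enumeration sketch (Python, not from the slides) reproduces two of these numbers exactly: the single-die variance 35/12 ≈ 2.92 and σ² ≈ 5.83 for X1, the sum of two fair dice minus 7.

    from itertools import product

    def variance(values):
        # Var[X] = E[X^2] - (E[X])^2, over equally likely outcomes
        n = len(values)
        mean = sum(values) / n
        return sum(v * v for v in values) / n - mean ** 2

    one_die = list(range(1, 7))
    print(variance(one_die))                                 # 35/12 ~ 2.9167

    x1 = [a + b - 7 for a, b in product(one_die, repeat=2)]
    print(variance(x1))                                      # 35/6 ~ 5.83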

independence

Defn: A random variable X and an event E are independent if the event E is independent of the event {X = x} (for any fixed x), i.e.

∀x: P({X = x} & E) = P({X = x})·P(E)

independence of r.v.s

Defn: Two random variables X and Y are independent if the events {X = x} and {Y = y} are independent (for any fixed x, y), i.e.

∀x, y: P({X = x} & {Y = y}) = P({X = x})·P({Y = y})

Intuition as before: knowing X doesn't help you guess Y or E, and vice versa.

Ex 1: Roll a fair die to obtain a random number 1 ≤ X ≤ 6, then flip a fair coin X times. Let E be the event that the number of heads is even. P({X = x}) = 1/6 for any 1 ≤ x ≤ 6, P(E) = 1/2, and P({X = x} & E) = 1/12, so X and E are independent.

Ex 2: As above, but let F be the event that the total number of heads = 6. P(F) = 2⁻⁶/6 > 0, and considering, say, X = 4, we have P(X = 4) = 1/6 > 0 (as above), but P({X = 4} & F) = 0, since you can't see 6 heads in 4 flips. So X & F are dependent. (Knowing that X is small renders F impossible; knowing that F happened means X must be 6.)

Ex: Let X be the number of heads in the first n of 2n coin flips, Y the number in the last n flips, and let Z be the total. X and Y are independent, since they are determined by disjoint sets of flips. But X and Z are not independent, since, e.g., knowing that X = 0 precludes Z > n. E.g., P(X = 0) and P(Z = n + 1) are both positive, but P(X = 0 & Z = n + 1) = 0.
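To close, here is a short exhaustive-check sketch (Python, not part of the slides) that verifies Ex 1 and Ex 2 numerically: for every die value x, P({X = x} & E) equals P({X = x})·P(E), while the analogous product rule fails for F.

    from itertools import product
    from fractions import Fraction

    def prob(event):
        """P(event), where event(x, heads) sees the die value x and the number of heads."""
        total = Fraction(0)
        for x in range(1, 7):                          # die value, probability 1/6 each
            for flips in product([0, 1], repeat=x):    # all coin sequences of length x
                if event(x, sum(flips)):
                    total += Fraction(1, 6) * Fraction(1, 2) ** x
        return total

    for x in range(1, 7):
        lhs = prob(lambda v, h, x=x: v == x and h % 2 == 0)           # P({X=x} & E)
        rhs = prob(lambda v, h, x=x: v == x) * prob(lambda v, h: h % 2 == 0)
        assert lhs == rhs                                             # independence of X and E

    # Ex 2: F = "total number of heads is 6" is NOT independent of X
    print(prob(lambda v, h: v == 4 and h == 6))                       # 0
    print(prob(lambda v, h: v == 4) * prob(lambda v, h: h == 6))      # (1/6) * (2^-6 / 6) > 0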