Chapter 2. Random variables. 2.3 Expectation


Random processes - Chapter 2. Random variables

2.3 Expectation

Among the parameters representing a typical value of a random variable are the mode, the median, and the expectation.

Mode. The mode of a random variable $X$ is a number $x_{\mathrm{mod}}$ satisfying

$$f_X(x_{\mathrm{mod}}) \ge f_X(x) \ \text{for all } x, \quad \text{if } X \text{ is a continuous random variable},$$
$$p_X(x_{\mathrm{mod}}) \ge p_X(x) \ \text{for all } x, \quad \text{if } X \text{ is a discrete random variable}.$$

Median. The median of a random variable $X$ is a number $x_{\mathrm{med}}$ satisfying

$$P\{X \le x_{\mathrm{med}}\} = P\{X \ge x_{\mathrm{med}}\}.$$

A random variable may have several medians and modes.

2.3.1 Expectation

Expectation (expected value, average). Let the cdf of a random variable $X$ be $F_X$. Then, if $\int |x|\,dF_X(x) < \infty$ -- that is, if $\int |x| f_X(x)\,dx < \infty$ or $\sum_x |x|\,p_X(x) < \infty$ -- we call the following the expectation, average, or expected value of the random variable $X$:

$$E\{X\} = \int x\,dF_X(x) = \begin{cases} \int x f_X(x)\,dx, & X \text{ is a continuous random variable},\\ \sum_x x\,p_X(x), & X \text{ is a discrete random variable}. \end{cases}$$

Uniform random variable. Let the random variable $X$ be distributed uniformly on $[a, b)$, that is, $X \sim U[a, b)$. Then $f_X(x) = 1/(b-a)$ for $a < x < b$. Thus,

$$E\{X\} = \int_a^b \frac{x}{b-a}\,dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}.$$

The mode is any real number between $a$ and $b$, and the median is $(a+b)/2$.
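As a quick numerical sanity check (not part of the original slides), the following sketch compares the sample mean of uniform draws against the closed form $(a+b)/2$; the endpoints $a = 2$, $b = 5$ are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters for X ~ U[a, b); the slide's result is E{X} = (a + b)/2.
a, b = 2.0, 5.0
samples = rng.uniform(a, b, size=1_000_000)

empirical_mean = samples.mean()
theoretical_mean = (a + b) / 2  # (b^2 - a^2) / (2(b - a)) simplifies to this

print(empirical_mean, theoretical_mean)
```

With a million draws the two values agree to roughly two decimal places, as the law of large numbers suggests.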

The expected value of a function $Y = g(X)$ of a random variable $X$ is

$$E\{Y\} = \int y\,dF_Y(y),$$

or, equivalently,

$$E\{Y\} = E\{g(X)\} = \int g(x)\,dF_X(x) = \begin{cases} \int g(x) f_X(x)\,dx, & X \text{ is a continuous random variable},\\ \sum_x g(x)\,p_X(x), & X \text{ is a discrete random variable}. \end{cases}$$

Here, $F_Y$ is the cdf of $Y = g(X)$.

We can show the following from the definition of the expectation.

- $E\{X\} \ge 0$ when the random variable $X$ is not smaller than $0$ (that is, when $\Pr\{X \ge 0\} = 1$).
- The expectation of a constant is the constant. That is, if $\Pr\{X = c\} = 1$, then $E\{X\} = c$.
- Linearity: $E\left\{\sum_{i=1}^n a_i g_i(X)\right\} = \sum_{i=1}^n a_i E\{g_i(X)\}$.
- Monotonicity: when $h_1(x) \le h_2(x)$, $E\{h_1(X)\} \le E\{h_2(X)\}$.
- $|E\{h(X)\}| \le E\{|h(X)|\}$.
- $\min h(x) \le E\{h(X)\} \le \max h(x)$.
- When $a$ and $b$ are constants and $X$ is a random variable, $E\{aX + b\} = aE\{X\} + b$.
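The linearity property holds exactly for sample averages as well, which makes it easy to verify numerically. This sketch (not from the slides) uses an arbitrary distribution and two arbitrary functions $g_1, g_2$:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=500_000)  # any distribution works here

# E{a1*g1(X) + a2*g2(X)} = a1*E{g1(X)} + a2*E{g2(X)}; g1, g2 chosen arbitrarily.
a1, a2 = 3.0, -0.5
lhs = np.mean(a1 * np.sqrt(x) + a2 * x**2)
rhs = a1 * np.mean(np.sqrt(x)) + a2 * np.mean(x**2)

print(lhs, rhs)
```

The two sides agree up to floating-point roundoff, since the sample mean is itself a linear operator.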

Conditional expectation. The conditional expectation of $X$ given an event $A$ can be evaluated as

$$E\{X \mid A\} = \begin{cases} \int x\, f_{X|A}(x \mid A)\,dx, & X \text{ is a continuous random variable},\\ \sum_x x\, p_{X|A}(x \mid A), & X \text{ is a discrete random variable}. \end{cases}$$

When the event $A$ is given, the conditional expectation of a function $Y = g(X)$ of a random variable $X$ is

$$E\{g(X) \mid A\} = \int g(x)\,dF_{X|A}(x \mid A) = \begin{cases} \int g(x)\, f_{X|A}(x \mid A)\,dx, & X \text{ is a continuous random variable},\\ \sum_x g(x)\, p_{X|A}(x \mid A), & X \text{ is a discrete random variable}. \end{cases}$$
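A concrete illustration (mine, not the slides'): for an exponential random variable the memoryless property gives $E\{X \mid X > a\} = a + 1/\lambda$, which a conditional sample mean reproduces. The rate $\lambda = 1.5$ and threshold $a = 1$ are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, a = 1.5, 1.0  # illustrative rate and conditioning threshold

x = rng.exponential(scale=1/lam, size=2_000_000)
cond = x[x > a]                 # restrict to the event A = {X > a}
empirical = cond.mean()         # sample analogue of E{X | A}
theoretical = a + 1/lam         # follows from the memoryless property

print(empirical, theoretical)
```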

2.3.2 Moment and variance

The expectation of a power of a random variable is called a moment. In other words, a moment is also an expectation of a function of a random variable.

Moment. Let $F_X$ be the cdf of a random variable $X$. Then, if $\int |x|^n\,dF_X(x) < \infty$, the $n$th moment $m_n$ of the random variable $X$ is

$$m_n = E\{X^n\} = \int x^n\,dF_X(x) = \begin{cases} \int x^n f_X(x)\,dx, & X \text{ is a continuous random variable},\\ \sum_x x^n\, p_X(x), & X \text{ is a discrete random variable}. \end{cases}$$

Central moment. The following parameter $\mu_n$ is called the $n$th central moment of the random variable $X$:

$$\mu_n = E\{(X - E\{X\})^n\} = \int (x - m_1)^n\,dF_X(x) = \begin{cases} \int (x - m_1)^n f_X(x)\,dx, & X \text{ is a continuous random variable},\\ \sum_x (x - m_1)^n\, p_X(x), & X \text{ is a discrete random variable}. \end{cases}$$

Variance, standard deviation. The second central moment of a random variable is called the variance:

$$\sigma_X^2 = E\{(X - E\{X\})^2\} = E\{X^2\} - E^2\{X\} = m_2 - m_1^2 = \mu_2.$$

The standard deviation is the nonnegative square root of the variance.
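The shortcut identity $\sigma_X^2 = m_2 - m_1^2$ can be checked numerically; this sketch (not from the slides) computes the sample variance both ways for normal draws with an illustrative $\sigma = 3$:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=2.0, scale=3.0, size=500_000)

m1 = x.mean()
m2 = np.mean(x**2)
central = np.mean((x - m1)**2)   # mu_2, the second central moment
shortcut = m2 - m1**2            # m_2 - m_1^2

print(central, shortcut)
```

The two estimates are algebraically identical, so they match up to roundoff, and both are close to the true variance $\sigma^2 = 9$.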

The expectation and variance of a Cauchy random variable, with pdf

$$f(r) = \frac{\alpha}{\pi}\,\frac{1}{r^2 + \alpha^2}, \quad r \in \mathbb{R},$$

do not exist because $E\{|X|\} = \infty$ and $E\{X^2\} = \infty$.

The uniformly distributed random variable $X$ with pdf

$$f_X(r) = \frac{1}{b - a}, \quad r \in [a, b],\ b > a,$$

has the following expectation and variance:

$$E\{X\} = \frac{a+b}{2}, \qquad \operatorname{Var}\{X\} = \frac{(b-a)^2}{12}.$$

The exponentially distributed random variable $X$ with pdf $f(r) = \lambda e^{-\lambda r}$, $r \ge 0$, has $E\{X\} = 1/\lambda$ and $\operatorname{Var}\{X\} = 1/\lambda^2$ as the expectation and variance, respectively.

The Poisson random variable $X$ with parameter $\lambda$ has the following expectation, second moment, and variance:

$$E\{X\} = \sum_{k=0}^{\infty} k\, e^{-\lambda}\frac{\lambda^k}{k!} = e^{-\lambda}\sum_{k=1}^{\infty} \frac{\lambda^k}{(k-1)!} = \lambda, \qquad E\{X^2\} = \lambda^2 + \lambda, \qquad \sigma_X^2 = \lambda.$$

The binomial random variable $X \sim b(n, p)$ has the following expectation and variance:

$$E\{X\} = np, \qquad \sigma_X^2 = np(1-p).$$
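These closed forms are easy to confirm by simulation. The sketch below (not from the slides) uses illustrative parameters $\lambda = 4$ for the Poisson and $(n, p) = (20, 0.3)$ for the binomial:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n, p = 4.0, 20, 0.3  # illustrative parameters

pois = rng.poisson(lam, size=1_000_000)
binom = rng.binomial(n, p, size=1_000_000)

print(pois.mean(), pois.var())     # both should be close to lam
print(binom.mean(), binom.var())   # close to n*p and n*p*(1-p)
```

Note the characteristic Poisson signature: sample mean and sample variance both hover near $\lambda$.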

Let us obtain the expectation and variance of the normal random variable $X \sim N(m, \sigma^2)$. Since the pdf of the normal random variable is

$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x-m)^2}{2\sigma^2}\right\},$$

the expectation is, with the substitution $t = (x - m)/(\sqrt{2}\,\sigma)$,

$$E\{X\} = \int_{-\infty}^{\infty} \frac{x}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x-m)^2}{2\sigma^2}\right\} dx
= \int_{-\infty}^{\infty} \frac{\sqrt{2}\,\sigma t + m}{\sqrt{2\pi\sigma^2}}\, e^{-t^2}\,\sqrt{2}\,\sigma\,dt$$
$$= \frac{1}{\sqrt{\pi}} \left\{ \sqrt{2}\,\sigma \int_{-\infty}^{\infty} t\, e^{-t^2}\,dt + m \int_{-\infty}^{\infty} e^{-t^2}\,dt \right\}
= \frac{m}{\sqrt{\pi}}\,\sqrt{\pi} = m,
$$

and the second moment is as follows:

$$E\{X^2\} = \int_{-\infty}^{\infty} \frac{x^2}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x-m)^2}{2\sigma^2}\right\} dx
= \frac{1}{\sqrt{\pi}} \int_{-\infty}^{\infty} \left(2\sigma^2 t^2 + 2\sqrt{2}\,m\sigma t + m^2\right) e^{-t^2}\,dt$$
$$= \frac{1}{\sqrt{\pi}} \left\{ 2\sigma^2 \int_{-\infty}^{\infty} t^2 e^{-t^2}\,dt + m^2 \sqrt{\pi} \right\}
= \frac{1}{\sqrt{\pi}} \left( \sigma^2 \sqrt{\pi} + m^2 \sqrt{\pi} \right) = \sigma^2 + m^2.$$

Thus, $\operatorname{Var}\{X\} = E\{X^2\} - m^2 = \sigma^2$. We have used $\int_{-\infty}^{\infty} e^{-t^2}\,dt = \sqrt{\pi}$, $\int_{-\infty}^{\infty} t\, e^{-t^2}\,dt = 0$, and $\int_{-\infty}^{\infty} t^2 e^{-t^2}\,dt = \sqrt{\pi}/2$.

Consider a normal random variable $X$ with the pdf

$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{x^2}{2\sigma^2}\right\}.$$

Using the symmetry of $f(x)$ and taking the $k$th derivative of

$$\int_{-\infty}^{\infty} \exp\{-\alpha x^2\}\,dx = \sqrt{\frac{\pi}{\alpha}}$$

with respect to $\alpha$, we can obtain the following result:

$$E\{X^n\} = \begin{cases} 0, & n = 2k+1,\\ 1 \cdot 3 \cdot 5 \cdots (n-1)\,\sigma^n, & n = 2k. \end{cases}$$
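A Monte Carlo check of this moment formula (my addition, with an illustrative $\sigma = 2$): the third moment of a zero-mean normal should vanish, while the fourth should equal $1 \cdot 3 \cdot \sigma^4 = 3\sigma^4$.

```python
import numpy as np

rng = np.random.default_rng(5)
sigma = 2.0
x = rng.normal(0.0, sigma, size=2_000_000)

m3 = np.mean(x**3)   # odd moment: should be near 0 by symmetry
m4 = np.mean(x**4)   # even moment: 1*3*sigma^4 = 3*sigma^4

print(m3, m4, 3 * sigma**4)
```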

2.3.3 Characteristic function and moment generating function

Characteristic function (cf). The characteristic function $\varphi_X(\omega)$ of a random variable $X$ is

$$\varphi_X(\omega) = E\{e^{j\omega X}\} = \int e^{j\omega x}\,dF_X(x) = \begin{cases} \int f_X(x)\, e^{j\omega x}\,dx, & X \text{ is a continuous random variable},\\ \sum_x p_X(x)\, e^{j\omega x}, & X \text{ is a discrete random variable}. \end{cases}$$

The characteristic function has the following properties.

- $|\varphi(\omega)| \le \varphi(0) = 1$.
- $\varphi$ is uniformly continuous on the real line.
- $\varphi$ is positive semi-definite. In other words, for any real numbers $\omega_1, \ldots, \omega_n$ and complex numbers $z_1, \ldots, z_n$,

$$\sum_{j,k} \varphi(\omega_j - \omega_k)\, z_j \bar{z}_k \ge 0.$$
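Since $\varphi_X(\omega) = E\{e^{j\omega X}\}$ is itself an expectation, it can be estimated by a sample average. This sketch (not from the slides) does so for $X \sim N(0,1)$, whose cf is known to be $e^{-\omega^2/2}$, and confirms $\varphi(0) = 1$:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(0.0, 1.0, size=1_000_000)

def cf(omega):
    # Monte Carlo estimate of phi_X(omega) = E{exp(j*omega*X)}
    return np.mean(np.exp(1j * omega * x))

phi0 = cf(0.0)   # exactly 1 for any distribution
phi1 = cf(1.0)   # for N(0,1): exp(-1/2), a real number

print(abs(phi0), phi1.real, np.exp(-0.5))
```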

Moment generating function (mgf). The mgf $M_X(t)$ of a random variable $X$ is defined as

$$M_X(t) = E\{e^{tX}\} = \int e^{tx}\,dF_X(x).$$

If the characteristic function of the random variable $X$ is $\varphi_X$, $a, b \in \mathbb{R}$, and $Y = aX + b$, then the characteristic function of $Y$ is

$$\varphi_Y(\omega) = e^{j\omega b}\,\varphi_X(a\omega).$$

2.3.4 Moment theorem

Moment theorem. Let the mgf and cf of a random variable $X$ be $M_X(t)$ and $\varphi_X(\omega)$, respectively. Then the $k$th moment of $X$ can be evaluated by

$$m_k = j^{-k} \left. \frac{\partial^k \varphi_X(\omega)}{\partial \omega^k} \right|_{\omega = 0} = j^{-k}\, \varphi_X^{(k)}(0) = M_X^{(k)}(0).$$

Let $X \sim N(m, \sigma^2)$. Then, since $\varphi_X(\omega) = \exp\left\{-\frac{\omega^2 \sigma^2}{2} + jm\omega\right\}$,

$$E\{X\} = j^{-1}\varphi_X'(0) = m, \qquad E\{X^2\} = j^{-2}\varphi_X''(0) = m^2 + \sigma^2, \qquad \operatorname{Var}\{X\} = \sigma^2.$$
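The relation $m_k = M_X^{(k)}(0)$ can be illustrated numerically with finite differences. This sketch (my addition) uses the exponential distribution, whose mgf $M(t) = \lambda/(\lambda - t)$ for $t < \lambda$ is standard, with an illustrative $\lambda = 2$:

```python
lam = 2.0
M = lambda t: lam / (lam - t)   # mgf of Exp(lam), valid for t < lam

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)            # central difference for M'(0) = 1/lam
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # central difference for M''(0) = 2/lam^2

print(m1, 1/lam, m2, 2/lam**2)
```

Both numerical derivatives recover the first and second moments $1/\lambda$ and $2/\lambda^2$ of the exponential distribution.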

2.3.5 Cumulant*

Let us expand the natural logarithm $\psi(\omega) = \ln \varphi(\omega)$ of the characteristic function $\varphi(\omega)$ in a Taylor series at $\omega = 0$:

$$\psi(\omega) = \ln \varphi(\omega) = \ln\left\{ 1 + \sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!} \right\}
= \sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!} - \frac{1}{2}\left[\sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!}\right]^2 + \frac{1}{3}\left[\sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!}\right]^3 - \cdots$$
$$= m_1 \frac{j\omega}{1!} + (m_2 - m_1^2)\frac{(j\omega)^2}{2!} + (m_3 - 3m_1 m_2 + 2m_1^3)\frac{(j\omega)^3}{3!} + \cdots = \sum_{n=1}^{\infty} k_n \frac{(j\omega)^n}{n!}.$$

The parameter $k_n$ in the last expression is called a cumulant and is defined as

$$k_n = \left. \frac{\partial^n \psi(\omega)}{\partial (j\omega)^n} \right|_{\omega = 0}.$$

Coefficient of variation, skewness, kurtosis. For a random variable $X$ with mean $\mu$ and variance $\sigma^2$,

$$v_1 = \frac{\sigma}{\mu}, \qquad v_2 = \frac{\mu_3}{\sigma^3} = \frac{k_3}{k_2^{3/2}}, \qquad v_3 = \frac{\mu_4}{\sigma^4} = 3 + \frac{k_4}{k_2^2}$$

are called the coefficient of variation, skewness, and kurtosis, respectively.

(Figure: the symmetry of the pdf and the skewness $v_2$.)

2.3.6 Several inequalities*

Markov inequality. If $X$ is a random variable that takes only nonnegative values, then for any value $\alpha > 0$,

$$P\{X \ge \alpha\} \le \frac{E\{X\}}{\alpha}.$$

Chebyshev inequality. For a random variable $Y$ and any positive value $\epsilon$,

$$P\{|Y - E\{Y\}| \ge \epsilon\} \le \frac{\operatorname{Var}\{Y\}}{\epsilon^2}.$$
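Both bounds are simple to observe empirically; this sketch (not from the slides) checks them on exponential samples with illustrative thresholds $\alpha = 3$ and $\epsilon = 2$:

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.exponential(scale=1.0, size=1_000_000)   # nonnegative, mean 1, var 1

alpha, eps = 3.0, 2.0
markov_lhs = np.mean(x >= alpha)                 # P{X >= alpha}
markov_rhs = x.mean() / alpha                    # E{X}/alpha

cheb_lhs = np.mean(np.abs(x - x.mean()) >= eps)  # P{|X - E{X}| >= eps}
cheb_rhs = x.var() / eps**2                      # Var{X}/eps^2

print(markov_lhs, "<=", markov_rhs)
print(cheb_lhs, "<=", cheb_rhs)
```

Both inequalities are typically loose: here the true tail probability $e^{-3} \approx 0.05$ sits well below the Markov bound of about $1/3$.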

Bienayme-Chebyshev inequality. Let the $r$th absolute moment of a random variable $X$ be finite, that is, $E\{|X|^r\} < \infty$ for some $r > 0$. Then, for any positive $\epsilon$, we have

$$P\{|X| \ge \epsilon\} \le \frac{E\{|X|^r\}}{\epsilon^r}.$$

Generalized Bienayme-Chebyshev inequality. Let $g(x)$, $x \in (0, \infty)$, be a nondecreasing and nonnegative function. If $E\{g(|X|)\}/g(\epsilon)$ is defined, then for any positive $\epsilon$ we have

$$P\{|X| \ge \epsilon\} \le \frac{E\{g(|X|)\}}{g(\epsilon)}.$$

Jensen's inequality. If $f$ is a convex function, then

$$E\{f(X)\} \ge f(E\{X\}).$$
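Jensen's inequality is easy to see with the convex function $f(t) = t^2$, for which $E\{f(X)\} = \mu^2 + \sigma^2 \ge \mu^2 = f(E\{X\})$. A quick check with illustrative parameters $\mu = 1$, $\sigma = 2$ (my addition):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(1.0, 2.0, size=1_000_000)

f = lambda t: t**2          # a convex function
lhs = np.mean(f(x))         # E{f(X)}: close to mu^2 + sigma^2 = 5
rhs = f(np.mean(x))         # f(E{X}): close to mu^2 = 1

print(lhs, ">=", rhs)
```

The gap $\sigma^2$ between the two sides is exactly the variance, since here $E\{X^2\} - (E\{X\})^2 = \sigma^2$.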