Random processes
Chapter 2. Random variables
2.3 Expectation
Among the parameters representing a typical value of a random variable are the mode, the median, and the expectation.

Mode
The mode of a random variable $X$ is a number $x_{\mathrm{mod}}$ satisfying
$$f_X(x_{\mathrm{mod}}) \ge f_X(x) \ \text{for all } x, \quad X \text{ a continuous random variable},$$
$$p_X(x_{\mathrm{mod}}) \ge p_X(x) \ \text{for all } x, \quad X \text{ a discrete random variable}.$$

Median
The median of a random variable $X$ is a number $x_{\mathrm{med}}$ satisfying
$$P\{X \le x_{\mathrm{med}}\} = P\{X \ge x_{\mathrm{med}}\}.$$

There may exist several medians and modes for a random variable.
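As a quick illustration (not part of the original notes), the following Python sketch finds a mode and a median of a small discrete pmf; the values and probabilities are hypothetical.

```python
import numpy as np

# Hypothetical pmf on the values 0..4; the numbers are illustrative only.
values = np.array([0, 1, 2, 3, 4])
pmf    = np.array([0.1, 0.3, 0.3, 0.2, 0.1])

x_mod = values[np.argmax(pmf)]               # a mode: maximizes p_X(x)
cdf   = np.cumsum(pmf)
x_med = values[np.searchsorted(cdf, 0.5)]    # a median: first x with P{X <= x} >= 1/2

print(x_mod, x_med)  # this pmf has two modes (1 and 2); argmax returns the first
```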
Expectation (expected value, average)
Let the cdf of a random variable $X$ be $F_X$. Then, if $\int_{-\infty}^{\infty} |x|\, dF_X(x) < \infty$ (in the discrete case, $\sum_x |x|\, p_X(x) < \infty$), we call the following the expectation, average, or expected value of the random variable $X$:
$$E\{X\} = \int_{-\infty}^{\infty} x\, dF_X(x) = \begin{cases} \int_{-\infty}^{\infty} x f_X(x)\, dx, & X \text{ a continuous random variable}, \\ \sum_x x\, p_X(x), & X \text{ a discrete random variable}. \end{cases}$$

Uniform random variable
Let the random variable $X$ be distributed uniformly on $[a, b)$, that is, $X \sim U[a, b)$. Then $f_X(x) = 1/(b-a)$, $a < x < b$. Thus,
$$E\{X\} = \int_a^b \frac{x\, dx}{b-a} = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}.$$
The mode is any real number between $a$ and $b$, and the median is $(a+b)/2$.
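A minimal Monte Carlo sketch (an illustration, not from the notes): the sample mean of uniform draws should approach $(a+b)/2$; the values of $a$ and $b$ below are arbitrary.

```python
import numpy as np

# Sample mean of U[a, b) draws approaches (a + b)/2 as the sample grows.
rng = np.random.default_rng(0)
a, b = 2.0, 5.0
x = rng.uniform(a, b, size=1_000_000)
print(x.mean(), (a + b) / 2)   # both close to 3.5
```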
The expected value of a function $Y = g(X)$ of a random variable $X$ is
$$E\{Y\} = \int_{-\infty}^{\infty} y\, dF_Y(y),$$
or, equivalently,
$$E\{Y\} = E\{g(X)\} = \int_{-\infty}^{\infty} g(x)\, dF_X(x) = \begin{cases} \int_{-\infty}^{\infty} g(x) f_X(x)\, dx, & X \text{ a continuous random variable}, \\ \sum_x g(x)\, p_X(x), & X \text{ a discrete random variable}. \end{cases}$$
Here, $F_Y$ is the cdf of $Y = g(X)$.
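The two routes to $E\{g(X)\}$ can be checked numerically. A sketch assuming $X \sim U[0, 1)$ and $g(x) = x^2$, so both computations should give $1/3$:

```python
import numpy as np
from scipy import integrate

g = lambda x: x**2
# Route 1: integrate g(x) f_X(x) dx directly, with f_X = 1 on [0, 1):
val, _ = integrate.quad(g, 0.0, 1.0)
# Route 2: sample mean of Y = g(X):
rng = np.random.default_rng(1)
mc = g(rng.uniform(0.0, 1.0, 500_000)).mean()
print(val, mc)   # both close to 1/3
```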
We can show the following from the definition of the expectation.

- We have $E\{X\} \ge 0$ when a random variable $X$ is not smaller than $0$ (that is, when $\Pr\{X \ge 0\} = 1$).
- The expectation of a constant is the constant. That is, if $\Pr\{X = c\} = 1$, then $E\{X\} = c$.
- $E\left\{\sum_{i=1}^{n} a_i g_i(X)\right\} = \sum_{i=1}^{n} a_i E\{g_i(X)\}$.
- When $h_1(x) \le h_2(x)$, $E\{h_1(X)\} \le E\{h_2(X)\}$.
- $|E\{h(X)\}| \le E\{|h(X)|\}$.
- $\min_x h(x) \le E\{h(X)\} \le \max_x h(x)$.
- When $a$ and $b$ are constants and $X$ is a random variable, $E\{aX + b\} = aE\{X\} + b$.
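A sketch of the last two properties above, using an arbitrary normal sample (the distribution and constants are illustrative, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(1.0, 2.0, 1_000_000)            # any distribution works here
a, b = 3.0, -1.0
print((a * x + b).mean(), a * x.mean() + b)    # E{aX + b} ~ a E{X} + b
h = np.cos(x)                                  # a bounded function of X
print(h.min() <= h.mean() <= h.max())          # min h <= E{h(X)} <= max h
```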
Conditional expectation
The conditional expectation of $X$ when an event $A$ is given can be evaluated as
$$E\{X \mid A\} = \begin{cases} \int_{-\infty}^{\infty} x f_{X|A}(x \mid A)\, dx, & X \text{ a continuous random variable}, \\ \sum_x x\, p_{X|A}(x \mid A), & X \text{ a discrete random variable}. \end{cases}$$
When the event $A$ is given, the conditional expectation of a function $Y = g(X)$ of a random variable $X$ is
$$E\{g(X) \mid A\} = \int_{-\infty}^{\infty} g(x)\, dF_{X|A}(x \mid A) = \begin{cases} \int_{-\infty}^{\infty} g(x) f_{X|A}(x \mid A)\, dx, & X \text{ a continuous random variable}, \\ \sum_x g(x)\, p_{X|A}(x \mid A), & X \text{ a discrete random variable}. \end{cases}$$
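As a hedged numerical illustration, take $A = \{X > c\}$ for an exponential $X$; by memorylessness the conditional expectation should be $c + 1/\lambda$. The parameters below are arbitrary:

```python
import numpy as np

# Estimate E{X | A} with A = {X > c} by averaging only samples in A.
rng = np.random.default_rng(3)
lam, c = 2.0, 1.0
x = rng.exponential(1 / lam, 2_000_000)
print(x[x > c].mean(), c + 1 / lam)   # memorylessness: E{X | X > c} = c + 1/lam
```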
Moment and variance
The expectation of a power of a random variable is called a moment. In other words, a moment is also an expectation of a function of a random variable.

Moment
Let $F_X$ be the cdf of a random variable $X$. Then, if $\int_{-\infty}^{\infty} |x|^n\, dF_X(x) < \infty$, the $n$th moment $m_n$ of the random variable $X$ is
$$m_n = E\{X^n\} = \int_{-\infty}^{\infty} x^n\, dF_X(x) = \begin{cases} \int_{-\infty}^{\infty} x^n f_X(x)\, dx, & X \text{ a continuous random variable}, \\ \sum_x x^n p_X(x), & X \text{ a discrete random variable}. \end{cases}$$
Central moment
The following parameter $\mu_n$ is called the $n$th central moment of the random variable $X$:
$$\mu_n = E\{(X - E\{X\})^n\} = \int_{-\infty}^{\infty} (x - m_1)^n\, dF_X(x) = \begin{cases} \int_{-\infty}^{\infty} (x - m_1)^n f_X(x)\, dx, & X \text{ a continuous random variable}, \\ \sum_x (x - m_1)^n p_X(x), & X \text{ a discrete random variable}. \end{cases}$$

Variance, standard deviation
The second central moment of a random variable is called the variance:
$$\sigma_X^2 = E\{(X - E\{X\})^2\} = E\{X^2\} - E^2\{X\} = m_2 - m_1^2 = \mu_2.$$
The standard deviation is the nonnegative square root of the variance.
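A quick numerical check of $\sigma_X^2 = m_2 - m_1^2$ on an arbitrary sample (a gamma sample here, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.gamma(3.0, 2.0, 1_000_000)      # any sample will do
m1, m2 = x.mean(), (x**2).mean()
print(m2 - m1**2, x.var())              # variance two ways: m2 - m1^2 vs E{(X - m1)^2}
```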
The expectation and variance of a Cauchy random variable do not exist because $E\{|X|\} = \infty$ and $E\{X^2\} = \infty$. The Cauchy pdf is
$$f(r) = \frac{\alpha}{\pi} \frac{1}{r^2 + \alpha^2}, \quad r \in \mathbb{R}.$$

The uniformly distributed random variable $X$ with the pdf
$$f_X(r) = \frac{1}{b-a}, \quad r \in [a, b],\ b > a,$$
has the following expectation and variance:
$$E\{X\} = \frac{a+b}{2}, \qquad \mathrm{Var}\{X\} = \frac{(b-a)^2}{12}.$$
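The nonexistence of the Cauchy mean shows up empirically: the running sample mean never settles. A minimal sketch (not from the notes):

```python
import numpy as np

# Running sample mean of standard Cauchy data keeps jumping; it has no limit,
# reflecting that E{X} does not exist.
rng = np.random.default_rng(5)
x = rng.standard_cauchy(100_000)
running = np.cumsum(x) / np.arange(1, x.size + 1)
print(running[999], running[9_999], running[-1])
```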
The exponentially distributed random variable $X$ with the pdf $f(r) = \lambda e^{-\lambda r}$, $r \ge 0$, has $E\{X\} = 1/\lambda$ and $\mathrm{Var}\{X\} = 1/\lambda^2$ as the expectation and variance, respectively.

The Poisson random variable $X$ with parameter $\lambda$ has the following expectation, second moment, and variance:
$$E\{X\} = \sum_{k=0}^{\infty} k e^{-\lambda} \frac{\lambda^k}{k!} = \lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1)!} = \lambda,$$
$$E\{X^2\} = \lambda^2 + \lambda, \qquad \sigma_X^2 = \lambda.$$

The binomial random variable $X \sim b(n, p)$ has the following expectation and variance:
$$E\{X\} = np, \qquad \sigma_X^2 = np(1-p).$$
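These moments can be confirmed against SciPy's closed-form distribution moments (a verification sketch; the parameter values are arbitrary):

```python
from scipy import stats

lam, n, p = 2.0, 10, 0.3
print(stats.expon(scale=1/lam).stats(moments="mv"))   # (1/lam, 1/lam**2)
print(stats.poisson(lam).stats(moments="mv"))         # (lam, lam)
print(stats.binom(n, p).stats(moments="mv"))          # (n*p, n*p*(1-p))
```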
Let us obtain the expectation and variance of the normal random variable $X \sim N(m, \sigma^2)$. Since the pdf of the normal random variable is
$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x-m)^2}{2\sigma^2}\right\},$$
the expectation and second moment are as follows. Substituting $t = (x-m)/\sqrt{2}\sigma$,
$$E\{X\} = \int_{-\infty}^{\infty} \frac{x}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x-m)^2}{2\sigma^2}\right\} dx = \int_{-\infty}^{\infty} \frac{\sqrt{2}\sigma t + m}{\sqrt{2\pi\sigma^2}}\, e^{-t^2}\, \sqrt{2}\sigma\, dt = \frac{1}{\sqrt{\pi}} \left\{ \int_{-\infty}^{\infty} \sqrt{2}\sigma t\, e^{-t^2}\, dt + m \int_{-\infty}^{\infty} e^{-t^2}\, dt \right\} = \frac{m}{\sqrt{\pi}} \sqrt{\pi} = m,$$
$$E\{X^2\} = \int_{-\infty}^{\infty} \frac{x^2}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{(x-m)^2}{2\sigma^2}\right\} dx = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} \left(2\sigma^2 t^2 + 2\sqrt{2}\, m \sigma t + m^2\right) e^{-t^2}\, \sqrt{2}\sigma\, dt = \frac{1}{\sqrt{\pi}} \left\{ 2\sigma^2 \int_{-\infty}^{\infty} t^2 e^{-t^2}\, dt + m^2 \sqrt{\pi} \right\} = \frac{1}{\sqrt{\pi}} \left(\sigma^2 \sqrt{\pi} + m^2 \sqrt{\pi}\right) = \sigma^2 + m^2.$$
Thus, $\mathrm{Var}\{X\} = E\{X^2\} - m^2 = \sigma^2$. We have used $\int_{-\infty}^{\infty} e^{-t^2}\, dt = \sqrt{\pi}$, $\int_{-\infty}^{\infty} t\, e^{-t^2}\, dt = 0$, and $\int_{-\infty}^{\infty} t^2 e^{-t^2}\, dt = \sqrt{\pi}/2$.
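The same two integrals can be verified symbolically. A SymPy sketch (an illustration, not part of the notes):

```python
import sympy as sp

# Symbolic check of E{X} = m and E{X^2} = sigma^2 + m^2 for N(m, sigma^2).
x, m = sp.symbols("x m", real=True)
s = sp.symbols("sigma", positive=True)
pdf = sp.exp(-(x - m)**2 / (2 * s**2)) / sp.sqrt(2 * sp.pi * s**2)
print(sp.integrate(x * pdf, (x, -sp.oo, sp.oo)))                   # m
print(sp.simplify(sp.integrate(x**2 * pdf, (x, -sp.oo, sp.oo))))   # m**2 + sigma**2
```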
Consider a normal random variable $X$ with the pdf
$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{-\frac{x^2}{2\sigma^2}\right\}.$$
Using the symmetry of $f(x)$ and taking the $k$th derivative of
$$\int_{-\infty}^{\infty} \exp\{-\alpha x^2\}\, dx = \sqrt{\frac{\pi}{\alpha}}$$
with respect to $\alpha$, we can obtain the following results:
$$E\{X^n\} = \begin{cases} 0, & n = 2k+1, \\ 1 \cdot 3 \cdot 5 \cdots (n-1)\, \sigma^n, & n = 2k. \end{cases}$$
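A symbolic check of the odd/even moment pattern for the first few $n$ (a SymPy sketch):

```python
import sympy as sp

x = sp.symbols("x", real=True)
s = sp.symbols("sigma", positive=True)
pdf = sp.exp(-x**2 / (2 * s**2)) / sp.sqrt(2 * sp.pi * s**2)
for n in range(1, 7):
    print(n, sp.simplify(sp.integrate(x**n * pdf, (x, -sp.oo, sp.oo))))
# odd n give 0; n = 2, 4, 6 give sigma**2, 3*sigma**4, 15*sigma**6
```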
Characteristic function and moment generating function

Characteristic function (cf)
The characteristic function $\varphi_X(\omega)$ of a random variable $X$ is
$$\varphi_X(\omega) = E\{e^{j\omega X}\} = \int_{-\infty}^{\infty} e^{j\omega x}\, dF_X(x) = \begin{cases} \int_{-\infty}^{\infty} f_X(x) e^{j\omega x}\, dx, & X \text{ a continuous random variable}, \\ \sum_x p_X(x) e^{j\omega x}, & X \text{ a discrete random variable}. \end{cases}$$

The characteristic function has the following properties.
- $|\varphi(\omega)| \le \varphi(0) = 1$.
- $\varphi$ is uniformly continuous on the real line.
- $\varphi$ is positive semi-definite. In other words, for any real numbers $\omega_1, \ldots, \omega_n$ and complex numbers $z_1, \ldots, z_n$,
$$\sum_{j,k} \varphi(\omega_j - \omega_k)\, z_j \bar{z}_k \ge 0.$$
Moment generating function (mgf)
The mgf $M_X(t)$ of a random variable $X$ is defined as
$$M_X(t) = E\{e^{tX}\} = \int_{-\infty}^{\infty} e^{tx}\, dF_X(x).$$

If the characteristic function of the random variable $X$ is $\varphi_X$, $a, b \in \mathbb{R}$, and $Y = aX + b$, then the characteristic function of $Y$ is
$$\varphi_Y(\omega) = e^{j\omega b}\, \varphi_X(a\omega).$$
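The scaling rule can be checked by Monte Carlo. A sketch assuming $X \sim N(0, 1)$, whose cf $\varphi_X(u) = e^{-u^2/2}$ is known in closed form; $a$, $b$, and $\omega$ are arbitrary:

```python
import numpy as np

# Estimate phi_Y(w) for Y = aX + b from samples and compare with
# e^{jwb} * phi_X(aw), where phi_X(u) = exp(-u**2 / 2) for X ~ N(0, 1).
rng = np.random.default_rng(9)
x = rng.normal(0.0, 1.0, 500_000)
a, b, w = 2.0, 1.5, 0.7
lhs = np.exp(1j * w * (a * x + b)).mean()           # sample estimate of phi_Y(w)
rhs = np.exp(1j * w * b) * np.exp(-(a * w)**2 / 2)  # e^{jwb} phi_X(aw)
print(lhs, rhs)   # agree up to Monte Carlo error
```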
Moment theorem
Let the mgf and cf of a random variable $X$ be $M_X(t)$ and $\varphi_X(\omega)$, respectively. Then the $k$th moment of $X$ can be evaluated by
$$m_k = j^{-k} \left.\frac{\partial^k}{\partial \omega^k} \varphi_X(\omega)\right|_{\omega=0} = j^{-k}\, \varphi_X^{(k)}(0) = M_X^{(k)}(0).$$

Let $X \sim N(m, \sigma^2)$. Then, since $\varphi_X(\omega) = \exp\left\{-\frac{\omega^2 \sigma^2}{2} + jm\omega\right\}$,
$$E\{X\} = j^{-1} \varphi_X'(0) = m, \qquad E\{X^2\} = j^{-2} \varphi_X''(0) = m^2 + \sigma^2, \qquad \mathrm{Var}\{X\} = \sigma^2.$$
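A SymPy sketch of the moment theorem applied to the normal cf above, differentiating at $\omega = 0$ (an illustration, not from the notes):

```python
import sympy as sp

# m_k = j**(-k) * d^k/dw^k phi_X(w) at w = 0, for the N(m, sigma^2) cf.
w, m = sp.symbols("omega m", real=True)
s = sp.symbols("sigma", positive=True)
phi = sp.exp(-w**2 * s**2 / 2 + sp.I * m * w)
m1 = sp.diff(phi, w).subs(w, 0) / sp.I
m2 = sp.diff(phi, w, 2).subs(w, 0) / sp.I**2
print(sp.simplify(m1), sp.expand(m2))   # m and m**2 + sigma**2
```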
Cumulant*
Let us expand the natural logarithm $\psi(\omega) = \ln \varphi(\omega)$ of the characteristic function $\varphi(\omega)$ in a Taylor series at $\omega = 0$:
$$\psi(\omega) = \ln \varphi(\omega) = \ln\left\{1 + \sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!}\right\} = \left[\sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!}\right] - \frac{1}{2}\left[\sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!}\right]^2 + \frac{1}{3}\left[\sum_{s=1}^{\infty} \frac{(j\omega)^s m_s}{s!}\right]^3 - \cdots$$
$$= m_1 \frac{j\omega}{1!} + (m_2 - m_1^2) \frac{(j\omega)^2}{2!} + (m_3 - 3m_1 m_2 + 2m_1^3) \frac{(j\omega)^3}{3!} + \cdots = \sum_{n=1}^{\infty} k_n \frac{(j\omega)^n}{n!}.$$
The parameter $k_n$ in the last line is called a cumulant and is defined as
$$k_n = \left.\frac{\partial^n}{\partial (j\omega)^n} \psi(\omega)\right|_{\omega=0}.$$
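The first three cumulants can be recovered mechanically by series-expanding $\ln \varphi(\omega)$; a SymPy sketch treating $m_1, m_2, m_3$ as symbols:

```python
import sympy as sp

w = sp.symbols("omega")
m1, m2, m3 = sp.symbols("m1 m2 m3")
# Truncated cf with moments m1, m2, m3 as coefficients:
phi = 1 + m1*(sp.I*w) + m2*(sp.I*w)**2/2 + m3*(sp.I*w)**3/6
psi = sp.expand(sp.series(sp.log(phi), w, 0, 4).removeO())
for n in (1, 2, 3):
    kn = psi.coeff(w, n) * sp.factorial(n) / sp.I**n
    print(n, sp.expand(kn))   # m1, m2 - m1**2, m3 - 3*m1*m2 + 2*m1**3
```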
Coefficient of variation, skewness, kurtosis
For a random variable $X$ with mean $\mu$ and variance $\sigma^2$,
$$v_1 = \frac{\sigma}{\mu}, \qquad v_2 = \frac{\mu_3}{\sigma^3} = \frac{k_3}{(k_2)^{3/2}}, \qquad v_3 = \frac{\mu_4}{\sigma^4} = 3 + \frac{k_4}{k_2^2}$$
are called the coefficient of variation, skewness, and kurtosis, respectively.

(Figure: the symmetry of the pdf and the skewness $v_2$.)
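For a concrete case (an illustration, not from the notes), the exponential law with $\lambda = 1$ has $v_1 = 1$, $v_2 = 2$, and $v_3 = 9$; a sampling sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.exponential(1.0, 2_000_000)
mu, sig = x.mean(), x.std()
v1 = sig / mu                         # coefficient of variation -> 1
v2 = ((x - mu)**3).mean() / sig**3    # skewness -> 2
v3 = ((x - mu)**4).mean() / sig**4    # kurtosis -> 9 (= 3 + 6)
print(v1, v2, v3)
```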
Several inequalities*
Markov inequality: If $X$ is a random variable that takes only nonnegative values, then for any value $\alpha > 0$,
$$P\{X \ge \alpha\} \le \frac{E\{X\}}{\alpha}.$$

Chebyshev inequality: For a random variable $Y$ and any positive value $\epsilon$,
$$P\{|Y - E\{Y\}| \ge \epsilon\} \le \frac{\mathrm{Var}\{Y\}}{\epsilon^2}.$$
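A numerical sanity check of both bounds on an exponential sample (parameters arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.exponential(1.0, 1_000_000)   # nonnegative, E{X} = 1
alpha = 3.0
print((x >= alpha).mean(), x.mean() / alpha)                   # Markov: lhs <= rhs
eps = 2.0
print((np.abs(x - x.mean()) >= eps).mean(), x.var() / eps**2)  # Chebyshev: lhs <= rhs
```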
Bienaymé-Chebyshev inequality: Let the $r$th absolute moment of a random variable $X$ be finite, that is, $E\{|X|^r\} < \infty$, $r > 0$. Then, for any positive $\epsilon$, we have
$$P\{|X| \ge \epsilon\} \le \frac{E\{|X|^r\}}{\epsilon^r}.$$

Generalized Bienaymé-Chebyshev inequality: Let $g(x)$, $x \in (0, \infty)$, be a nondecreasing and nonnegative function. If $E\{g(|X|)\}/g(\epsilon)$ is defined, then for any positive $\epsilon$, we have
$$P\{|X| \ge \epsilon\} \le \frac{E\{g(|X|)\}}{g(\epsilon)}.$$

Jensen's inequality: If $f$ is a convex function,
$$E\{f(X)\} \ge f(E\{X\}).$$
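A short check of Jensen's inequality with the convex choice $f(x) = e^x$ and a standard normal $X$, for which $E\{e^X\} = e^{1/2} \approx 1.65 \ge e^0 = 1$:

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(0.0, 1.0, 1_000_000)
print(np.exp(x).mean(), np.exp(x.mean()))   # ~1.65 >= ~1, as Jensen requires
```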