Exercise: Show that the corrected sample variance
$$S^2 = \frac{\sum_{i=1}^{n}(X_i - \bar X)^2}{n-1}$$
is an unbiased estimator of the population variance $\sigma^2$.
Exercise (solution):
$$S^2 = \frac{\sum_{i=1}^{n}(X_i-\bar X)^2}{n-1}
      = \frac{\sum_{i=1}^{n}(X_i-\mu+\mu-\bar X)^2}{n-1}
      = \frac{\sum_{i=1}^{n}\bigl((X_i-\mu)-(\bar X-\mu)\bigr)^2}{n-1}$$
$$= \frac{\sum_{i=1}^{n}\bigl((X_i-\mu)^2+(\bar X-\mu)^2-2(X_i-\mu)(\bar X-\mu)\bigr)}{n-1}$$
$$= \frac{\sum_{i=1}^{n}(X_i-\mu)^2}{n-1} + \frac{n(\bar X-\mu)^2}{n-1} - \frac{\sum_{i=1}^{n}2(X_i-\mu)(\bar X-\mu)}{n-1}$$
Exercise (solution, continued): Taking expectations,
$$E\bigl((X_i-\mu)^2\bigr)=\sigma^2, \qquad E\bigl((\bar X-\mu)^2\bigr)=\frac{\sigma^2}{n},$$
$$\sum_i (X_i-\mu)(\bar X-\mu) = (\bar X-\mu)\sum_i (X_i-\mu) = n(\bar X-\mu)^2.$$
Hence
$$E(S^2) = \frac{n\sigma^2}{n-1} + \frac{\sigma^2}{n-1} - \frac{2n\sigma^2}{n(n-1)}
         = \frac{n\sigma^2}{n-1} - \frac{\sigma^2}{n-1} = \sigma^2.$$
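The algebra above can be checked numerically. The following sketch (illustrative values, not from the exercise: n = 10 draws from N(0, 4)) estimates $E(S^2)$ by Monte Carlo, comparing the $n-1$ divisor with the naive $n$ divisor:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, n, reps = 4.0, 10, 200_000          # illustrative values
x = rng.normal(0.0, 2.0, size=(reps, n))    # reps samples of size n

s2_corrected = x.var(axis=1, ddof=1)   # divides by n - 1 (corrected)
s2_naive = x.var(axis=1, ddof=0)       # divides by n (biased)

print(s2_corrected.mean())  # ~ sigma^2 = 4.0
print(s2_naive.mean())      # ~ (n-1)/n * sigma^2 = 3.6
```

The corrected version averages to $\sigma^2$, while the naive version systematically underestimates it by the factor $(n-1)/n$.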
Uniform Distribution Example: Let $X_1, \ldots, X_n$ be i.i.d. random variables with the continuous uniform distribution on $[0, \theta]$. The probability density function of each $X_i$ is
$$f(x \mid \theta) = \begin{cases} \frac{1}{\theta}, & 0 \le x \le \theta \\ 0, & \text{otherwise.} \end{cases}$$
Consider $T = \max_i X_i$ as an estimator of $\theta$. Is $T$ an unbiased estimator of $\theta$?
Uniform Distribution Example: Is $T = \max_i X_i$ an unbiased estimator of $\theta$? We need to compute $E(T)$, and thus the distribution of $T$. Let $F_T$ be the cumulative distribution function of $T$:
$$F_T(y) = P(T \le y) = P(\max_i X_i \le y) = P(X_1 \le y, \ldots, X_n \le y) = P(X \le y)^n = \left(\frac{y}{\theta}\right)^n$$
$$f_T(y) = \frac{\partial F_T(y)}{\partial y} = n\left(\frac{y}{\theta}\right)^{n-1}\frac{1}{\theta}$$
$$E(T) = \int_0^\theta y\,f_T(y)\,dy = \int_0^\theta n\left(\frac{y}{\theta}\right)^n dy = \frac{n}{n+1}\,\theta$$
$$\mathrm{Bias}(T) = E(T) - \theta = -\frac{1}{n+1}\,\theta$$
Uniform Distribution Example (continued): For the mean squared error we also need $E(T^2)$:
$$E(T^2) = \int_0^\theta y^2 f_T(y)\,dy = \int_0^\theta \frac{n\,y^{n+1}}{\theta^n}\,dy = \frac{n}{n+2}\,\theta^2$$
$$\mathrm{Var}(T) = E(T^2) - (E(T))^2 = \frac{n}{n+2}\,\theta^2 - \left(\frac{n}{n+1}\right)^2\theta^2 = \frac{n}{(n+2)(n+1)^2}\,\theta^2$$
$$\mathrm{MSE}(T) = \mathrm{Var}(T) + \mathrm{Bias}(T)^2 = \frac{n}{(n+2)(n+1)^2}\,\theta^2 + \left(\frac{1}{n+1}\right)^2\theta^2 = \frac{2}{(n+1)(n+2)}\,\theta^2$$
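A short simulation (with arbitrary illustrative values $\theta = 5$, $n = 8$) confirms both $E(T) = \frac{n}{n+1}\theta$ and the MSE formula:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 5.0, 8, 200_000     # illustrative values
# T = max of n uniforms on [0, theta], replicated many times
t = rng.uniform(0, theta, size=(reps, n)).max(axis=1)

print(t.mean())                    # ~ n/(n+1) * theta = 40/9 ≈ 4.44
print(((t - theta) ** 2).mean())   # ~ 2 theta^2 / ((n+1)(n+2)) = 50/90 ≈ 0.56
```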
Uniform Distribution: $\hat\theta_{MOM}$. Example:
$$E(X) = \int x\,f(x \mid \theta)\,dx = \int_0^\theta \frac{x}{\theta}\,dx = \left.\frac{x^2}{2\theta}\right|_0^\theta = \frac{\theta}{2}$$
Setting $\bar X = \frac{\theta}{2}$ gives
$$\hat\theta_{MOM} = 2\bar X$$
Uniform Distribution: $\hat\theta_{MOM}$. Example:
$$E(\hat\theta_{MOM}) = \theta, \qquad \mathrm{Var}(\hat\theta_{MOM}) = 4\,\mathrm{Var}(\bar X) = 4\,\frac{\theta^2}{12n} = \frac{\theta^2}{3n}$$
Comparing with the maximum-based estimator:
$$\mathrm{MSE}(\hat\theta_{MLE}) = \frac{2}{(n+1)(n+2)}\,\theta^2 = O(n^{-2}), \qquad \mathrm{Var}(\hat\theta_{MOM}) = \frac{\theta^2}{3n} = O(n^{-1}),$$
so for large $n$ the maximum-based estimator has a much smaller mean squared error.
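The two rates can be compared directly; this small sketch just evaluates the two closed-form expressions derived above for a few sample sizes ($\theta = 5$ is an arbitrary choice):

```python
# MSE of the max-based estimator vs variance of the MOM estimator 2*Xbar,
# using the closed forms derived above.
theta = 5.0
for n in (5, 50, 500):
    mse_mle = 2 * theta**2 / ((n + 1) * (n + 2))
    var_mom = theta**2 / (3 * n)
    print(n, round(mse_mle, 4), round(var_mom, 4))
```

The gap widens as $n$ grows, reflecting the $n^{-2}$ versus $n^{-1}$ rates.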
Exercise: Let $(X_1, \ldots, X_n)$ be independent, identically distributed random variables with p.d.f.
$$f(x) = \theta^2 x \exp(-\theta x), \qquad x > 0.$$
Is $T(X_1, \ldots, X_n) = 1/X_1$ an unbiased estimator of $\theta$?
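Note that this density is a Gamma distribution with shape 2 and rate $\theta$. One way to check the answer is a symbolic computation of $E(1/X_1)$ (using sympy; this check is not part of the original exercise):

```python
import sympy as sp

x, theta = sp.symbols("x theta", positive=True)
pdf = theta**2 * x * sp.exp(-theta * x)   # Gamma(shape=2, rate=theta) density

# E(1/X_1) = integral of pdf / x over (0, oo)
E_inv = sp.integrate(pdf / x, (x, 0, sp.oo))
print(E_inv)   # theta  ->  T = 1/X_1 is indeed unbiased for theta
```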
Exercise: Let $(X_1, \ldots, X_n)$ be independent, identically distributed random variables with $E(X) = \mu$ and $\mathrm{Var}(X) = \sigma^2$. Are the following estimators unbiased for $\sigma^2$?
$$T_1(X_1, \ldots, X_n) = \frac{(X_1 - X_2)^2}{2}$$
$$T_2(X_1, \ldots, X_n) = \frac{(X_1 + X_2)^2 - X_1 X_2}{2}$$
Exercise 4: Let $T_1$ and $T_2$ be two independent and unbiased estimators of the parameter $\theta$, with $\mathrm{Var}(T_1) = \sigma_1^2$ and $\mathrm{Var}(T_2) = \sigma_2^2$. Find the UMVUE for $\theta$ among all linear combinations of $T_1$ and $T_2$. What is its variance?
Solution Exercise 4: Consider $T = a_1 T_1 + a_2 T_2$. Then
$$E(T) = a_1 E(T_1) + a_2 E(T_2) = (a_1 + a_2)\theta.$$
To be unbiased: $a_1 + a_2 = 1$, i.e. $a_2 = 1 - a_1$. Writing $a = a_1$,
$$\mathrm{Var}(T) = a^2\,\mathrm{Var}(T_1) + (1-a)^2\,\mathrm{Var}(T_2) = a^2\sigma_1^2 + (1-a)^2\sigma_2^2.$$
Solution Exercise 4: Minimizing over $a$,
$$\frac{d\,\mathrm{Var}(T)}{da} = 2a\sigma_1^2 - 2(1-a)\sigma_2^2 = 0 \quad\Rightarrow\quad a = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}.$$
The UMVUE for $\theta$ among all linear combinations of $T_1$ and $T_2$ is therefore
$$T = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\,T_1 + \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\,T_2,$$
with variance $\mathrm{Var}(T) = \frac{\sigma_1^2 \sigma_2^2}{\sigma_1^2 + \sigma_2^2}$.
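A quick Monte Carlo sketch of the optimal weighting (illustrative values, not from the exercise: $T_1$ and $T_2$ simulated as independent normals centered at $\theta = 10$ with standard deviations 1 and 2):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, s1, s2, reps = 10.0, 1.0, 2.0, 200_000   # illustrative values
t1 = rng.normal(theta, s1, reps)   # unbiased, variance s1^2
t2 = rng.normal(theta, s2, reps)   # unbiased, variance s2^2

a = s2**2 / (s1**2 + s2**2)        # optimal weight on T1 derived above
t_opt = a * t1 + (1 - a) * t2
t_eq = 0.5 * t1 + 0.5 * t2         # naive equal weighting, for comparison

print(t_opt.var())   # ~ s1^2 s2^2 / (s1^2 + s2^2) = 0.8
print(t_eq.var())    # ~ (s1^2 + s2^2) / 4 = 1.25
```

The optimally weighted combination remains unbiased and beats the equal-weight average.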
Exercise: Let $(X_1, \ldots, X_n)$ be a random sample of i.i.d. random variables with expected value $\mu$ and variance $\sigma^2$. Consider the following estimator of $\mu$:
$$T_n(a) = a X_n + (1-a)\,\bar X_{n-1},$$
where $X_n$ is the $n$-th observed random variable and $\bar X_{n-1}$ is the sample mean based on the first $n-1$ observations.
1. Find the values of $a$ such that $T_n(a)$ is an unbiased estimator of $\mu$.
2. Find the value $a^*$ such that $T_n(a^*)$ is the most efficient estimator of $\mu$ within the class $T_n(a)$.
3. Define the concept of efficiency.
Pareto: Let $(X_1, \ldots, X_n)$ be a random sample of i.i.d. random variables distributed as a Pareto distribution with unknown parameter $\alpha$ and known $x_0$:
$$f(x; \alpha, x_0) = \alpha\,x_0^\alpha\,x^{-(\alpha+1)}, \qquad x \ge x_0.$$
The log-likelihood function is
$$l(\alpha, x_0) = n\log\alpha + n\alpha\log(x_0) - (\alpha+1)\sum_{i=1}^{n}\log x_i.$$
Pareto: Thus
$$\frac{\partial l(\alpha, x_0)}{\partial\alpha} = \frac{n}{\alpha} + n\log(x_0) - \sum_{i=1}^{n}\log x_i.$$
Solving $\frac{\partial l(\alpha, x_0)}{\partial\alpha} = 0$, the MLE of $\alpha$ is given by
$$\hat\alpha = \frac{n}{\sum_{i=1}^{n}\log x_i - n\log(x_0)}.$$
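The formula for $\hat\alpha$ can be sanity-checked by simulation (arbitrary illustrative values $\alpha = 3$, $x_0 = 2$; Pareto draws via the inverse-CDF method):

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, x0, n = 3.0, 2.0, 100_000   # illustrative values
# Inverse-CDF sampling: if U ~ Uniform(0, 1], then x0 * U**(-1/alpha) is Pareto
u = 1.0 - rng.uniform(size=n)      # shift to (0, 1] to avoid division by zero
x = x0 * u ** (-1.0 / alpha)

alpha_hat = n / (np.log(x).sum() - n * np.log(x0))
print(alpha_hat)   # ~ 3.0
```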
Pareto: sufficiency. Observe that the joint pdf of $X = (X_1, \ldots, X_n)$ is
$$f(x; \alpha, x_0) = \prod_{i=1}^{n} \frac{\alpha\,x_0^\alpha}{x_i^{\alpha+1}} = \alpha^n x_0^{n\alpha} \prod_{i=1}^{n} x_i^{-(\alpha+1)} = g(t, \alpha)\,h(x),$$
where $t = \prod_{i=1}^{n} x_i$, $g(t, \alpha) = \alpha^n x_0^{n\alpha}\,t^{-(\alpha+1)}$, and $h(x) = 1$. By the factorization theorem, $T(X) = \prod_{i=1}^{n} X_i$ is sufficient for $\alpha$.
Pareto: Fisher Information. Thus
$$\frac{\partial^2 l(\alpha, x_0)}{\partial\alpha^2} = -\frac{n}{\alpha^2} \quad\Rightarrow\quad I_n(\alpha) = \frac{n}{\alpha^2}.$$
EXERCISE: Let $(X_1, \ldots, X_n)$ be a random sample of i.i.d. random variables distributed as follows:
$$f(x; \theta) = \frac{\theta\,2^\theta}{x^{\theta+1}}, \qquad x > 2.$$
1. Show that $\sum_i \log(X_i)$ is a sufficient statistic for $\theta$.
2. Find the maximum likelihood estimator $\hat\theta_{MLE}$ of $\theta$ and discuss the properties of this estimator.
3. Find the method of moments estimator $\hat\theta_{MOM}$ of $\theta$.
EXERCISE: solution.
$$f(x_1, x_2, \ldots, x_n; \theta) = \prod_i \frac{\theta\,2^\theta}{x_i^{\theta+1}} = \theta^n 2^{n\theta} \left(\prod_i \frac{1}{x_i}\right)^{\theta+1}$$
A sufficient statistic is $\prod_i \frac{1}{X_i}$, or any transformation of it, for example $\sum_i \log(X_i)$.
EXERCISE: solution.
$$L(\theta \mid x_1, x_2, \ldots, x_n) = \theta^n 2^{n\theta} \left(\prod_i \frac{1}{x_i}\right)^{\theta+1}$$
$$l(\theta \mid x_1, x_2, \ldots, x_n) = n\log(\theta) + n\theta\log(2) - (\theta+1)\sum_i \log(x_i)$$
$$\frac{\partial l(\theta)}{\partial\theta} = \frac{n}{\theta} + n\log 2 - \sum_i \log(x_i), \qquad \frac{\partial^2 l(\theta)}{\partial\theta^2} = -\frac{n}{\theta^2}$$
EXERCISE: solution.
$$\frac{\partial l(\theta)}{\partial\theta} = 0 \quad\Rightarrow\quad \hat\theta_{MLE} = \frac{n}{\sum_i \log(x_i) - n\log(2)} = \frac{n}{\sum_i \log\left(\frac{x_i}{2}\right)}$$
EXERCISE: solution. For the method of moments (assuming $\theta > 1$ so that the mean exists):
$$E(X) = \int_2^\infty x\,\frac{\theta\,2^\theta}{x^{\theta+1}}\,dx = \theta\,2^\theta \int_2^\infty x^{-\theta}\,dx = \theta\,2^\theta \left.\frac{x^{-(\theta-1)}}{-(\theta-1)}\right|_2^\infty = \frac{2\theta}{\theta-1}$$
Setting $\bar x = \frac{2\hat\theta_{MOM}}{\hat\theta_{MOM} - 1}$ and solving for $\hat\theta_{MOM}$:
$$\hat\theta_{MOM} = \frac{\bar x}{\bar x - 2}$$
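Both estimators can be checked by simulation. In the sketch below (illustrative value $\theta = 4$, chosen with $\theta > 1$ so the mean exists), draws come from the inverse-CDF method for this density, which is a Pareto with scale 2:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n = 4.0, 100_000                # illustrative values
u = 1.0 - rng.uniform(size=n)          # Uniform on (0, 1]
x = 2.0 * u ** (-1.0 / theta)          # inverse-CDF draw from theta * 2^theta / x^(theta+1)

theta_mle = n / np.log(x / 2.0).sum()
xbar = x.mean()
theta_mom = xbar / (xbar - 2.0)
print(theta_mle, theta_mom)   # both ~ 4.0
```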
Poisson distribution Example: Let $X$ be distributed as a Poisson random variable,
$$f(x; \lambda) = \frac{\lambda^x \exp(-\lambda)}{x!}.$$
Compare the following estimators of $\exp(-\lambda)$:
$$T_1 = \exp(-\bar X), \qquad T_2 = \frac{\sum_{i=1}^{n} I(X_i = 0)}{n}$$
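A simulation sketch of the comparison (illustrative values $\lambda = 2$, $n = 30$, not from the slide): $T_2$ is exactly unbiased since $E[I(X_i = 0)] = P(X_i = 0) = e^{-\lambda}$, while $T_1$ is biased upward by Jensen's inequality but, for these values, has the smaller mean squared error:

```python
import numpy as np

rng = np.random.default_rng(5)
lam, n, reps = 2.0, 30, 50_000        # illustrative values
x = rng.poisson(lam, size=(reps, n))
target = np.exp(-lam)                 # the quantity being estimated, ≈ 0.135

t1 = np.exp(-x.mean(axis=1))          # T1 = exp(-Xbar)
t2 = (x == 0).mean(axis=1)            # T2 = fraction of zero counts

print(t1.mean() - target)   # small positive bias (Jensen's inequality)
print(t2.mean() - target)   # ~ 0: T2 is unbiased
print(((t1 - target) ** 2).mean(), ((t2 - target) ** 2).mean())
```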
Maximum Likelihood Estimator for the Geometric distribution: Consider $n$ observations from a Geometric distribution,
$$p(x \mid \pi) = \pi(1-\pi)^x.$$
Find the maximum likelihood estimator of $E(X) = \frac{1-\pi}{\pi}$. The first and second derivatives of the log-likelihood are
$$\frac{d\log L(\pi \mid x)}{d\pi} = \frac{n}{\pi} - \frac{\sum_i x_i}{1-\pi}, \qquad \frac{d^2\log L(\pi \mid x)}{d\pi^2} = -\frac{n}{\pi^2} - \frac{\sum_i x_i}{(1-\pi)^2}$$
Maximum Likelihood Estimator for the Geometric distribution:
$$\frac{n}{\pi} - \frac{\sum_i x_i}{1-\pi} = 0 \quad\Rightarrow\quad \hat\pi = \frac{n}{n + \sum_i x_i}$$
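A numerical check (illustrative value $\pi = 0.3$). Note that numpy's geometric generator counts trials with support $\{1, 2, \ldots\}$, so 1 is subtracted to match the failure-count parameterization $p(x \mid \pi) = \pi(1-\pi)^x$ with support $\{0, 1, 2, \ldots\}$ used here. By invariance of the MLE, the estimator of $E(X)$ is $(1-\hat\pi)/\hat\pi$, which equals the sample mean:

```python
import numpy as np

rng = np.random.default_rng(6)
pi_true, n = 0.3, 100_000            # illustrative values
# numpy counts trials (support 1, 2, ...); subtract 1 for the failure-count
# version p(x) = pi * (1 - pi)^x with support 0, 1, 2, ...
x = rng.geometric(pi_true, size=n) - 1

pi_hat = n / (n + x.sum())
print(pi_hat)                             # ~ 0.3
print((1 - pi_hat) / pi_hat, x.mean())    # MLE of E(X) equals the sample mean
```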
Example: Exam January 2016. Let $(X_1, \ldots, X_n)$ be a random sample of i.i.d. random variables distributed as a Pareto distribution with parameters $\alpha$ and $x_m$ both unknown,
$$f(x; \alpha, x_m) = \alpha\,x_m^\alpha\,x^{-(\alpha+1)}, \qquad x \ge x_m.$$
Calculate the Fisher information matrix for the parameter vector $\theta = (x_m, \alpha)$. How do you interpret the off-diagonal terms?
Example: Exam January 2016. The log-likelihood function is
$$l(\alpha, x_m) = n\log\alpha + n\alpha\log(x_m) - (\alpha+1)\sum_{i=1}^{n}\log x_i$$
$$\frac{\partial l(\alpha, x_m)}{\partial\alpha} = \frac{n}{\alpha} + n\log(x_m) - \sum_{i=1}^{n}\log x_i, \qquad \frac{\partial l(\alpha, x_m)}{\partial x_m} = \frac{n\alpha}{x_m}$$
$$\frac{\partial^2 l(\alpha, x_m)}{\partial\alpha^2} = -\frac{n}{\alpha^2}, \qquad \frac{\partial^2 l(\alpha, x_m)}{\partial x_m^2} = -\frac{n\alpha}{x_m^2}, \qquad \frac{\partial^2 l(\alpha, x_m)}{\partial\alpha\,\partial x_m} = \frac{n}{x_m}$$
Exercise: We consider two continuous independent random variables $U$ and $W$, each normally distributed as $N(0, \sigma^2)$. The variable $X$ defined by $X = \sqrt{U^2 + W^2}$ has a Rayleigh distribution with parameter $\sigma^2$:
$$f(x; \theta) = \frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right), \qquad x \ge 0.$$
Let $(X_1, \ldots, X_n)$ be a random sample of i.i.d. random variables distributed as $X$.
1. Apply the method of moments to find the estimator $\hat\sigma_{MOM}$ of the parameter $\sigma$.
2. Find the maximum likelihood estimator $\hat\sigma^2_{MLE}$ of $\sigma^2$ and discuss the properties of this estimator.
3. Compute the score function and the Fisher information.
4. Specify the asymptotic distribution of $\hat\theta_{MLE}$.
Exercise:
$$E(X) = \int_0^\infty x\,\frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = \frac{1}{\sigma}\int_0^\infty \frac{x^2}{\sigma}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = \frac{\sqrt{2\pi}}{\sigma}\int_0^\infty \frac{x^2}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx$$
For $Y \sim N(0, \sigma^2)$,
$$\int_{-\infty}^{\infty} \frac{y^2}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{y^2}{2\sigma^2}\right)dy = E(Y^2) = \mathrm{Var}(Y) + E(Y)^2 = \sigma^2,$$
so by symmetry the integral over $[0, \infty)$ equals $\sigma^2/2$.
Exercise:
$$E(X) = \frac{\sqrt{2\pi}}{\sigma}\cdot\frac{\sigma^2}{2} = \sigma\,\frac{\sqrt{2\pi}}{2} = \sigma\sqrt{\frac{\pi}{2}}$$
$$\hat\sigma_{MOM} = \bar x\,\sqrt{\frac{2}{\pi}}$$
Exercise: Find the maximum likelihood estimator $\hat\sigma^2_{MLE}$ of $\sigma^2$ and discuss the properties of this estimator.
$$L(\sigma^2 \mid x) = \frac{\prod_i x_i}{\sigma^{2n}}\exp\left(-\frac{\sum_i x_i^2}{2\sigma^2}\right)$$
$$\log L(\sigma^2 \mid x) = \sum_i \log(x_i) - n\log\sigma^2 - \frac{\sum_i x_i^2}{2\sigma^2}$$
$$\frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = -\frac{n}{\sigma^2} + \frac{\sum_i x_i^2}{2\sigma^4}$$
Exercise:
$$\frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = 0 \quad\text{if}\quad \sigma^2 = \frac{\sum_i x_i^2}{2n}$$
$$\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2} = \frac{n}{\sigma^4} - \frac{\sum_i x_i^2}{\sigma^6}$$
Evaluated at $\hat\sigma^2$ (where $\sum_i x_i^2 = 2n\hat\sigma^2$):
$$\left.\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2}\right|_{\hat\sigma^2} = \frac{n}{\hat\sigma^4} - \frac{2n}{\hat\sigma^4} = -\frac{n}{\hat\sigma^4} < 0,$$
so this is a maximum and
$$\hat\sigma^2_{MLE} = \frac{\sum_i x_i^2}{2n}.$$
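Both the MOM and MLE formulas can be checked by simulation (illustrative value $\sigma = 2$), generating Rayleigh draws from their definition as the norm of two independent $N(0, \sigma^2)$ variables:

```python
import numpy as np

rng = np.random.default_rng(7)
sigma, n = 2.0, 100_000   # illustrative values
# A Rayleigh(sigma^2) draw is the norm of two independent N(0, sigma^2) variables
x = np.hypot(rng.normal(0, sigma, n), rng.normal(0, sigma, n))

sigma2_mle = (x ** 2).sum() / (2 * n)      # MLE of sigma^2 derived above
sigma_mom = x.mean() * np.sqrt(2 / np.pi)  # MOM estimator of sigma
print(sigma2_mle)   # ~ sigma^2 = 4
print(sigma_mom)    # ~ sigma = 2
```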
Exercise: Compute the score function and the Fisher information.
$$\mathrm{Score}(\sigma^2) = \frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = -\frac{n}{\sigma^2} + \frac{\sum_i x_i^2}{2\sigma^4}$$
$$\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2} = \frac{n}{\sigma^4} - \frac{\sum_i x_i^2}{\sigma^6}$$
For the Fisher information we need
$$E(X^2) = \int_0^\infty x^2 f(x; \theta)\,dx = \int_0^\infty \frac{x^3}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx.$$
Exercise: Integration by parts, $\int f g' = [f g] - \int f' g$, with $f = x^2$ and $g' = \frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right)$, so that $g = -\exp\left(-\frac{x^2}{2\sigma^2}\right)$:
$$\int_0^\infty x^2\,\frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = \left[-x^2\exp\left(-\frac{x^2}{2\sigma^2}\right)\right]_0^\infty + \int_0^\infty 2x\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = 2\sigma^2$$
Hence $E(X^2) = 2\sigma^2$ and
$$I(\sigma^2) = -E\left(\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2}\right) = -E\left(\frac{n}{\sigma^4} - \frac{\sum_i X_i^2}{\sigma^6}\right) = -\frac{n}{\sigma^4} + \frac{2n\sigma^2}{\sigma^6} = \frac{n}{\sigma^4}.$$
Exercise: Specify the asymptotic distribution of $\hat\theta_{MLE}$. Since $I(\sigma^2) = n/\sigma^4$,
$$\hat\sigma^2_{MLE} \;\overset{a}{\sim}\; N\left(\sigma^2,\ \frac{\sigma^4}{n}\right).$$
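The asymptotic normality can be visualized by standardizing $\hat\sigma^2_{MLE}$ across many replications with the asymptotic standard deviation $\sqrt{\sigma^4/n} = I_n(\sigma^2)^{-1/2}$ (illustrative values $\sigma^2 = 4$, $n = 200$):

```python
import numpy as np

rng = np.random.default_rng(8)
sigma2, n, reps = 4.0, 200, 20_000   # illustrative values
# reps independent Rayleigh samples of size n (norm of two N(0, sigma^2) draws)
x = np.hypot(rng.normal(0, 2.0, (reps, n)), rng.normal(0, 2.0, (reps, n)))
sigma2_hat = (x ** 2).sum(axis=1) / (2 * n)   # MLE in each replication

# Standardize with the asymptotic standard deviation sqrt(sigma^4 / n)
z = (sigma2_hat - sigma2) / np.sqrt(sigma2 ** 2 / n)
print(z.mean(), z.std())   # ~ 0 and ~ 1, consistent with N(0, 1)
```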