Practice Exercises for Midterm Exam
ST 522 - Statistical Theory - II
The ACTUAL exam will consist of fewer problems.

1. Suppose X_i ~iid F(·) for i = 1, ..., n, where F(·) is a strictly increasing continuous cumulative distribution function. Let X_(1) ≤ X_(2) ≤ ... ≤ X_(n) denote the order statistics.
(a) Show that U_j = F(X_j) ~iid U(0, 1) for j = 1, ..., n.
(b) Show that B_j = F(X_(j)) ~ Beta(j, n − j + 1) for j = 1, ..., n.
(c) Show that E[X_(j)] = E[F⁻¹(B_j)], where B_j is as defined in part (b) above and F⁻¹(u) = inf{x ∈ R : F(x) ≥ u} for u ∈ [0, 1].

2. Suppose X_i ~iid f(x | µ, σ) = (1/σ) exp{−(x − µ)/σ} I_(µ,∞)(x) for i = 1, ..., n, where µ ∈ R and σ > 0 are both unknown. Let θ = (µ, σ) denote the parameter of this shifted exponential family.
(a) Obtain a minimal sufficient statistic for θ. Is your minimal sufficient statistic complete for this family of distributions?
(b) Show that (X_1 − µ)/σ ~ Exp(1); hence (or otherwise) show that (X_(1) − µ)/σ ~ Exp(1/n).
(c) Show that (X̄ − X_(1))/(X_(n) − X_(1)) is an ancillary statistic.
(d) Obtain the MME¹ of θ. Is the MME of θ unbiased?
(e) Obtain the MLE² of θ. Is the MLE of θ unbiased?
(f) Obtain the UMVUE³ of θ.
(g) Compute the MSE⁴ of the MLE and UMVUE of σ. Which estimator is better in terms of having smaller MSE?
(h) Obtain a class of conjugate prior distributions for θ.
(i) Obtain the Bayes estimators E[µ | X_1, ..., X_n] and E[σ | X_1, ..., X_n] of µ and σ, respectively, under the conjugate prior you derived in the previous part.

¹ MME = Method of Moments Estimator
² MLE = Maximum Likelihood Estimator
³ UMVUE = Uniformly Minimum Variance Unbiased Estimator
⁴ MSE = Mean Squared Error

ST 522: Practice Problems for Midterm Exam © Sujit Ghosh, NCSU Statistics
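A quick Monte Carlo check can build intuition for parts 1(a)-(b) before attempting a proof. The sketch below (a minimal illustration, not part of the exam; the sample size n, replication count m, order index j, and the choice of the Exp(1) distribution are all arbitrary) applies the probability integral transform to the j-th order statistic and compares the empirical mean with the Beta(j, n − j + 1) mean j/(n + 1):

```python
import math
import random

random.seed(42)
n, m, j = 10, 20000, 3  # illustrative choices: sample size, replications, order index

def F(x):
    # CDF of the Exp(1) distribution used to generate the data
    return 1.0 - math.exp(-x)

# For each replication, take the j-th order statistic of n Exp(1) draws
# and apply the probability integral transform B_j = F(X_(j)).
b_vals = []
for _ in range(m):
    xs = sorted(random.expovariate(1.0) for _ in range(n))
    b_vals.append(F(xs[j - 1]))

mean_b = sum(b_vals) / m
# Beta(j, n - j + 1) has mean j/(n + 1); the empirical mean should be close to it
print(round(mean_b, 3), round(j / (n + 1), 3))
```

The empirical mean settling near j/(n + 1) is consistent with B_j ~ Beta(j, n − j + 1), though of course simulation only suggests, and does not replace, the distributional argument asked for in part (b).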
3. Provide examples of the following cases:
(a) A minimal sufficient statistic that is of the same dimension as that of the parameter.
(b) A minimal sufficient statistic that is of larger dimension than that of the parameter.
(c) A sufficient statistic that is of smaller dimension than that of the parameter.

4. Suppose X_i ~iid f(x | µ, σ) = (1/(2σ)) φ((x − µ)/σ) + (1/(4σ)) exp{−|x − µ|/σ} for i = 1, 2, ..., n, where µ ∈ R and σ > 0 are both unknown and φ(·) denotes the density of a standard normal distribution. State if the following statistics are ancillary (and provide justifications for your answers):
(a) T_1 = (X̄ − X_(1))/(X_(n) − X_(1))
(b) T_2 = (X̄ − X_(1))/(X_(n) + X_(1))
(c) T_3 = (X̄ − X_(1))/S_1, where S_1 = (1/n) Σ_{i=1}^n |X_i − X̄|
(d) T_4 = (2X̄ − X_(1) − X_(n))/S_1

5. Suppose X_i ~iid U(1/θ, θ) for i = 1, ..., n, where θ > 1 is unknown.
(a) Show that (X_(1), X_(n)) is sufficient for θ.
(b) Is the statistic in part (a) above minimal sufficient? If yes, prove it; otherwise exhibit a minimal sufficient statistic.
(c) Is the minimal sufficient statistic that you found in part (b) above complete? Provide justifications.
(d) Obtain the MLE and UMVUE of θ.

6. Suppose X_i ~iid U(0, θ) for i = 1, ..., n, where θ ≥ 1 is unknown.
(a) Obtain the MLE of θ. Is the MLE unbiased?
(b) Show that X_(n) is not complete for this family of uniform distributions.
(c) Obtain a minimal sufficient statistic for θ. Is your minimal sufficient statistic complete?
(d) Obtain the UMVUE of θ (a bit tricky!).
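For problem 6(a), the bias of the MLE can be previewed numerically. The sketch below (an illustration only; n, the replication count m, the true θ, and the seed are arbitrary choices) uses the standard fact that the MLE of θ under U(0, θ) is X_(n), and compares its Monte Carlo mean with E[X_(n)] = nθ/(n + 1):

```python
import random

random.seed(1)
n, m, theta = 10, 20000, 2.0  # illustrative choices

# The MLE of theta for U(0, theta) data is the sample maximum X_(n).
maxima = []
for _ in range(m):
    xs = [random.uniform(0, theta) for _ in range(n)]
    maxima.append(max(xs))

mle_mean = sum(maxima) / m
# E[X_(n)] = n*theta/(n+1) < theta, so the MLE is biased downward
print(round(mle_mean, 3), round(n * theta / (n + 1), 3))
```

Seeing the average of X_(n) sit below θ motivates the bias-correction step behind the UMVUE asked for in part 6(d).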
7. Suppose X_i ~iid f(x | θ) for i = 1, ..., n, where θ ∈ Θ ⊆ R^d for some integer d ≥ 1. Let T = T(X_1, ..., X_n) be a minimal sufficient statistic.
(a) Show that the MLE of θ (if it exists uniquely) is a function of T only.
(b) Consider a prior distribution θ ~ π(θ). Show that the posterior distribution of θ given (X_1, ..., X_n) is the same as that of θ given T. Conclude that any Bayes estimator is a function of T only.
(c) Suppose d = 1. Show that the UMVUE of θ (if it exists) is a function of T only.
(d) Give an example to show that the MME need not be a function of T only.

8. Suppose X_i ~iid f(x | θ) for i = 1, ..., n, where θ ∈ Θ ⊆ R^d for some integer d ≥ 1. Let T_1 = T_1(X_1, ..., X_n) and T_2 = T_2(X_1, ..., X_n) be real-valued statistics used to estimate η = τ(θ) (a real-valued function of θ). Assume that for each θ ∈ Θ, |T_1 − η| is stochastically smaller⁵ than |T_2 − η|.
(a) Show that MSE_θ(T_1) = E_θ[(T_1 − η)²] ≤ E_θ[(T_2 − η)²] = MSE_θ(T_2) for all θ ∈ Θ.
(b) More generally, given any non-negative valued increasing continuous function G(·), show that E_θ[G(|T_1 − η|)] ≤ E_θ[G(|T_2 − η|)] for all θ ∈ Θ.

9. Suppose X_i ~iid f(x | θ) for i = 1, ..., n, where θ ∈ Θ ⊆ R^d for some integer d ≥ 1. Let T = T(X_1, ..., X_n) be an UMVUE of η = τ(θ), a real-valued function of θ.
(a) Suppose U = U(X_1, ..., X_n) is another unbiased estimator of η. Show that Cor_θ[T, U] > 0.
(b) Suppose T_1 and T_2 are both UMVUEs of η. Show that Cor_θ[T_1, T_2] = 1.
(c) Show that T² is the UMVUE of E_θ[T²], provided E_θ[T⁴] < ∞. More generally, show that T^k is the UMVUE of E_θ[T^k], provided E_θ[T^{2k}] < ∞, for k = 2, 3, ....
(d) Suppose T = g(S), where S is a complete sufficient statistic for θ, and let T_2 be another unbiased estimator of η. Show that E[T_2 | S] = g(S).

⁵ A real-valued random variable U is said to be stochastically smaller than another real-valued random variable V if Pr[U ≤ ε] ≥ Pr[V ≤ ε] for all ε ∈ R.
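Problem 8 is about ranking estimators of the same η by MSE. The sketch below illustrates that kind of comparison on a familiar example outside the problem's abstract setup (my own choice, not from the exam): under N(µ, 1) data, the sample mean typically has smaller MSE for µ than the sample median. The sample size n, replication count m, and seed are arbitrary:

```python
import random
import statistics

random.seed(7)
n, m, mu = 25, 5000, 0.0  # illustrative choices

# Accumulate squared errors of two competing estimators of mu.
se_mean, se_med = 0.0, 0.0
for _ in range(m):
    xs = [random.gauss(mu, 1.0) for _ in range(n)]
    se_mean += (statistics.fmean(xs) - mu) ** 2
    se_med += (statistics.median(xs) - mu) ** 2

mse_mean, mse_med = se_mean / m, se_med / m
# Under normality the sample mean is the more efficient estimator of mu
print(mse_mean < mse_med)
```

This numerical ordering is what parts 8(a)-(b) establish in general: stochastic ordering of |T − η| transfers to an ordering of E_θ[G(|T − η|)] for any increasing G, MSE being the case G(t) = t².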
10. Two statistics T_1 and T_2 are said to be equivalent if we can write T_2 = H(T_1) for some 1-1 transformation H(·) of the range of T_1 onto the range of T_2. Which of the following pairs of statistics are equivalent? (Prove or disprove.)
(a) ∏_{i=1}^n X_i and Σ_{i=1}^n log X_i
(b) Σ_{i=1}^n X_i and Σ_{i=1}^n log X_i
(c) (Σ_{i=1}^n X_i, Σ_{i=1}^n X_i²) and (X̄, S²), where X̄ is the sample mean and S² is the sample variance.
(d) (Σ_{i=1}^n X_i, Σ_{i=1}^n X_i³) and (X̄, Σ_{i=1}^n (X_i − X̄)³)

11. Suppose X_i ~iid N(µ_1, σ_1²) and Y_j ~iid N(µ_2, σ_2²) for i = 1, ..., n and j = 1, ..., m. Find minimal sufficient statistics and compute the MLE for the following cases:
(a) µ_1, µ_2 ∈ R and σ_1, σ_2 ∈ (0, ∞) are arbitrary.
(b) µ_1 = µ_2 ∈ R and σ_1, σ_2 ∈ (0, ∞) are arbitrary.
(c) σ_1 = σ_2 ∈ (0, ∞) and µ_1, µ_2 ∈ R are arbitrary.

12. Suppose X_i ~iid f(x | θ) for i = 1, ..., n, where θ ∈ Θ ⊆ R. In the following cases show that there are no unbiased estimators of η = τ(θ).
(a) f(x | θ) = θ^x (1 − θ)^{1−x} for x = 0, 1 and θ ∈ (0, 1), and η = θ/(1 − θ), the odds.
(b) f(x | θ) = θ^x e^{−θ}/x! for x ∈ {0, 1, ...} and θ > 0, and η = √θ, the standard deviation of X.

13. Suppose X_i ~iid f(x | θ) for i = 1, ..., n, where θ ∈ Θ ⊆ R^d for some integer d ≥ 1 and f(x | θ) is integrable as a function of θ. Assume that the support S = {x : f(x | θ) > 0} does not involve θ. Show that the following class of prior densities is conjugate:
π(θ) = ∏_{j=1}^N f(ξ_j | θ) / ∫_Θ ∏_{j=1}^N f(ξ_j | θ) dθ,
where ξ_j ∈ S and N ∈ {1, 2, ...}.

14. Suppose X_i ~iid U(θ − 1, θ + 1) for i = 1, ..., n, where θ ∈ R.
(a) Obtain the MME of θ.
(b) Obtain the MLE of θ. Is it unique?
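For problem 14, the non-uniqueness in part (b) can be seen directly: the likelihood is constant (and maximal) for every θ with X_(n) − 1 ≤ θ ≤ X_(1) + 1. The sketch below (an illustration only; n, the true θ, and the seed are arbitrary choices) computes that interval alongside the MME X̄:

```python
import random

random.seed(3)
n, theta = 50, 4.0  # illustrative choices
xs = [random.uniform(theta - 1, theta + 1) for _ in range(n)]

mme = sum(xs) / n                   # MME: E[X] = theta, so theta_hat = X-bar
lo, hi = max(xs) - 1, min(xs) + 1   # every theta in [X_(n) - 1, X_(1) + 1] maximizes the likelihood
print(round(lo, 3), round(hi, 3), round(mme, 3))
```

Since X_(1) ≥ θ − 1 and X_(n) ≤ θ + 1 almost surely, the interval [lo, hi] is nonempty and always contains the true θ; any point of it (e.g. the midpoint (X_(1) + X_(n))/2) is an MLE, so the MLE is not unique.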
15. The Kullback–Leibler divergence (KLD) between two densities f and g is defined as
KLD(f, g) = ∫ f(x) log( f(x)/g(x) ) dx.
(a) Show that KLD(f, g) ≥ 0 and that equality holds if and only if f ≡ g (i.e., f(x) = g(x) for all x ∈ {x : f(x) > 0}).
(b) Suppose f(x) = (1/σ_0) φ((x − µ_0)/σ_0) and g(x) = (1/σ) φ((x − µ)/σ), where φ(·) denotes the density of the standard normal distribution. Compute KLD(f, g).
(c) In part (b) above suppose σ = σ_0. Show that KLD(f, g) = 0 if and only if µ = µ_0.
(d) In part (b) above suppose KLD(f, g) = 0. Can you conclude µ = µ_0 and σ = σ_0?
(e) Suppose X_i ~iid f_0(x) for i = 1, ..., n, where f_0(·) is an unknown density. Suppose we use a statistical model that assumes X_i ~iid f(x | θ).
  i. Show that if X_i ~iid f_0(x), then it follows by the SLLN that
     KLD_n(θ) = (1/n) Σ_{i=1}^n log( f_0(X_i)/f(X_i | θ) ) → KLD(f_0, f(· | θ)) almost surely as n → ∞.
  ii. Show that the MLE of θ under the assumed statistical model minimizes KLD_n(θ).

16. Review all the problems solved during Lab hours (ST 522L) and Home assignments.

Hint for Exercise #2(a): You may assume (or prove) that, for the given family of distributions, 2n(X̄ − X_(1))/σ and 2n(X_(1) − µ)/σ are independently distributed as χ²_{2(n−1)} and χ²_2, respectively.
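The answer to 15(b) is the standard normal-vs-normal formula KLD(f, g) = log(σ/σ_0) + (σ_0² + (µ_0 − µ)²)/(2σ²) − 1/2. The sketch below (an illustration only; the parameter values, Monte Carlo size, and seed are arbitrary choices) checks this closed form against the Monte Carlo estimate of E_f[log(f(X)/g(X))], which is also the almost-sure limit appearing in 15(e)i:

```python
import math
import random

def kld_normal(mu0, s0, mu, s):
    # Closed form for KLD(N(mu0, s0^2), N(mu, s^2))
    return math.log(s / s0) + (s0**2 + (mu0 - mu)**2) / (2 * s**2) - 0.5

random.seed(5)
mu0, s0, mu, s = 0.0, 1.0, 1.0, 2.0  # illustrative choices
m = 200000

def logf(x):  # log density of N(mu0, s0^2)
    return -0.5 * math.log(2 * math.pi * s0**2) - (x - mu0)**2 / (2 * s0**2)

def logg(x):  # log density of N(mu, s^2)
    return -0.5 * math.log(2 * math.pi * s**2) - (x - mu)**2 / (2 * s**2)

# Monte Carlo estimate of E_f[log f(X) - log g(X)] with X ~ f
est = sum(logf(x) - logg(x) for x in (random.gauss(mu0, s0) for _ in range(m))) / m
exact = kld_normal(mu0, s0, mu, s)
print(round(est, 3), round(exact, 3))
```

Note that kld_normal returns 0 exactly when (µ, σ) = (µ_0, σ_0), in line with parts 15(a) and 15(c)-(d).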