Exercise Estimation


Exercise. Show that the corrected sample variance is an unbiased estimator of the population variance:

$$S^2 = \frac{\sum_{i=1}^{n} (X_i - \bar X)^2}{n-1}$$

Exercise

$$S^2 = \frac{\sum_{i=1}^{n} (X_i - \bar X)^2}{n-1} = \frac{\sum_{i=1}^{n} (X_i - \mu + \mu - \bar X)^2}{n-1} = \frac{\sum_{i=1}^{n} \big((X_i-\mu) - (\bar X-\mu)\big)^2}{n-1}$$
$$= \frac{\sum_{i=1}^{n} \big((X_i-\mu)^2 + (\bar X-\mu)^2 - 2(X_i-\mu)(\bar X-\mu)\big)}{n-1}$$
$$= \frac{\sum_{i=1}^{n} (X_i-\mu)^2}{n-1} + \frac{n(\bar X-\mu)^2}{n-1} - \frac{\sum_{i=1}^{n} 2(X_i-\mu)(\bar X-\mu)}{n-1}$$

Exercise

$$E\big((X_i-\mu)^2\big) = \sigma^2, \qquad E\big((\bar X-\mu)^2\big) = \frac{\sigma^2}{n}$$
$$\sum_i (X_i-\mu)(\bar X-\mu) = (\bar X-\mu)\sum_i (X_i-\mu) = n(\bar X-\mu)^2$$
$$E(S^2) = \frac{n\sigma^2}{n-1} + \frac{\sigma^2}{n-1} - \frac{2n\sigma^2}{n(n-1)} = \frac{n\sigma^2}{n-1} - \frac{\sigma^2}{n-1} = \sigma^2$$
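As a quick sanity check of the result above (not part of the original slides), a small Monte Carlo simulation can confirm that dividing by $n-1$ gives an unbiased estimate; the sample size, variance, and trial count below are arbitrary choices:

```python
# Monte Carlo check: the corrected sample variance averages to sigma^2.
import random
import statistics

random.seed(0)
n, sigma2, trials = 5, 4.0, 50_000

def corrected_sample_variance(xs):
    # S^2 = sum (x_i - xbar)^2 / (n - 1)
    xbar = sum(xs) / len(xs)
    return sum((x - xbar) ** 2 for x in xs) / (len(xs) - 1)

est = [corrected_sample_variance([random.gauss(0.0, sigma2 ** 0.5) for _ in range(n)])
       for _ in range(trials)]
print(statistics.mean(est))  # close to sigma2 = 4.0
```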

Uniform Distribution Example: Let $X_1, \dots, X_n$ be i.i.d. random variables with the continuous uniform distribution on $[0, \theta]$. The probability density function of each $X_i$ is:

$$f(x \mid \theta) = \begin{cases} \theta^{-1}, & 0 \le x \le \theta \\ 0, & \text{otherwise} \end{cases}$$

Consider $T = \max_i X_i$ as an estimator of $\theta$. Is $T$ an unbiased estimator of $\theta$?

Uniform Distribution Example: Consider $T = \max_i X_i$ as an estimator of $\theta$; is $T$ an unbiased estimator of $\theta$? We need to compute $E(T)$, so we need the distribution of $T$. Let $F_T$ be the cumulative distribution function of $T$:

$$F_T(y) = P(T \le y) = P(\max_i X_i \le y) = P(X_1 \le y, \dots, X_n \le y) = P(X \le y)^n = \left(\frac{y}{\theta}\right)^n$$
$$f_T(y) = \frac{\partial F_T(y)}{\partial y} = n\left(\frac{y}{\theta}\right)^{n-1}\frac{1}{\theta}$$
$$E(T) = \int_0^\theta y\, f_T(y)\,dy = \int_0^\theta n\left(\frac{y}{\theta}\right)^n dy = \frac{n}{n+1}\,\theta$$
$$\mathrm{Bias}(T) = E(T) - \theta = -\frac{1}{n+1}\,\theta$$

Uniform Distribution Example: Using $f_T(y) = n(y/\theta)^{n-1}\theta^{-1}$ from the previous slide:

$$E(T^2) = \int_0^\theta y^2 f_T(y)\,dy = \int_0^\theta \frac{n\,y^{n+1}}{\theta^n}\,dy = \frac{n}{n+2}\,\theta^2$$
$$\mathrm{Var}(T) = E(T^2) - \big(E(T)\big)^2 = \frac{n}{n+2}\,\theta^2 - \left(\frac{n}{n+1}\right)^2\theta^2 = \frac{n}{(n+2)(n+1)^2}\,\theta^2$$
$$\mathrm{MSE}(T) = \mathrm{Var}(T) + \mathrm{Bias}(T)^2 = \frac{n}{(n+2)(n+1)^2}\,\theta^2 + \left(\frac{1}{n+1}\right)^2\theta^2 = \frac{2}{(n+2)(n+1)}\,\theta^2$$
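The bias and MSE formulas above can be checked by simulation; $\theta = 3$ and $n = 10$ below are arbitrary assumptions:

```python
# Simulate T = max of n Uniform(0, theta) draws; compare mean and MSE
# against the closed forms n*theta/(n+1) and 2*theta^2/((n+1)(n+2)).
import random

random.seed(1)
theta, n, trials = 3.0, 10, 100_000
ts = [max(random.uniform(0.0, theta) for _ in range(n)) for _ in range(trials)]
mean_T = sum(ts) / trials
mse_T = sum((t - theta) ** 2 for t in ts) / trials
print(mean_T, n * theta / (n + 1))                 # both ~2.727
print(mse_T, 2 * theta**2 / ((n + 1) * (n + 2)))   # both ~0.136
```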

Uniform Distribution: $\hat\theta_{MOM}$ Example:

$$E(X) = \int x f(x \mid \theta)\,dx = \int_0^\theta \frac{x}{\theta}\,dx = \frac{x^2}{2\theta}\bigg|_0^\theta = \frac{\theta}{2}$$
$$\bar X = \frac{\hat\theta}{2} \implies \hat\theta_{MOM} = 2\bar X$$

Uniform Distribution: $\hat\theta_{MOM}$ Example:

$$E\big(\hat\theta_{MOM}\big) = \theta, \qquad \mathrm{Var}\big(\hat\theta_{MOM}\big) = 4\,\mathrm{Var}(\bar X) = \frac{4}{n}\cdot\frac{\theta^2}{12} = \frac{\theta^2}{3n}$$

Comparing with $\hat\theta_{MLE} = T = \max_i X_i$: $\mathrm{MSE}\big(\hat\theta_{MLE}\big) = \frac{2}{(n+1)(n+2)}\,\theta^2$ decays like $1/n^2$, while $\mathrm{MSE}\big(\hat\theta_{MOM}\big) = \mathrm{Var}\big(\hat\theta_{MOM}\big) = \frac{\theta^2}{3n}$ decays like $1/n$, so for $n \ge 3$ the maximum-based estimator has the smaller MSE despite its bias.
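The comparison can also be made empirically; a sketch with assumed values $\theta = 3$, $n = 10$:

```python
# Compare MSE of the max-based estimator and the MOM estimator 2*xbar
# for Uniform(0, theta) samples.
import random

random.seed(2)
theta, n, trials = 3.0, 10, 100_000
mse_max = mse_mom = 0.0
for _ in range(trials):
    xs = [random.uniform(0.0, theta) for _ in range(n)]
    mse_max += (max(xs) - theta) ** 2
    mse_mom += (2 * sum(xs) / n - theta) ** 2
mse_max /= trials
mse_mom /= trials
print(mse_max, mse_mom)  # max-based MSE is smaller for n >= 3
```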

Exercise. Let $X_1, \dots, X_n$ be independent identically distributed random variables with p.d.f.

$$f(x) = \theta^2 x \exp(-\theta x), \qquad x > 0$$

Is $T(X_1, \dots, X_n) = 1/X_1$ an unbiased estimator of $\theta$?
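Before solving analytically, the question can be probed numerically. This sketch assumes $\theta = 2$; note the given density is that of a Gamma distribution with shape $2$ and rate $\theta$, i.e. scale $1/\theta$:

```python
# Estimate E(1/X_1) by Monte Carlo for X ~ Gamma(shape=2, scale=1/theta),
# whose pdf is theta^2 * x * exp(-theta * x).
import random

random.seed(3)
theta, trials = 2.0, 200_000
vals = [1.0 / random.gammavariate(2.0, 1.0 / theta) for _ in range(trials)]
print(sum(vals) / trials)  # compare with theta = 2.0
```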

Exercise. Let $X_1, \dots, X_n$ be independent identically distributed random variables with $E(X) = \mu$ and $\mathrm{Var}(X) = \sigma^2$. Are the following estimators unbiased for $\sigma^2$?

$$T_1(X_1, \dots, X_n) = \frac{(X_1 - X_2)^2}{2}, \qquad T_2(X_1, \dots, X_n) = \frac{(X_1 + X_2)^2 - X_1 X_2}{2}$$

Exercise 4. Let $T_1$ and $T_2$ be two independent and unbiased estimators of the parameter $\theta$, with $\mathrm{Var}(T_1) = \sigma_1^2$ and $\mathrm{Var}(T_2) = \sigma_2^2$. Find the UMVUE for $\theta$ among all linear combinations of $T_1$ and $T_2$. What is its variance?

Solution Exercise 4

$$T = a_1 T_1 + a_2 T_2, \qquad E(T) = a_1 E(T_1) + a_2 E(T_2) = (a_1 + a_2)\,\theta$$

To be unbiased: $a_1 + a_2 = 1$, so writing $a_1 = a$, $a_2 = 1 - a$:

$$\mathrm{Var}(T) = a^2\,\mathrm{Var}(T_1) + (1-a)^2\,\mathrm{Var}(T_2) = a^2\sigma_1^2 + (1-a)^2\sigma_2^2$$

Solution Exercise 4 dvar(t ) da = 2aσ1 2 2(1 a)σ2 2 2aσ1 2 + 2(1 a)σ2 2 = 0 a = σ 2 2 σ 2 1 + σ2 2 The UMVUE estimator for θ among all linear combinations of T 1 and T 2 is T = σ2 2 σ1 2 + T 1 + σ2 1 σ2 2 σ1 2 + T 2 σ2 2

Exercise. Let $X_1, \dots, X_n$ be a random sample of i.i.d. random variables with expected value $\mu$ and variance $\sigma^2$. Consider the following estimator of $\mu$:

$$T_n(a) = a X_n + (1-a)\bar X_{n-1}$$

where $X_n$ is the $n$-th observed random variable and $\bar X_{n-1}$ is the sample mean based on the first $n-1$ observations.

1. Find the value of $a$ such that $T_n(a)$ is an unbiased estimator of $\mu$.
2. Find the value $a^*$ such that $T_n(a^*)$ is the most efficient estimator of $\mu$ within the class $T_n(a)$.
3. Define the concept of efficiency.

Pareto. Let $X_1, \dots, X_n$ be a random sample of i.i.d. random variables with a Pareto distribution with unknown parameter $\alpha$ and known $x_0$:

$$f(x; \alpha, x_0) = \alpha\, x_0^\alpha\, x^{-(\alpha+1)}, \qquad x \ge x_0$$

The log-likelihood function is

$$l(\alpha, x_0) = n\log\alpha + n\alpha\log x_0 - (\alpha+1)\sum_{i=1}^{n}\log x_i$$

Pareto. Thus

$$\frac{\partial l(\alpha, x_0)}{\partial\alpha} = \frac{n}{\alpha} + n\log x_0 - \sum_{i=1}^{n}\log x_i$$

Solving $\frac{\partial l(\alpha, x_0)}{\partial\alpha} = 0$, the MLE of $\alpha$ is given by

$$\hat\alpha = \frac{n}{\sum_{i=1}^{n}\log x_i - n\log x_0}$$

Pareto: sufficiency. Observe that the joint pdf of $X = (X_1, \dots, X_n)$ is

$$f(x; \alpha, x_0) = \prod_{i=1}^{n}\frac{\alpha\, x_0^\alpha}{x_i^{\alpha+1}} = \frac{\alpha^n x_0^{n\alpha}}{\prod_{i=1}^{n} x_i^{\alpha+1}} = g(t, \alpha)\,h(x)$$

where $t = \prod_{i=1}^{n} x_i$, $g(t, \alpha) = \alpha^n x_0^{n\alpha}\, t^{-(\alpha+1)}$ and $h(x) = 1$. By the factorization theorem, $T(X) = \prod_{i=1}^{n} X_i$ is sufficient for $\alpha$.

Pareto: Fisher Information. Thus

$$\frac{\partial^2 l(\alpha, x_0)}{\partial\alpha^2} = -\frac{n}{\alpha^2} \implies I_n(\alpha) = \frac{n}{\alpha^2}$$
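The MLE $\hat\alpha$ and its Fisher-information-based variance $1/I_n(\alpha) = \alpha^2/n$ can be checked by simulation. The values $\alpha = 2.5$, $x_0 = 1$, $n = 500$ are arbitrary assumptions; sampling uses the inverse CDF $X = x_0\,U^{-1/\alpha}$:

```python
# Repeatedly draw Pareto samples, compute alpha_hat = n / (sum log x_i - n log x0),
# and compare the empirical variance of alpha_hat with alpha^2 / n.
import math
import random

random.seed(9)
alpha, x0, n, trials = 2.5, 1.0, 500, 2_000
hats = []
for _ in range(trials):
    xs = [x0 / random.random() ** (1.0 / alpha) for _ in range(n)]
    hats.append(n / (sum(math.log(x) for x in xs) - n * math.log(x0)))
mean_hat = sum(hats) / trials
var_hat = sum((h - mean_hat) ** 2 for h in hats) / trials
print(mean_hat)                 # close to alpha (slight upward bias at finite n)
print(var_hat, alpha**2 / n)    # empirical variance ~ 1 / I_n(alpha)
```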

EXERCISE. Let $X_1, \dots, X_n$ be a random sample of i.i.d. random variables distributed as follows:

$$f(x; \theta) = \frac{\theta\, 2^\theta}{x^{\theta+1}}, \qquad x > 2$$

1. Show that $\sum_i \log X_i$ is a sufficient statistic for $\theta$.
2. Find the maximum likelihood estimator $\hat\theta_{MLE}$ of $\theta$ and discuss the properties of this estimator.
3. Find the method of moments estimator $\hat\theta_{MOM}$ of $\theta$.

EXERCISE: solution

$$f(x_1, x_2, \dots, x_n; \theta) = \prod_i \frac{\theta\, 2^\theta}{x_i^{\theta+1}} = \theta^n\, 2^{n\theta}\left(\prod_i \frac{1}{x_i}\right)^{\theta+1}$$

A sufficient statistic is $\prod_i \frac{1}{X_i}$, or any one-to-one transformation of it, for example $\sum_i \log X_i$.

EXERCISE: solution

$$L(\theta \mid x_1, x_2, \dots, x_n) = \theta^n\, 2^{n\theta}\left(\prod_i \frac{1}{x_i}\right)^{\theta+1}$$
$$l(\theta \mid x_1, x_2, \dots, x_n) = n\log\theta + n\theta\log 2 - (\theta+1)\sum_i \log x_i$$
$$\frac{\partial l(\theta)}{\partial\theta} = \frac{n}{\theta} + n\log 2 - \sum_i \log x_i, \qquad \frac{\partial^2 l(\theta)}{\partial\theta^2} = -\frac{n}{\theta^2}$$

EXERCISE: solution

$$\frac{\partial l(\theta)}{\partial\theta} = 0 \implies \hat\theta_{MLE} = \frac{n}{\sum_i \log x_i - n\log 2} = \frac{n}{\sum_i \log(x_i/2)}$$

EXERCISE: solution

$$E(X) = \int_2^\infty x\,\frac{\theta\, 2^\theta}{x^{\theta+1}}\,dx = \theta\, 2^\theta \int_2^\infty x^{-\theta}\,dx = \theta\, 2^\theta\,\frac{2^{1-\theta}}{\theta-1} = \frac{2\theta}{\theta-1} \qquad (\theta > 1)$$

Setting $\bar x = \dfrac{2\hat\theta_{MOM}}{\hat\theta_{MOM}-1}$ gives

$$\hat\theta_{MOM} = \frac{\bar x}{\bar x - 2}$$
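Both estimators from this exercise can be checked by simulation. The value $\theta = 3$ is an arbitrary assumption (chosen so that $E(X)$ exists); sampling uses the inverse CDF $X = 2\,U^{-1/\theta}$:

```python
# Draw from f(x) = theta * 2^theta / x^(theta+1), x > 2, then compute
# theta_hat_MLE = n / sum log(x_i/2) and theta_hat_MOM = xbar / (xbar - 2).
import math
import random

random.seed(4)
theta, n = 3.0, 200_000
xs = [2.0 / random.random() ** (1.0 / theta) for _ in range(n)]
theta_mle = n / sum(math.log(x / 2.0) for x in xs)
xbar = sum(xs) / n
theta_mom = xbar / (xbar - 2.0)
print(theta_mle, theta_mom)  # both close to 3
```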

Poisson distribution Example: Let $X$ be distributed as a Poisson:

$$f(x; \lambda) = \frac{\lambda^x \exp(-\lambda)}{x!}$$

Compare the following estimators of $\exp(-\lambda)$:

$$T_1 = \exp(-\bar X), \qquad T_2 = \frac{\sum_{i=1}^{n} I(X_i = 0)}{n}$$
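A simulation sketch of the comparison, with assumed $\lambda = 1.5$ and $n = 30$ (the sampler below is a simple inversion method, adequate for small $\lambda$): $T_2$ is exactly unbiased, since $E(T_2) = P(X = 0) = e^{-\lambda}$, while $T_1$ carries a finite-sample bias.

```python
# Compare T1 = exp(-xbar) and T2 = fraction of zeros as estimators of exp(-lambda).
import math
import random

random.seed(5)
lam, n, trials = 1.5, 30, 20_000
target = math.exp(-lam)

def poisson(lam):
    # inversion sampling: walk the cdf until it exceeds u
    u, k, p, s = random.random(), 0, math.exp(-lam), math.exp(-lam)
    while u > s:
        k += 1
        p *= lam / k
        s += p
    return k

t1 = t2 = 0.0
for _ in range(trials):
    xs = [poisson(lam) for _ in range(n)]
    t1 += math.exp(-sum(xs) / n)
    t2 += sum(x == 0 for x in xs) / n
print(t1 / trials, t2 / trials, target)  # T2 averages to the target; T1 is biased
```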

Maximum Likelihood Estimator for the Geometric distribution. Consider $n$ observations from a Geometric distribution:

$$p(x \mid \pi) = \pi(1-\pi)^x$$

Find the maximum likelihood estimator of $E(X) = \frac{1-\pi}{\pi}$. The first and second derivatives of the log-likelihood are:

$$\frac{d\log L(\pi \mid x)}{d\pi} = \frac{n}{\pi} - \frac{\sum_i x_i}{1-\pi}, \qquad \frac{d^2\log L(\pi \mid x)}{d\pi^2} = -\frac{n}{\pi^2} - \frac{\sum_i x_i}{(1-\pi)^2}$$

Maximum Likelihood Estimator for the Geometric distribution

$$\frac{n}{\pi} - \frac{\sum_i x_i}{1-\pi} = 0 \implies \hat\pi = \frac{n}{n + \sum_i x_i}$$

By the invariance property of the MLE, the maximum likelihood estimator of $E(X) = \frac{1-\pi}{\pi}$ is $\frac{1-\hat\pi}{\hat\pi} = \frac{\sum_i x_i}{n} = \bar x$.
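A quick simulation of the result, with an assumed $\pi = 0.3$; here $X$ counts failures before the first success, matching $p(x \mid \pi) = \pi(1-\pi)^x$:

```python
# Check that pi_hat = n / (n + sum x_i) recovers pi, and that the sample
# mean matches E(X) = (1 - pi) / pi.
import random

random.seed(6)
pi_true, n = 0.3, 100_000

def geom0(pi):
    # number of failures before the first success
    x = 0
    while random.random() > pi:
        x += 1
    return x

xs = [geom0(pi_true) for _ in range(n)]
pi_hat = n / (n + sum(xs))
print(pi_hat)                                 # close to 0.3
print(sum(xs) / n, (1 - pi_true) / pi_true)   # MLE of E(X) is the sample mean
```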

Example: Exam January 2016. Let $X_1, \dots, X_n$ be a random sample of i.i.d. random variables with a Pareto distribution with parameters $\alpha$ and $x_m$ both unknown:

$$f(x; \alpha, x_m) = \alpha\, x_m^\alpha\, x^{-(\alpha+1)}, \qquad x \ge x_m$$

Calculate the Fisher information matrix for the parameter vector $\theta = (x_m, \alpha)$. How do you interpret the off-diagonal terms?

Example: Exam January 2016. The log-likelihood function is

$$l(\alpha, x_m) = n\log\alpha + n\alpha\log x_m - (\alpha+1)\sum_{i=1}^{n}\log x_i$$
$$\frac{\partial l(\alpha, x_m)}{\partial\alpha} = \frac{n}{\alpha} + n\log x_m - \sum_{i=1}^{n}\log x_i, \qquad \frac{\partial l(\alpha, x_m)}{\partial x_m} = \frac{n\alpha}{x_m}$$
$$\frac{\partial^2 l(\alpha, x_m)}{\partial\alpha^2} = -\frac{n}{\alpha^2}, \qquad \frac{\partial^2 l(\alpha, x_m)}{\partial x_m^2} = -\frac{n\alpha}{x_m^2}, \qquad \frac{\partial^2 l(\alpha, x_m)}{\partial\alpha\,\partial x_m} = \frac{n}{x_m}$$

Exercise. We consider two continuous independent random variables $U$ and $W$, both distributed as $N(0, \sigma^2)$. The variable $X$ defined by $X = \sqrt{U^2 + W^2}$ has a Rayleigh distribution with parameter $\sigma^2$:

$$f(x; \sigma^2) = \frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right), \qquad x \ge 0$$

Let $X_1, \dots, X_n$ be a random sample of i.i.d. random variables distributed as $X$.

1. Apply the method of moments to find the estimator $\hat\sigma_{MOM}$ of the parameter $\sigma$.
2. Find the maximum likelihood estimator $\hat\sigma^2_{MLE}$ of $\sigma^2$ and discuss the properties of this estimator.
3. Compute the score function and the Fisher information.
4. Specify the asymptotic distribution of $\hat\sigma^2_{MLE}$.

Exercise

$$E(X) = \int_0^\infty x\, f(x; \sigma^2)\,dx = \int_0^\infty \frac{x^2}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = \frac{\sqrt{2\pi}\,\sigma}{\sigma^2}\cdot\frac{1}{2}\int_{-\infty}^{\infty}\frac{y^2}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{y^2}{2\sigma^2}\right)dy$$

For $Y \sim N(0, \sigma^2)$:

$$\int_{-\infty}^{\infty}\frac{y^2}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{y^2}{2\sigma^2}\right)dy = E(Y^2) = \mathrm{Var}(Y) + E(Y)^2 = \sigma^2$$

Exercise

$$E(X) = \frac{\sqrt{2\pi}}{2\sigma}\,\sigma^2 = \sigma\sqrt{\frac{\pi}{2}} \implies \hat\sigma_{MOM} = \bar x\,\sqrt{\frac{2}{\pi}}$$

Exercise. Find the maximum likelihood estimator $\hat\sigma^2_{MLE}$ of $\sigma^2$ and discuss the properties of this estimator.

$$L(\sigma^2 \mid x) = \frac{\prod_i x_i}{\sigma^{2n}}\exp\left(-\frac{\sum_i x_i^2}{2\sigma^2}\right)$$
$$\log L(\sigma^2 \mid x) = \sum_i \log x_i - n\log\sigma^2 - \frac{\sum_i x_i^2}{2\sigma^2}$$
$$\frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = -\frac{n}{\sigma^2} + \frac{\sum_i x_i^2}{2\sigma^4}$$

Exercise

$$\frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = 0 \iff \sigma^2 = \frac{\sum_i x_i^2}{2n}$$
$$\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2} = \frac{n}{\sigma^4} - \frac{\sum_i x_i^2}{\sigma^6}$$

Evaluated at $\hat\sigma^2$ this equals $\frac{n}{\hat\sigma^4} - \frac{2n}{\hat\sigma^4} = -\frac{n}{\hat\sigma^4} < 0$, so

$$\hat\sigma^2_{MLE} = \frac{\sum_i x_i^2}{2n}$$
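A simulation sketch of the MLE, with an assumed $\sigma^2 = 2$; samples are generated through the defining relation $X = \sqrt{U^2 + W^2}$, so $X^2 = U^2 + W^2$ can be summed directly:

```python
# Check sigma2_hat = sum(x_i^2) / (2n) on simulated Rayleigh data.
import random

random.seed(7)
sigma2, n = 2.0, 100_000
sd = sigma2 ** 0.5
xs2 = [random.gauss(0, sd) ** 2 + random.gauss(0, sd) ** 2 for _ in range(n)]
sigma2_mle = sum(xs2) / (2 * n)
print(sigma2_mle)  # close to sigma2 = 2.0
```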

Exercise. Compute the score function and the Fisher information.

$$\mathrm{Score}(\sigma^2) = \frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = -\frac{n}{\sigma^2} + \frac{\sum_i x_i^2}{2\sigma^4}$$
$$\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2} = \frac{n}{\sigma^4} - \frac{\sum_i x_i^2}{\sigma^6}$$

We need $E(X^2)$:

$$E(X^2) = \int_0^\infty x^2 f(x; \sigma^2)\,dx = \int_0^\infty \frac{x^3}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx$$

Exercise. Integration by parts, $\int f'g = fg - \int fg'$:

$$\int_0^\infty x^2\,\frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = \left[-x^2\exp\left(-\frac{x^2}{2\sigma^2}\right)\right]_0^\infty + \int_0^\infty 2x\exp\left(-\frac{x^2}{2\sigma^2}\right)dx = 2\sigma^2$$

so $E(X^2) = 2\sigma^2$. Hence

$$I(\sigma^2) = -E\left(\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2}\right) = -\frac{n}{\sigma^4} + \frac{n\,E(X^2)}{\sigma^6} = -\frac{n}{\sigma^4} + \frac{2n\sigma^2}{\sigma^6} = \frac{n}{\sigma^4}$$

Exercise. Specify the asymptotic distribution of $\hat\sigma^2_{MLE}$:

$$\hat\sigma^2_{MLE} \;\dot\sim\; N\!\left(\sigma^2,\; \frac{\sigma^4}{n}\right)$$
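The asymptotic variance $\sigma^4/n$ can be checked empirically by repeating the estimation many times; $\sigma^2 = 2$ and $n = 50$ below are arbitrary assumptions:

```python
# Empirical mean and variance of sigma2_hat = sum(x_i^2)/(2n) over many
# replications, compared with the asymptotic values sigma^2 and sigma^4/n.
import random
import statistics

random.seed(8)
sigma2, n, trials = 2.0, 50, 20_000
sd = sigma2 ** 0.5
hats = []
for _ in range(trials):
    s = sum(random.gauss(0, sd) ** 2 + random.gauss(0, sd) ** 2 for _ in range(n))
    hats.append(s / (2 * n))
print(statistics.mean(hats), sigma2)              # mean ~ 2.0
print(statistics.variance(hats), sigma2**2 / n)   # variance ~ 0.08
```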