Exercise: Estimation

1 Exercise
Show that the corrected sample variance is an unbiased estimator of the population variance:
$$S^2 = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n-1}$$

2 Exercise
$$S^2 = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n-1}
      = \frac{\sum_{i=1}^{n}(X_i - \mu + \mu - \bar{X})^2}{n-1}
      = \frac{\sum_{i=1}^{n}\big((X_i - \mu) - (\bar{X} - \mu)\big)^2}{n-1}$$
$$= \frac{\sum_{i=1}^{n}\big((X_i - \mu)^2 + (\bar{X} - \mu)^2 - 2(X_i - \mu)(\bar{X} - \mu)\big)}{n-1}$$
$$= \frac{\sum_{i=1}^{n}(X_i - \mu)^2}{n-1} + \frac{\sum_{i=1}^{n}(\bar{X} - \mu)^2}{n-1} - \frac{\sum_{i=1}^{n}2(X_i - \mu)(\bar{X} - \mu)}{n-1}$$

3 Exercise
$$E\big((X_i - \mu)^2\big) = \sigma^2, \qquad E\big((\bar{X} - \mu)^2\big) = \frac{\sigma^2}{n}$$
$$\sum_i (X_i - \mu)(\bar{X} - \mu) = (\bar{X} - \mu)\sum_i (X_i - \mu) = n(\bar{X} - \mu)^2$$
$$E(S^2) = \frac{n\sigma^2}{n-1} + \frac{\sigma^2}{n-1} - \frac{2n\sigma^2}{n(n-1)}
         = \frac{n\sigma^2}{n-1} - \frac{\sigma^2}{n-1} = \sigma^2$$
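
A quick Monte Carlo check of this result (a minimal simulation sketch, not part of the original slides; the normal population, sample size and number of replications are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
mu, sigma2, n, reps = 5.0, 4.0, 10, 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
xbar = x.mean(axis=1, keepdims=True)

s2_corrected = ((x - xbar) ** 2).sum(axis=1) / (n - 1)   # divide by n - 1
s2_uncorrected = ((x - xbar) ** 2).sum(axis=1) / n       # divide by n

print(s2_corrected.mean())    # close to sigma^2 = 4.0: unbiased
print(s2_uncorrected.mean())  # close to (n-1)/n * sigma^2 = 3.6: biased downward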

4 Uniform Distribution
Example: Let X_1, ..., X_n be i.i.d. random variables with the continuous uniform distribution on [0, θ]. The probability density function of each X_i is
$$f(x \mid \theta) = \begin{cases} \frac{1}{\theta}, & 0 \le x \le \theta \\ 0, & \text{otherwise} \end{cases}$$
Consider T = max_i X_i as an estimator of θ. Is T an unbiased estimator of θ?

5 Uniform Distribution
Example: Consider T = max_i X_i as an estimator of θ. Is T an unbiased estimator of θ?
We need E(T), so we first find the distribution of T. Let F_T be the cumulative distribution function of T:
$$F_T(y) = P(T \le y) = P(\max_i X_i \le y)$$
$$P(\max_i X_i \le y) = P(X_1 \le y, \ldots, X_n \le y) = P(X \le y)^n = \left(\frac{y}{\theta}\right)^n$$
$$f_T(y) = \frac{\partial F_T(y)}{\partial y} = n\left(\frac{y}{\theta}\right)^{n-1}\frac{1}{\theta}$$
$$E(T) = \int_0^{\theta} y\, f_T(y)\, dy = \int_0^{\theta} n\,\frac{y^n}{\theta^n}\, dy = \frac{n}{n+1}\,\theta$$
$$\mathrm{Bias}(T) = E(T) - \theta = -\frac{1}{n+1}\,\theta$$

6 Uniform Distribution
Example (continued): Consider T = max_i X_i as an estimator of θ. Using $f_T(y) = n\left(\frac{y}{\theta}\right)^{n-1}\frac{1}{\theta}$ from the previous slide,
$$E(T^2) = \int_0^{\theta} y^2 f_T(y)\, dy = \int_0^{\theta} n\,\frac{y^{n+1}}{\theta^n}\, dy = \frac{n}{n+2}\,\theta^2$$
$$\mathrm{Var}(T) = E(T^2) - \big(E(T)\big)^2 = \frac{n}{n+2}\,\theta^2 - \left(\frac{n}{n+1}\right)^2\theta^2 = \frac{n}{(n+2)(n+1)^2}\,\theta^2$$
$$\mathrm{MSE}(T) = \frac{n}{(n+2)(n+1)^2}\,\theta^2 + \left(\frac{1}{n+1}\right)^2\theta^2 = \frac{2}{(n+2)(n+1)}\,\theta^2$$
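
The formulas for E(T) and MSE(T) can be checked by simulation (a minimal sketch, not part of the original slides; theta, n and the number of replications are arbitrary choices):

import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 5, 200_000

x = rng.uniform(0.0, theta, size=(reps, n))
t = x.max(axis=1)                                   # T = max_i X_i

print(t.mean(), n / (n + 1) * theta)                # E(T) vs n*theta/(n+1)
print(((t - theta) ** 2).mean(),
      2 * theta**2 / ((n + 1) * (n + 2)))           # MSE(T) vs 2*theta^2/((n+1)(n+2))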

7 Uniform Distribution: θ̂_MOM
Example:
$$E(X) = \int x f(x \mid \theta)\, dx = \int_0^{\theta} \frac{x}{\theta}\, dx = \left.\frac{x^2}{2\theta}\right|_0^{\theta} = \frac{\theta}{2}$$
Setting $\bar{X} = \frac{\theta}{2}$ and solving for θ gives
$$\hat{\theta}_{MOM} = 2\bar{X}$$

8 Uniform Distribution: θ̂_MOM
Example:
$$E(\hat{\theta}_{MOM}) = \theta, \qquad
\mathrm{Var}(\hat{\theta}_{MOM}) = 4\,\mathrm{Var}(\bar{X}) = 4\,\frac{\theta^2}{12n} = \frac{\theta^2}{3n}$$
Comparing with the maximum-based estimator (which is also the MLE):
$$\mathrm{MSE}(\hat{\theta}_{MLE}) = \frac{2}{(n+1)(n+2)}\,\theta^2
\;\le\;
\frac{\theta^2}{3n} = \mathrm{Var}(\hat{\theta}_{MOM}),$$
since $\frac{2}{(n+1)(n+2)} \le \frac{1}{3n}$ for every $n \ge 1$: the MSE of the maximum decreases at rate $1/n^2$, while that of $\hat{\theta}_{MOM}$ decreases only at rate $1/n$.

9 Exercise
Let (X_1, ..., X_n) be independent identically distributed random variables with p.d.f.
$$f(x) = \theta^2 x \exp(-\theta x), \qquad x > 0.$$
Is $T(X_1, \ldots, X_n) = 1/X_1$ an unbiased estimator of θ?

10 Exercise
Let (X_1, ..., X_n) be independent identically distributed random variables with E(X) = μ and Var(X) = σ². Are the following estimators unbiased for σ²?
$$T_1(X_1, \ldots, X_n) = \frac{(X_1 - X_2)^2}{2}, \qquad
  T_2(X_1, \ldots, X_n) = \frac{(X_1 + X_2)^2 - X_1 X_2}{2}$$

11 Exercise 4
Let T_1 and T_2 be two independent and unbiased estimators of the parameter θ, with $\mathrm{Var}(T_1) = \sigma_1^2$ and $\mathrm{Var}(T_2) = \sigma_2^2$. Find the UMVUE for θ among all linear combinations of T_1 and T_2. What is its variance?

12 Solution Exercise 4
$$T = a_1 T_1 + a_2 T_2$$
$$E(T) = a_1 E(T_1) + a_2 E(T_2) = (a_1 + a_2)\theta$$
To be unbiased we need $a_1 + a_2 = 1$, so write $a_1 = a$ and $a_2 = 1 - a$.
$$\mathrm{Var}(T) = a^2\,\mathrm{Var}(T_1) + (1-a)^2\,\mathrm{Var}(T_2) = a^2\sigma_1^2 + (1-a)^2\sigma_2^2$$

13 Solution Exercise 4
$$\frac{d\,\mathrm{Var}(T)}{da} = 2a\sigma_1^2 - 2(1-a)\sigma_2^2$$
$$2a\sigma_1^2 - 2(1-a)\sigma_2^2 = 0 \quad\Longrightarrow\quad a = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}$$
The UMVUE for θ among all linear combinations of T_1 and T_2 is therefore
$$T = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}\, T_1 + \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}\, T_2$$
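
A small simulation illustrating that the variance-weighted combination stays unbiased and beats both individual estimators and the plain average (a minimal sketch; drawing T_1 and T_2 from normal distributions with these particular variances is an arbitrary illustration, not part of the exercise):

import numpy as np

rng = np.random.default_rng(2)
theta, sigma1, sigma2, reps = 10.0, 1.0, 3.0, 200_000

t1 = rng.normal(theta, sigma1, reps)          # unbiased, Var = sigma1^2
t2 = rng.normal(theta, sigma2, reps)          # unbiased, Var = sigma2^2

a = sigma2**2 / (sigma1**2 + sigma2**2)       # optimal weight on T1
t_opt = a * t1 + (1 - a) * t2
t_avg = 0.5 * (t1 + t2)

print(t_opt.mean())                                                  # ~ theta: still unbiased
print(t_opt.var(), sigma1**2 * sigma2**2 / (sigma1**2 + sigma2**2))  # matches the theoretical minimum
print(t_avg.var(), t1.var(), t2.var())                               # all larger than t_opt.var()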

14 Exercise
Let (X_1, ..., X_n) be a random sample of i.i.d. random variables with expected value μ and variance σ². Consider the following estimator of μ:
$$T_n(a) = a X_n + (1-a)\bar{X}_{n-1},$$
where X_n is the n-th observed random variable and $\bar{X}_{n-1}$ is the sample mean based on the first n-1 observations.
1. Find the value of a such that T_n(a) is an unbiased estimator of μ.
2. Find the value a* such that T_n(a*) is the most efficient estimator of μ within the class T_n(a).
3. Define the concept of efficiency.

15 Pareto
Let (X_1, ..., X_n) be a random sample of i.i.d. random variables with a Pareto distribution with unknown parameter α and known x_0:
$$f(x; \alpha, x_0) = \alpha\, x_0^{\alpha}\, x^{-(\alpha+1)}, \qquad x \ge x_0.$$
The log-likelihood function is
$$l(\alpha, x_0) = n\log\alpha + n\alpha\log(x_0) - (\alpha+1)\sum_{i=1}^{n}\log x_i.$$

16 Pareto
Thus
$$\frac{\partial l(\alpha, x_0)}{\partial\alpha} = \frac{n}{\alpha} + n\log(x_0) - \sum_{i=1}^{n}\log x_i.$$
Solving $\frac{\partial l(\alpha, x_0)}{\partial\alpha} = 0$, the MLE of α is given by
$$\hat{\alpha} = \frac{n}{\sum_{i=1}^{n}\log x_i - n\log(x_0)}.$$
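
A minimal sketch of computing this estimate from simulated data (not part of the original slides; numpy's pareto generator produces a shifted Pareto, so samples are rescaled to have scale x_0, and the values of alpha, x_0 and n are arbitrary choices):

import numpy as np

rng = np.random.default_rng(3)
alpha, x0, n = 2.5, 1.5, 5_000

x = x0 * (1.0 + rng.pareto(alpha, n))    # classical Pareto(alpha, x0) sample

alpha_hat = n / (np.log(x).sum() - n * np.log(x0))
print(alpha_hat)                         # close to alpha = 2.5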

17 Pareto: sufficiency
Observe that the joint pdf of X = (X_1, ..., X_n) is
$$f(x; \alpha, x_m) = \prod_{i=1}^{n} \frac{\alpha\, x_m^{\alpha}}{x_i^{\alpha+1}}
                    = \frac{\alpha^n\, x_m^{n\alpha}}{\prod_{i=1}^{n} x_i^{\alpha+1}}
                    = g(t, \alpha)\, h(x),$$
where $t = \prod_{i=1}^{n} x_i$, $g(t, \alpha) = \alpha^n\, x_m^{n\alpha}\, t^{-(\alpha+1)}$ and $h(x) = 1$.
By the factorization theorem, $T(X) = \prod_{i=1}^{n} X_i$ is sufficient for α.

18 Pareto: Fisher Information
Thus
$$\frac{\partial^2 l(\alpha, x_0)}{\partial\alpha^2} = -\frac{n}{\alpha^2}
\qquad\Longrightarrow\qquad
I_n(\alpha) = \frac{n}{\alpha^2}.$$

19 EXERCISE
Let (X_1, ..., X_n) be a random sample of i.i.d. random variables with density
$$f(x; \theta) = \frac{\theta\, 2^{\theta}}{x^{\theta+1}}, \qquad x > 2.$$
1. Show that $\sum_i \log(X_i)$ is a sufficient statistic for θ.
2. Find the maximum likelihood estimator $\hat{\theta}_{MLE}$ of θ and discuss the properties of this estimator.
3. Find the method of moments estimator $\hat{\theta}_{MOM}$ of θ.

20 EXERCISE: solution
$$f(x_1, x_2, \ldots, x_n; \theta) = \prod_i \frac{\theta\, 2^{\theta}}{x_i^{\theta+1}}
 = \theta^n\, 2^{n\theta} \left(\prod_i \frac{1}{x_i}\right)^{\theta+1}$$
A sufficient statistic is $\prod_i \frac{1}{X_i}$, or any one-to-one transformation of it, for example $\sum_i \log(X_i)$.

21 EXERCISE: solution
$$f(x_1, x_2, \ldots, x_n; \theta) = \theta^n\, 2^{n\theta}\left(\prod_i \frac{1}{x_i}\right)^{\theta+1}$$
$$L(\theta \mid x_1, x_2, \ldots, x_n) = \theta^n\, 2^{n\theta}\left(\prod_i \frac{1}{x_i}\right)^{\theta+1}$$
$$l(\theta \mid x_1, x_2, \ldots, x_n) = n\log(\theta) + n\theta\log(2) - (\theta+1)\sum_i \log(x_i)$$
$$\frac{\partial l(\theta)}{\partial\theta} = \frac{n}{\theta} + n\log 2 - \sum_i \log(x_i),
\qquad
\frac{\partial^2 l(\theta)}{\partial\theta^2} = -\frac{n}{\theta^2}$$

22 EXERCISE: solution
$$\frac{\partial l(\theta)}{\partial\theta} = 0
\quad\Longrightarrow\quad
\hat{\theta}_{MLE} = \frac{n}{\sum_i \log(x_i) - n\log(2)} = \frac{n}{\sum_i \log\left(\frac{x_i}{2}\right)}$$

23 EXERCISE: solution
$$E(X) = \int_2^{\infty} x\, f(x; \theta)\, dx = \int_2^{\infty} x\, \frac{\theta\, 2^{\theta}}{x^{\theta+1}}\, dx
       = \int_2^{\infty} \theta\, 2^{\theta}\, x^{-\theta}\, dx
       = \theta\, 2^{\theta}\left[\frac{x^{-\theta+1}}{-\theta+1}\right]_2^{\infty}
       = \frac{2\theta}{\theta-1} \quad (\theta > 1)$$
Setting $\bar{x} = \frac{2\hat{\theta}_{MOM}}{\hat{\theta}_{MOM} - 1}$ and solving gives
$$\hat{\theta}_{MOM} = \frac{\bar{x}}{\bar{x} - 2}.$$
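
Both estimators are easy to check on simulated data; the density above is a Pareto with scale 2 and shape theta, so it can be sampled with numpy's pareto generator (a minimal sketch, not part of the original slides; theta and n are arbitrary, and theta > 1 is needed for E(X) to be finite):

import numpy as np

rng = np.random.default_rng(4)
theta, n = 3.0, 10_000

x = 2.0 * (1.0 + rng.pareto(theta, n))    # f(x; theta) = theta * 2^theta / x^(theta+1), x > 2

theta_mle = n / np.log(x / 2.0).sum()     # n / sum_i log(x_i / 2)
theta_mom = x.mean() / (x.mean() - 2.0)   # xbar / (xbar - 2)

print(theta_mle, theta_mom)               # both close to theta = 3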

24 Poisson distribution
Example: Let X be distributed as a Poisson random variable;
$$f(x; \lambda) = \frac{\lambda^x \exp(-\lambda)}{x!}$$
Compare the following estimators of $\exp(-\lambda)$:
$$T_1 = \exp(-\bar{X}), \qquad T_2 = \frac{\sum_{i=1}^{n} I(X_i = 0)}{n}$$
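
A simulation makes the comparison concrete by estimating the bias and mean squared error of each estimator of exp(-lambda) (a minimal sketch, not part of the original slides; lambda, n and the number of replications are arbitrary choices):

import numpy as np

rng = np.random.default_rng(5)
lam, n, reps = 1.2, 30, 100_000
target = np.exp(-lam)                      # quantity being estimated

x = rng.poisson(lam, size=(reps, n))
t1 = np.exp(-x.mean(axis=1))               # T1 = exp(-Xbar)
t2 = (x == 0).mean(axis=1)                 # T2 = fraction of zero observations

for name, t in (("T1", t1), ("T2", t2)):
    print(name, t.mean() - target, ((t - target) ** 2).mean())   # bias, MSE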

25 Maximum Likelihood Estimator for the Geometric distribution
Consider n observations from a Geometric distribution:
$$p(x \mid \pi) = \pi(1-\pi)^x$$
Find the maximum likelihood estimator of $E(X) = \frac{1-\pi}{\pi}$.
$$\frac{d\log L(\pi \mid x)}{d\pi} = \frac{n}{\pi} - \frac{\sum_i x_i}{1-\pi}$$
The second derivative:
$$\frac{d^2\log L(\pi \mid x)}{d\pi^2} = -\frac{n}{\pi^2} - \frac{\sum_i x_i}{(1-\pi)^2}$$

26 Maximum Likelihood Estimator for the Geometric distribution
$$\frac{n}{\pi} - \frac{\sum_i x_i}{1-\pi} = 0
\quad\Longrightarrow\quad
\hat{\pi} = \frac{n}{n + \sum_i x_i}$$
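
A quick numerical check of this estimator (a minimal sketch, not part of the original slides; note that numpy's geometric generator counts the trial on which the first success occurs, starting at 1, while the pmf above counts failures starting at 0, hence the subtraction of 1):

import numpy as np

rng = np.random.default_rng(6)
pi, n = 0.3, 10_000

x = rng.geometric(pi, n) - 1             # p(x) = pi * (1 - pi)^x, x = 0, 1, 2, ...

pi_hat = n / (n + x.sum())
print(pi_hat)                            # close to pi = 0.3
print((1 - pi_hat) / pi_hat, x.mean())   # MLE of E(X) = (1-pi)/pi equals the sample mean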

27 Example: Exam January 2016
Let (X_1, ..., X_n) be a random sample of i.i.d. random variables with a Pareto distribution with parameters α and x_m, both unknown:
$$f(x; \alpha, x_m) = \alpha\, x_m^{\alpha}\, x^{-(\alpha+1)}, \qquad x \ge x_m.$$
Calculate the Fisher information matrix for the parameter vector θ = (x_m, α). How do you interpret the off-diagonal terms?

28 Example: Exam January 2016
The log-likelihood function is
$$l(\alpha, x_m) = n\log\alpha + n\alpha\log(x_m) - (\alpha+1)\sum_{i=1}^{n}\log x_i$$
$$\frac{\partial l(\alpha, x_m)}{\partial\alpha} = \frac{n}{\alpha} + n\log(x_m) - \sum_{i=1}^{n}\log x_i,
\qquad
\frac{\partial l(\alpha, x_m)}{\partial x_m} = \frac{n\alpha}{x_m}$$
$$\frac{\partial^2 l(\alpha, x_m)}{\partial\alpha^2} = -\frac{n}{\alpha^2},
\qquad
\frac{\partial^2 l(\alpha, x_m)}{\partial x_m^2} = -\frac{n\alpha}{x_m^2},
\qquad
\frac{\partial^2 l(\alpha, x_m)}{\partial\alpha\,\partial x_m} = \frac{n}{x_m}$$

29 Exercise
We consider two continuous independent random variables U and W, each normally distributed as N(0, σ²). The variable X defined by $X = \sqrt{U^2 + W^2}$ has a Rayleigh distribution with parameter σ²:
$$f(x; \sigma^2) = \frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right), \qquad x \ge 0$$
Let (X_1, ..., X_n) be a random sample of i.i.d. random variables distributed as X.
1. Apply the method of moments to find the estimator $\hat{\sigma}_{MOM}$ of the parameter σ.
2. Find the maximum likelihood estimator $\hat{\sigma}^2_{MLE}$ of σ² and discuss the properties of this estimator.
3. Compute the score function and the Fisher information.
4. Specify the asymptotic distribution of $\hat{\sigma}^2_{MLE}$.

30 Exercise
$$\int_0^{\infty} x f(x; \sigma^2)\, dx = \int_0^{\infty} \frac{x^2}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right) dx
 = \frac{1}{\sigma}\int_0^{\infty} \frac{x^2}{\sigma}\exp\left(-\frac{x^2}{2\sigma^2}\right) dx
 = \frac{\sqrt{2\pi}}{\sigma}\int_0^{\infty} \frac{x^2}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{x^2}{2\sigma^2}\right) dx$$
For $Y \sim N(0, \sigma^2)$,
$$\int_{-\infty}^{\infty} \frac{y^2}{\sqrt{2\pi}\,\sigma}\exp\left(-\frac{y^2}{2\sigma^2}\right) dy = E(Y^2) = \mathrm{Var}(Y) + E(Y)^2 = \sigma^2$$

31 Exercise
Since the integrand is even, the half-line integral is half of $E(Y^2)$:
$$\int_0^{\infty} x f(x; \sigma^2)\, dx = \frac{\sqrt{2\pi}}{\sigma}\cdot\frac{1}{2}\,\sigma^2 = \sigma\sqrt{\frac{\pi}{2}}$$
$$E(X) = \sigma\sqrt{\frac{\pi}{2}}
\quad\Longrightarrow\quad
\hat{\sigma}_{MOM} = \bar{x}\,\sqrt{\frac{2}{\pi}}$$

32 Exercise
Find the maximum likelihood estimator $\hat{\sigma}^2_{MLE}$ of σ² and discuss the properties of this estimator.
$$L(\sigma^2 \mid x) = \frac{\prod_i x_i}{\sigma^{2n}}\exp\left(-\frac{\sum_i x_i^2}{2\sigma^2}\right)$$
$$\log L(\sigma^2 \mid x) = \sum_i \log(x_i) - n\log\sigma^2 - \frac{\sum_i x_i^2}{2\sigma^2}$$
$$\frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = -\frac{n}{\sigma^2} + \frac{\sum_i x_i^2}{2\sigma^4}$$

33 Exercise
$$\frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = 0 \quad\text{if}\quad \sigma^2 = \frac{\sum_i x_i^2}{2n}$$
$$\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2} = \frac{n}{\sigma^4} - \frac{\sum_i x_i^2}{\sigma^6}$$
$$\left.\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2}\right|_{\sigma^2 = \hat{\sigma}^2}
 = \frac{n}{\hat{\sigma}^4} - \frac{2n}{\hat{\sigma}^4} = -\frac{n}{\hat{\sigma}^4} < 0$$
$$\hat{\sigma}^2_{MLE} = \frac{\sum_i x_i^2}{2n}$$

34 Exercise
Compute the score function and the Fisher information.
$$\mathrm{Score}(\sigma^2) = \frac{\partial \log L(\sigma^2 \mid x)}{\partial\sigma^2} = -\frac{n}{\sigma^2} + \frac{\sum_i x_i^2}{2\sigma^4}$$
$$\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2} = \frac{n}{\sigma^4} - \frac{\sum_i x_i^2}{\sigma^6}$$
To take the expectation we need $E(X^2)$:
$$\int_0^{\infty} x^2 f(x; \sigma^2)\, dx = \int_0^{\infty} \frac{x^3}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right) dx$$

35 Exercise
Integration by parts, $\int f'g = fg - \int fg'$:
$$\int_0^{\infty} x^2\, \frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right) dx
 = \left[-x^2 \exp\left(-\frac{x^2}{2\sigma^2}\right)\right]_0^{\infty}
 + \int_0^{\infty} 2x\,\exp\left(-\frac{x^2}{2\sigma^2}\right) dx$$
$$= 2\sigma^2 \int_0^{\infty} \frac{x}{\sigma^2}\exp\left(-\frac{x^2}{2\sigma^2}\right) dx = 2\sigma^2
\quad\Longrightarrow\quad E(X^2) = 2\sigma^2$$
$$I(\sigma^2) = -E\left(\frac{\partial^2 \log L(\sigma^2 \mid x)}{\partial(\sigma^2)^2}\right)
 = -E\left(\frac{n}{\sigma^4} - \frac{\sum_i X_i^2}{\sigma^6}\right)
 = -\frac{n}{\sigma^4} + \frac{2n\sigma^2}{\sigma^6} = \frac{n}{\sigma^4}$$

36 Exercise
Specify the asymptotic distribution of $\hat{\sigma}^2_{MLE}$:
$$\hat{\sigma}^2_{MLE} \;\dot\sim\; N\left(\sigma^2, \frac{\sigma^4}{n}\right)$$
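
A simulation checking the MLE, the MOM estimator and the asymptotic variance sigma^4/n (a minimal sketch, not part of the original slides; numpy's rayleigh generator uses the same parameterization with scale sigma, and the values of sigma, n and the number of replications are arbitrary choices):

import numpy as np

rng = np.random.default_rng(7)
sigma, n, reps = 2.0, 200, 50_000

x = rng.rayleigh(scale=sigma, size=(reps, n))

sigma2_mle = (x**2).sum(axis=1) / (2 * n)        # MLE of sigma^2
sigma_mom = x.mean(axis=1) * np.sqrt(2 / np.pi)  # MOM estimator of sigma

print(sigma2_mle.mean(), sigma**2)               # unbiased for sigma^2
print(sigma2_mle.var(), sigma**4 / n)            # close to the asymptotic variance sigma^4/n
print(sigma_mom.mean(), sigma)                   # close to sigma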
