Point Estimation. Edwin Leuven
1 Point Estimation Edwin Leuven
2 Introduction Last time we reviewed statistical inference. We saw that while in probability we ask: given a data generating process, what are the properties of the outcomes? In statistics the question is the reverse: given the outcomes, what can we say about the process that generated the data? Statistical inference consists of 1. Estimation (point, interval) 2. Inference (quantifying sampling error, hypothesis testing) 2/43
3 Introduction Today we take a closer look at point estimation. We will go over three desirable properties of an estimator: 1. Unbiasedness 2. Consistency 3. Efficiency And how to quantify the trade-off between location and variance using the Mean Squared Error (MSE) 3/43
4 Random sampling Statistical inference starts with an assumption about how our data came about (the data generating process). We introduced the notion of sampling where we consider the observations in our data X_1, ..., X_n as draws from a population or, more generally, an unknown probability distribution f(X). Simple Random Sample: We call a sample X_1, ..., X_n random if the X_i are independent and identically distributed (i.i.d.) random variables. Random samples arise if we draw each unit in the population into our sample with equal probability. 4/43
5 Random sampling We will assume throughout that our samples are random! The aim is to use our data X_1, ..., X_n to learn something about the unknown probability distribution f(X) where the data came from. We typically focus on E[X], the mean of X, to explain things, but we can ask many different questions: What is the variance of X? What is the 10th percentile of X? What fraction of X lies below 100,000? etc. Very often we are interested in comparing measurements across populations: What is the difference in earnings between men and women? 5/43
6 Bias Consider 1. the estimand E[X], and 2. an estimator X̂. What properties do we want our estimator X̂ to have? One desirable property is that X̂ is on average correct. We call such estimators unbiased. Bias: Bias = E[X̂] - E[X], so an estimator is unbiased when E[X̂] = E[X] 6/43
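The definition of bias as an average over repeated samples can be illustrated with a short simulation. A Python sketch (the slides use R); the shrinkage factor 0.9 is a hypothetical example of a biased estimator, not from the slides:

```python
import random

# Sketch of bias as the average estimation error across many random samples.
# The shrinkage estimator 0.9 * xbar is a hypothetical biased example.
random.seed(1)
n, nrep, true_mean = 20, 50_000, 5.0

err_mean = err_shrunk = 0.0
for _ in range(nrep):
    x = [random.gauss(true_mean, 1.0) for _ in range(n)]
    xbar = sum(x) / n
    err_mean += xbar - true_mean          # unbiased: errors average out
    err_shrunk += 0.9 * xbar - true_mean  # biased: errors average to -0.5

print(round(err_mean / nrep, 2))
print(round(err_shrunk / nrep, 2))
```

The first average error is essentially zero, the second is close to 0.9 · 5 - 5 = -0.5: the bias of the shrinkage estimator.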
7 Bias The estimand in our example, the population mean E[X], is a number. For a given sample X̂ is also a number; we call this the estimate. Bias is not the difference between the estimate and the estimand: that is the estimation error. Bias is the average estimation error across (infinitely) many random samples! 7/43
8 Estimating the Mean of X The sample average is an unbiased estimator of the mean: E[X̄_n] = (1/n) Σ_{i=1}^n E[X_i] = E[X], but we can think of different unbiased estimators, e.g. X_1 is also an unbiased estimate of E[X]. If X has a symmetric distribution then both median(x) and (min(x) + max(x))/2 are unbiased 8/43
9 Estimating the Variance of X The estimator of the variance: V̂ar(X) = 1/(n-1) Σ_{i=1}^n (X_i - X̄_n)². Why divide by n - 1 and not n?
E[(1/n) Σ_{i=1}^n (X_i - X̄_n)²] = (1/n) Σ_{i=1}^n E[(X_i - X̄_n)²]
= (1/n) Σ_{i=1}^n E[X_i² - 2 X_i X̄_n + X̄_n²]
= E[X_i²] - 2 E[X_i X̄_n] + E[X̄_n²]
= ((n-1)/n) (E[X_i²] - E[X_i]²) = ((n-1)/n) Var(X_i)
where the last line follows since E[X̄_n²] = E[X_i X̄_n] = (1/n) E[X_i²] + ((n-1)/n) E[X_i]². 9/43
10 Variance Estimation We can verify this through numerical simulation:
n = 20; nrep = 10^5
varhat1 = rep(0, nrep); varhat2 = rep(0, nrep)
for(i in 1:nrep) {
  x = rnorm(n, 5, sqrt(3))
  sx = sum((x - mean(x))^2)
  varhat1[i] = sx / (n - 1); varhat2[i] = sx / n
}
mean(varhat1)  # close to Var(X) = 3: unbiased
mean(varhat2)  # close to (n-1)/n * 3 = 2.85: biased downward
10/43
11 How to choose between two unbiased estimators? Since both are centered around the truth: pick the one that tends to be closest! One measure of close is Var(X̂), the sampling variance of X̂.
x1 = rep(0, nrep); x2 = rep(0, nrep)
for(i in 1:nrep) {
  x = rnorm(100, 0, 1)
  x1[i] = mean(x); x2[i] = (min(x) + max(x)) / 2
}
var(x1)  # sampling variance of the mean: about 1/100
var(x2)  # sampling variance of the midrange: larger under normality
11/43
12 How to choose between two unbiased estimators? Since both are centered around the truth: pick the one that tends to be closest! One measure of close is Var(X̂), the sampling variance of X̂.
y1 = rep(0, nrep); y2 = rep(0, nrep)
for(i in 1:nrep) {
  x = runif(100, 0, 1)
  y1[i] = mean(x); y2[i] = (min(x) + max(x)) / 2
}
var(y1)  # sampling variance of the mean: about 1/1200
var(y2)  # sampling variance of the midrange: smaller under uniformity
12/43
13 How to choose between two unbiased estimators? [Figure: densities of the sampling distributions of the mean (x1) and the midrange (x2) under the Normal(0,1) distribution] 13/43
14 How to choose between two unbiased estimators? [Figure: densities of the sampling distributions of the mean (y1) and the midrange (y2) under the Uniform[0,1] distribution] 14/43
15 How to choose between two unbiased estimators? The sampling distribution of our estimator depends on the underlying distribution of X_i in the population! X_i Normal: the sample average outperforms the midrange. X_i Uniform: the midrange outperforms the sample average. However, the sample average is an attractive default because it often 1. has a sampling distribution that is well understood 2. is more efficient (smaller sampling variance) than alternative estimators. We will say more about this in the context of the WLLN and the CLT 15/43
16 The Standard Error Above we compared the average and the midrange estimators using the sampling variance Var(X̂) = E[(X̂ - E[X̂])²] = E[X̂²] - E[X̂]². It is however common to use the square root of the sampling variance of our estimators. This is called the standard error: Standard Error of X̂ = √Var(X̂) 16/43
17 The Standard Error of the Sample Proportion Consider a Bernoulli random variable X where X = 1 with probability p and X = 0 with probability 1 - p. The sample proportion is X̄_n = (1/n) Σ_i X_i with variance
Var(X̄_n) = (1/n²) Σ_i Var(X_i) = n p(1-p)/n² = p(1-p)/n
but this depends on p which is unknown! We have an unbiased estimator of p, namely X̄, and we can therefore estimate the variance as follows: V̂ar(X̄_n) = X̄(1-X̄)/n 17/43
18 The Standard Error of the Sample Mean When the distribution of X is unknown but the observations are i.i.d. we can also more generally derive the variance of the sample mean as follows: Var(X̄_n) = (1/n²) Σ_i Var(X_i) = Var(X)/n. This again depends on an unknown parameter, Var(X), but one that we also have an estimator of: V̂ar(X) = 1/(n-1) Σ_{i=1}^n (X_i - X̄_n)², so that V̂ar(X̄_n) = V̂ar(X)/n and we get the standard error by taking the square root 18/43
19 Calculating Standard Errors
phat = mean(rbinom(100, 1, .54))
sqrt(phat * (1 - phat) / 100)      # estimated s.e.
sqrt(.54 * (1 - .54) / 100)        # theoretical s.e.: about 0.0498
sqrt(var(rnorm(100, 1, 2)) / 100)  # estimated s.e.
sqrt(2^2 / 100)                    # theoretical s.e.: 0.2
19/43
20 Bias vs Variance Suppose we have 1. an unbiased estimator with a large sampling variance 2. a biased estimator with a small sampling variance. Should we choose our best estimator based on bias, or on variance? 20/43
21 Bias vs Variance [Figure: the classic dartboard diagram contrasting low/high bias with low/high variance] 21/43
22 Bias vs Variance [Figure: sampling density of an estimator x̂ relative to the truth E[X] = 0] 22/43
23 Bias vs Variance [Figure: sampling density of another estimator x̂ relative to the truth E[X] = 0] 23/43
24 Mean Squared Error We may need to choose between two estimators one of which is unbiased. Consider the biased estimator: is the sampling variance (or the standard error) still a good measure?
Var(X̂) = E[(X̂ - E[X̂])²]
= E[(X̂ - (E[X̂] - E[X]) - E[X])²]
= E[(X̂ - E[X] - Bias)²]
Suppose Var(X̂_biased) < Var(X̂_unbiased): what would you conclude? 24/43
25 Mean Squared Error We are interested in the spread relative to the truth! This is called the Mean Squared Error (MSE): MSE = E[(X̂ - E[X])²]. We can show that
MSE = E[(X̂ - E[X])²]
= E[(X̂ - E[X̂] + E[X̂] - E[X])²]
= E[(X̂ - E[X̂])²] + (E[X̂] - E[X])²
= Var(X̂) + Bias²
There is a potential trade-off between Bias and Variance 25/43
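The decomposition MSE = Var + Bias² can be checked numerically. A Python sketch (the slides use R); the estimator 0.5 · x̄ of E[X] = 2 is a hypothetical biased example chosen for illustration:

```python
import random

# Sketch checking the decomposition MSE = Var + Bias^2 by simulation.
# The estimator 0.5 * xbar of E[X] = 2 is a hypothetical biased example.
random.seed(2)
n, nrep, mu = 10, 40_000, 2.0

est = []
for _ in range(nrep):
    x = [random.gauss(mu, 1.0) for _ in range(n)]
    est.append(0.5 * sum(x) / n)

mean_est = sum(est) / nrep
var_est = sum((e - mean_est) ** 2 for e in est) / nrep
bias = mean_est - mu
mse = sum((e - mu) ** 2 for e in est) / nrep

# The decomposition is an algebraic identity, so it holds exactly
# in the simulated draws as well (up to floating-point error)
print(abs(mse - (var_est + bias ** 2)) < 1e-9)  # True
```

Here the bias term (about (-1)² = 1) dominates the variance term (about 0.25/10), so the MSE is driven almost entirely by bias.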
26 Mean Squared Error Consider again the following two estimators of the variance: 1. V̂ar(X) = 1/(n-1) Σ_{i=1}^n (X_i - X̄_n)² 2. Ṽar(X) = 1/n Σ_{i=1}^n (X_i - X̄_n)². We saw that 1. is unbiased while 2. is not. How about the MSE? Consider the example on p.10 [Figure: bar chart of bias², variance and MSE for the two variance estimators] Here X ~ N(5, 3) and n = 20; try X ~ χ²(1) and vary n 26/43
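The MSE comparison from the bar chart can be reproduced in a few lines. A Python sketch (the slides use R) of the p.10 setup, X ~ N(5, 3) with n = 20; swapping in a χ²(1) draw and varying n is the suggested exercise:

```python
import random

# Sketch of the suggested exercise: compare the MSE of the 1/(n-1) and 1/n
# variance estimators by simulation; here X ~ N(5, 3) with n = 20 as on p.10.
random.seed(3)
n, nrep, sigma2 = 20, 40_000, 3.0

mse1 = mse2 = 0.0
for _ in range(nrep):
    x = [random.gauss(5.0, sigma2 ** 0.5) for _ in range(n)]
    xbar = sum(x) / n
    ss = sum((xi - xbar) ** 2 for xi in x)
    mse1 += (ss / (n - 1) - sigma2) ** 2  # unbiased 1/(n-1) estimator
    mse2 += (ss / n - sigma2) ** 2        # biased 1/n estimator

# Despite its bias, the 1/n estimator has the lower MSE in this setting
print(mse2 / nrep < mse1 / nrep)  # True
```

This is exactly the bias-variance trade-off: the 1/n estimator trades a small bias for a reduction in variance that more than compensates in MSE terms.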
27 Consistency We mentioned unbiasedness as an attractive property of an estimator. But unbiasedness is a finite sample property: it is silent on how close the estimate is to the truth, and a nonlinear function of an unbiased estimator is typically not unbiased. We will now consider consistency, which is a large sample property: consistent estimators converge to the truth as sample sizes grow large, and a nonlinear function of a consistent estimator is typically consistent 27/43
28 Consistency Consistency: Let θ̂_n be an estimator of θ based on a sample of size n. We call θ̂_n consistent if it gets closer and closer to θ as data accumulate, and write θ̂_n → θ. The precise definition is: lim_{n→∞} Pr(|θ̂_n - θ| > ε) = 0 for all ε > 0. Weak law of large numbers: If the X_i are i.i.d. random variables with E[|X_i|] < ∞, then (1/n) Σ_i X_i → E[X_i] 28/43
29 Consistency Consider sampling from a population of voters where X_i = 1 if person i supports the right and X_i = 0 if person i supports the left, and Pr(X_i = 1) = 0.54. Denote our data by x_1, ..., x_n. We estimate p by p̂ = (x_1 + ... + x_n)/n 29/43
30 Consistency [Figure: the sample proportion p̂(n) plotted against the sample size n] 30/43
31 Consistency [Figure: the sample proportion p̂(n) plotted against the sample size n] 31/43
32 Consistency [Figure: the sample proportion p̂(n) plotted against the sample size n] 32/43
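The convergence pictured in the p̂(n) plots can be reproduced with a short simulation. A Python sketch (the slides use R) of the voter example with p = 0.54:

```python
import random

# Sketch reproducing the phat(n) plots: the running sample proportion
# converges to p = 0.54 as the sample grows (WLLN in action).
random.seed(4)
p, support = 0.54, 0
for n in range(1, 100_001):
    support += random.random() < p  # one more sampled voter
    if n in (100, 10_000, 100_000):
        print(n, round(support / n, 3))
```

Early values of p̂(n) bounce around, but by n = 100,000 the running proportion sits within a fraction of a percentage point of 0.54.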
33 Biased and Consistent Consider U ~ Uniform[0, θ]. Then θ̂ = max(u_1, ..., u_n) is a biased estimator, since E[θ̂] = (n/(n+1)) θ, but it is consistent 33/43
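A simulation makes both claims concrete: the maximum sits below θ on average, but the bias vanishes as n grows. A Python sketch (the slides use R), with θ = 1 chosen for illustration:

```python
import random

# Sketch: for U ~ Uniform[0, theta], thetahat = max(u_1, ..., u_n) is biased,
# E[thetahat] = n/(n+1) * theta, but the bias vanishes as n grows.
random.seed(5)
theta, nrep = 1.0, 20_000

def mean_max(n):
    """Average of max(u_1, ..., u_n) over nrep simulated samples."""
    return sum(max(random.uniform(0, theta) for _ in range(n))
               for _ in range(nrep)) / nrep

m5, m100 = mean_max(5), mean_max(100)
print(round(m5, 2))    # about 5/6: clearly below theta = 1
print(round(m100, 2))  # about 100/101: nearly unbiased
```

The estimator can never exceed θ, so its errors are one-sided; consistency holds because the maximum of many uniforms piles up just below θ.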
34 Biased and Consistent [Figure: the estimate max(u_1, ..., u_n) plotted against the sample size n, approaching θ from below] 34/43
35 Biased vs Consistent Estimates of the mean: unbiased and consistent: X̄; unbiased and inconsistent: X_1; biased and consistent: X̄ + 1/n; biased and inconsistent: can you think of one? 35/43
36 Consistent Estimators Finding unbiased estimators is not so easy because even if E[θ̂] = θ, in general E[g(θ̂)] ≠ g(θ). For example, if we know that E[θ̂] = σ², then E[√θ̂] ≠ σ. Finding consistent estimators is much easier because of the WLLN and because functions and combinations of consistent estimators are often again consistent 36/43
37 Consistent Estimators Continuous Mapping Theorem (CMT): If g(·) is a continuous function and θ̂ a consistent estimator of θ, then g(θ̂) → g(θ). This means that if θ̂ → σ², then √θ̂ → σ 37/43
38 Consistent Estimators Suppose you want a consistent estimator of the variance of X: Var(X) = E[X²] - E[X]². By the WLLN you know that (1/n) Σ_i X_i → E[X] and (1/n) Σ_i X_i² → E[X²], by the CMT ((1/n) Σ_i X_i)² → E[X]², and therefore (1/n) Σ_i X_i² - ((1/n) Σ_i X_i)² → Var(X). This is an application of the Method of Moments 38/43
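The method-of-moments construction on this slide is easy to try out. A Python sketch (the slides use R), with X ~ N(0, σ = 2) chosen so that the target Var(X) = 4 is known:

```python
import random

# Sketch of the method-of-moments variance estimator from the slide:
# (1/n) * sum(x_i^2) - ((1/n) * sum(x_i))^2, consistent by the WLLN and CMT.
random.seed(6)
x = [random.gauss(0.0, 2.0) for _ in range(200_000)]  # so Var(X) = 4
n = len(x)

m1 = sum(x) / n                    # consistent for E[X]
m2 = sum(xi * xi for xi in x) / n  # consistent for E[X^2]
var_mm = m2 - m1 ** 2              # consistent for Var(X) by the CMT
print(round(var_mm, 1))  # about 4.0
```

Note this estimator equals the 1/n variance estimator from slide 26: it is biased in finite samples but consistent, exactly the pattern the last few slides describe.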
39 Summary With point estimation the objective is to estimate (compute a best guess of) a population parameter θ using our data. Parameters are things like: means, percentiles, minima, maxima, differences in means between groups, etc. Estimates differ across samples, and estimators are therefore random variables: estimators have a distribution 39/43
40 Summary To characterize an estimator we focused on two key properties of its sampling distribution: 1. location (unbiasedness, consistency) 2. spread (variance, MSE). Unbiasedness, E[θ̂] = θ, means that the expectation of our estimator equals the population parameter it intends to estimate. The expectation here is across infinitely many random samples, and unbiasedness means that we are correct on average. Unbiasedness is a finite sample property because it is true for samples of any size 40/43
41 Summary While being on target on average (location) is important, we never have this average estimate but a single one. We would therefore prefer to be close to the target in a given sample. This is more likely to happen if the spread of our estimator is small. A natural measure of spread is the variance: Var(θ̂) = E[(θ̂ - E[θ̂])²]. But for a biased estimator it measures the spread around the wrong location, since then E[θ̂] = θ + Bias 41/43
42 Summary This is why we turned to the Mean Squared Error (MSE): MSE(θ̂) = E[(θ̂ - θ)²], which measures the spread of the estimator θ̂ around the true parameter value θ. We saw that MSE = Variance + Bias², and that a trade-off between bias and variance can make us prefer a biased estimator over an unbiased one 42/43
43 Summary We often use consistent estimators because unbiased estimators are difficult to find or may not exist. Consistent estimators can be biased in small samples, but converge to the population parameter as more data become available: θ̂ → θ. The Weak Law of Large Numbers says that with random sampling, sample averages are consistent estimators of the corresponding population averages. We can often combine consistent estimators to construct new consistent estimators. Consistency is a large sample property 43/43
Simulation Methods Chapter 13 of Chris Brook s Book Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 April 26, 2017 Christopher
More informationHomework Problems Stat 479
Chapter 10 91. * A random sample, X1, X2,, Xn, is drawn from a distribution with a mean of 2/3 and a variance of 1/18. ˆ = (X1 + X2 + + Xn)/(n-1) is the estimator of the distribution mean θ. Find MSE(
More informationScenario Generation and Sampling Methods
Scenario Generation and Sampling Methods Güzin Bayraksan Tito Homem-de-Mello SVAN 2016 IMPA May 9th, 2016 Bayraksan (OSU) & Homem-de-Mello (UAI) Scenario Generation and Sampling SVAN IMPA May 9 1 / 30
More informationWeek 2 Quantitative Analysis of Financial Markets Hypothesis Testing and Confidence Intervals
Week 2 Quantitative Analysis of Financial Markets Hypothesis Testing and Confidence Intervals Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg :
More informationChapter 6: Point Estimation
Chapter 6: Point Estimation Professor Sharabati Purdue University March 10, 2014 Professor Sharabati (Purdue University) Point Estimation Spring 2014 1 / 37 Chapter Overview Point estimator and point estimate
More informationTutorial 6. Sampling Distribution. ENGG2450A Tutors. 27 February The Chinese University of Hong Kong 1/6
Tutorial 6 Sampling Distribution ENGG2450A Tutors The Chinese University of Hong Kong 27 February 2017 1/6 Random Sample and Sampling Distribution 2/6 Random sample Consider a random variable X with distribution
More informationSTATS 200: Introduction to Statistical Inference. Lecture 4: Asymptotics and simulation
STATS 200: Introduction to Statistical Inference Lecture 4: Asymptotics and simulation Recap We ve discussed a few examples of how to determine the distribution of a statistic computed from data, assuming
More informationChapter 7 Study Guide: The Central Limit Theorem
Chapter 7 Study Guide: The Central Limit Theorem Introduction Why are we so concerned with means? Two reasons are that they give us a middle ground for comparison and they are easy to calculate. In this
More information14.30 Introduction to Statistical Methods in Economics Spring 2009
MIT OpenCourseWare http://ocw.mit.edu 14.30 Introduction to Statistical Methods in Economics Spring 2009 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
More informationMath 5760/6890 Introduction to Mathematical Finance
Math 5760/6890 Introduction to Mathematical Finance Instructor: Jingyi Zhu Office: LCB 335 Telephone:581-3236 E-mail: zhu@math.utah.edu Class web page: www.math.utah.edu/~zhu/5760_12f.html What you should
More informationIEOR E4703: Monte-Carlo Simulation
IEOR E4703: Monte-Carlo Simulation Simulation Efficiency and an Introduction to Variance Reduction Methods Martin Haugh Department of Industrial Engineering and Operations Research Columbia University
More informationSection 2: Estimation, Confidence Intervals and Testing Hypothesis
Section 2: Estimation, Confidence Intervals and Testing Hypothesis Tengyuan Liang, Chicago Booth https://tyliang.github.io/bus41000/ Suggested Reading: Naked Statistics, Chapters 7, 8, 9 and 10 OpenIntro
More informationChapter 4: Estimation
Slide 4.1 Chapter 4: Estimation Estimation is the process of using sample data to draw inferences about the population Sample information x, s Inferences Population parameters µ,σ Slide 4. Point and interval
More informationFinancial Risk Forecasting Chapter 9 Extreme Value Theory
Financial Risk Forecasting Chapter 9 Extreme Value Theory Jon Danielsson 2017 London School of Economics To accompany Financial Risk Forecasting www.financialriskforecasting.com Published by Wiley 2011
More informationMath489/889 Stochastic Processes and Advanced Mathematical Finance Homework 5
Math489/889 Stochastic Processes and Advanced Mathematical Finance Homework 5 Steve Dunbar Due Fri, October 9, 7. Calculate the m.g.f. of the random variable with uniform distribution on [, ] and then
More informationSampling and sampling distribution
Sampling and sampling distribution September 12, 2017 STAT 101 Class 5 Slide 1 Outline of Topics 1 Sampling 2 Sampling distribution of a mean 3 Sampling distribution of a proportion STAT 101 Class 5 Slide
More information1. Variability in estimates and CLT
Unit3: Foundationsforinference 1. Variability in estimates and CLT Sta 101 - Fall 2015 Duke University, Department of Statistical Science Dr. Çetinkaya-Rundel Slides posted at http://bit.ly/sta101_f15
More informationModelling Returns: the CER and the CAPM
Modelling Returns: the CER and the CAPM Carlo Favero Favero () Modelling Returns: the CER and the CAPM 1 / 20 Econometric Modelling of Financial Returns Financial data are mostly observational data: they
More informationModeling Portfolios that Contain Risky Assets Stochastic Models I: One Risky Asset
Modeling Portfolios that Contain Risky Assets Stochastic Models I: One Risky Asset C. David Levermore University of Maryland, College Park Math 420: Mathematical Modeling March 25, 2014 version c 2014
More informationLecture 22. Survey Sampling: an Overview
Math 408 - Mathematical Statistics Lecture 22. Survey Sampling: an Overview March 25, 2013 Konstantin Zuev (USC) Math 408, Lecture 22 March 25, 2013 1 / 16 Survey Sampling: What and Why In surveys sampling
More information10/1/2012. PSY 511: Advanced Statistics for Psychological and Behavioral Research 1
PSY 511: Advanced Statistics for Psychological and Behavioral Research 1 Pivotal subject: distributions of statistics. Foundation linchpin important crucial You need sampling distributions to make inferences:
More informationMLLunsford 1. Activity: Central Limit Theorem Theory and Computations
MLLunsford 1 Activity: Central Limit Theorem Theory and Computations Concepts: The Central Limit Theorem; computations using the Central Limit Theorem. Prerequisites: The student should be familiar with
More informationSection 1.3: More Probability and Decisions: Linear Combinations and Continuous Random Variables
Section 1.3: More Probability and Decisions: Linear Combinations and Continuous Random Variables Jared S. Murray The University of Texas at Austin McCombs School of Business OpenIntro Statistics, Chapters
More information