Module 4: Point Estimation Statistics (OA3102)
1 Module 4: Point Estimation, Statistics (OA3102)
Professor Ron Fricker, Naval Postgraduate School, Monterey, California
Reading assignment: WM&S chapter
2 Goals for this Module
- Define and distinguish between point estimates vs. point estimators
- Discuss characteristics of good point estimators:
  - Unbiasedness and minimum variance
  - Mean square error
  - Consistency, efficiency, robustness
- Quantify and calculate the precision of an estimator via the standard error
- Discuss the bootstrap as a way to empirically estimate standard errors
3 Welcome to Statistical Inference!
- Problem: We have a simple random sample of data and want to use it to estimate a population quantity (usually a parameter of a distribution)
- In point estimation, the estimate is a number
- Issue: Often there are lots of possible estimates; e.g., should we estimate E(X) with the sample mean, the sample median, or something else?
- This module: What's a good point estimate?
- Module 5: Interval estimators
- Module 6: Methods for finding good estimators
4 Point Estimation
- A point estimate of a parameter θ is a single number that is a sensible value for θ, i.e., a numerical estimate of θ
- We'll use θ to represent a generic parameter; it could be μ, σ, p, etc.
- The point estimate is a statistic calculated from a sample of data; the statistic is called a point estimator
- Using hat notation, we will denote it as θ̂
- For example, we might use x̄ to estimate μ, so in this case μ̂ = x̄
5 Definition: Estimator
An estimator is a rule, often expressed as a formula, that tells how to calculate the value of an estimate based on the measurements contained in a sample
6 An Example
- You're testing a new missile and want to estimate the probability of kill (against a particular target under specific conditions)
- You do a test with n = 25 shots; the parameter to be estimated is p_k, the probability of kill
- Let X be the number of kills; in your test you observed x = 15
- A reasonable estimator and estimate:
  estimator: p̂_k = X/n
  estimate: p̂_k = x/n = 15/25 = 0.6
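As a quick sketch (numbers taken from the slide), the estimator p̂_k = X/n becomes an estimate by plugging in the observed count:

```python
# Kill-probability estimate from the slide's test: n = 25 shots, x = 15 kills
n = 25
x = 15
p_hat = x / n  # the estimator X/n evaluated at the observed data
print(p_hat)   # 0.6
```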
7 A More Difficult Example
- On another test, you're estimating the mean time to failure (MTTF) of a piece of electronic equipment
- You have measurements for n = 20 tests (in units of 1,000 hrs)
- It turns out a normal distribution fits the data quite well
- So, what we want to do is estimate μ, the MTTF
- How best to do this?
8 Example, cont'd
- Here are some possible estimators for μ and their values for this data set:
  (1) μ̂ = X̄, the sample mean: x̄ = (1/20) Σ x_i
  (2) μ̂ = [min(X_i) + max(X_i)]/2, the midrange
  (3) μ̂ = X̃_e, the sample median
  (4) μ̂ = X̄_tr(10), the 10% trimmed mean: x̄_tr(10) = (1/16) Σ_{i=3}^{18} x_(i)
- Which estimator should you use? I.e., which is likely to give estimates closer to the true (but unknown) population value?
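The slide's 20 measurements don't survive in this transcription, so here is a sketch of the four competing estimators on a hypothetical sample of n = 20:

```python
import statistics

# Hypothetical sample of 20 failure times (the slide's actual data is not shown)
x = [1.1, 1.4, 1.3, 1.6, 1.0, 2.1, 1.9, 1.2, 1.5, 1.7,
     1.8, 1.3, 1.4, 2.0, 1.1, 1.6, 1.5, 1.2, 1.9, 1.4]

mean     = sum(x) / len(x)                 # (1) sample mean
midrange = (min(x) + max(x)) / 2           # (2) midrange
median   = statistics.median(x)            # (3) sample median
xs = sorted(x)
trimmed  = sum(xs[2:-2]) / (len(x) - 4)    # (4) 10% trimmed mean: drop 2 from each end
```

All four are sensible estimators of the same μ; which is best depends on the population, which is the point of the slide's question.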
9 Another Example
- In a wargame computer simulation, you want to estimate a scenario's run-time variability (σ²)
- The run times (in seconds) for n = 8 runs are recorded
- Two possible estimators:
  (1) σ̂² = S² = (1/(n−1)) Σ_{i=1}^{n} (X_i − X̄)²
  (2) σ̂² = (1/n) Σ_{i=1}^{n} (X_i − X̄)²
- Why prefer (1) over (2)?
10 Bias
- Definition: Let θ̂ be a point estimator for a parameter θ. θ̂ is an unbiased estimator if E(θ̂) = θ. If E(θ̂) ≠ θ, θ̂ is said to be biased
- Definition: The bias of a point estimator θ̂ is given by B(θ̂) = E(θ̂) − θ
* Figures from Probability and Statistics for Engineering and the Sciences, 7th ed., Duxbury Press, 2008.
11 Proving Unbiasedness
- Proposition: Let X be a binomial r.v. with parameters n and p. The sample proportion p̂ = X/n is an unbiased estimator of p.
- Proof: E(p̂) = E(X/n) = E(X)/n = np/n = p
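A Monte Carlo check of the proposition (a sketch with arbitrary n and p): averaging p̂ = X/n over many simulated binomial experiments should come out close to p.

```python
import random

random.seed(1)
n, p, reps = 25, 0.6, 100_000
total = 0.0
for _ in range(reps):
    x = sum(random.random() < p for _ in range(n))  # one Bin(n, p) draw
    total += x / n                                  # the sample proportion p-hat
avg_p_hat = total / reps  # should be close to p, since E(p-hat) = p
```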
12 Remember: Rules for Linear Combinations of Random Variables
- For random variables X_1, X_2, ..., X_n, whether or not the X_i's are independent:
  E(a_1 X_1 + a_2 X_2 + ... + a_n X_n) = a_1 E(X_1) + a_2 E(X_2) + ... + a_n E(X_n)
- If X_1, X_2, ..., X_n are independent:
  Var(a_1 X_1 + a_2 X_2 + ... + a_n X_n) = a_1² Var(X_1) + a_2² Var(X_2) + ... + a_n² Var(X_n)
13 Example 8.1
- Let Y_1, Y_2, ..., Y_n be a random sample with E(Y_i) = μ and Var(Y_i) = σ². Show that S'² = (1/n) Σ_{i=1}^{n} (Y_i − Ȳ)² is a biased estimator of σ², while S² = (1/(n−1)) Σ_{i=1}^{n} (Y_i − Ȳ)² is an unbiased estimator of σ².
- Solution:
14 Example 8.1 (continued)
15 Example 8.1 (continued)
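The continuation slides are board work; a sketch of the standard derivation, using the linear-combination rules from slide 12:

```latex
\begin{aligned}
\sum_{i=1}^{n}(Y_i-\bar Y)^2 &= \sum_{i=1}^{n} Y_i^2 - n\bar Y^2, \\
E\left[\sum_{i=1}^{n}(Y_i-\bar Y)^2\right]
  &= \sum_{i=1}^{n} E(Y_i^2) - n\,E(\bar Y^2)
   = n(\sigma^2+\mu^2) - n\left(\frac{\sigma^2}{n}+\mu^2\right)
   = (n-1)\sigma^2.
\end{aligned}
```

Hence E(S'²) = (n−1)σ²/n, giving bias B(S'²) = −σ²/n, while E(S²) = (n−1)σ²/(n−1) = σ², so S² is unbiased.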
16 Another Biased Estimator
- Let X be the reaction time to a stimulus with X ~ U[0, θ], where we want to estimate θ based on a random sample X_1, X_2, ..., X_n
- Since θ is the largest possible reaction time, consider the estimator θ̂_1 = max(X_1, X_2, ..., X_n)
- However, unbiasedness would imply that we can observe values of θ̂_1 both bigger and smaller than θ, yet here every observation satisfies max(X_1, ..., X_n) ≤ θ. Why?
- Thus, θ̂_1 must be a biased estimator
17 Fixing the Biased Estimator
- For the same problem, consider the estimator θ̂_2 = ((n+1)/n) · max(X_1, X_2, ..., X_n)
- Show this estimator is unbiased
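A simulation sketch of both estimators for X ~ U[0, θ] (θ and n chosen arbitrarily): the raw maximum underestimates θ, while the (n+1)/n correction removes the bias.

```python
import random

random.seed(7)
theta, n, reps = 10.0, 5, 100_000
raw_sum, adj_sum = 0.0, 0.0
for _ in range(reps):
    m = max(random.uniform(0, theta) for _ in range(n))
    raw_sum += m                   # theta-hat-1 = max(X_1, ..., X_n)
    adj_sum += (n + 1) / n * m     # theta-hat-2 = ((n+1)/n) * max
raw_mean = raw_sum / reps   # ≈ theta * n/(n+1) ≈ 8.33: biased low
adj_mean = adj_sum / reps   # ≈ theta = 10: unbiased
```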
19 One Criterion for Choosing Among Estimators
- Principle of minimum variance unbiased estimation: Among all estimators of θ that are unbiased, choose the one that has the minimum variance
- The resulting estimator is called the minimum variance unbiased estimator (MVUE) of θ
- In the figure, estimator θ̂_1 is preferred to θ̂_2
* Figure from Probability and Statistics for Engineering and the Sciences, 7th ed., Duxbury Press, 2008.
20 Example of an MVUE
- Let X_1, X_2, ..., X_n be a random sample from a normal distribution with parameters μ and σ. Then the estimator μ̂ = X̄ is the MVUE for μ
- Proof beyond the scope of the class
- Note this only applies to the normal distribution; when estimating the population mean E(X) = μ for other distributions, X̄ may not be the appropriate estimator
- E.g., for the Cauchy distribution E(X) does not even exist!
21 How Variable is My Point Estimate? The Standard Error
- The precision of a point estimate is given by its standard error
- The standard error of an estimator is its standard deviation: σ_θ̂ = √Var(θ̂)
- If the standard error itself involves unknown parameters whose values are estimated, substituting these estimates into σ_θ̂ yields the estimated standard error
- The estimated standard error is denoted by s_θ̂ or σ̂_θ̂
22 Deriving Some Standard Errors (1)
- Proposition: If Y_1, Y_2, ..., Y_n are distributed iid with variance σ², then for a sample of size n, Var(Ȳ) = σ²/n. Thus σ_Ȳ = σ/√n.
- Proof: Var(Ȳ) = Var((1/n) Σ Y_i) = (1/n²) Σ Var(Y_i) = (1/n²)(nσ²) = σ²/n
23 Deriving Some Standard Errors (2)
- Proposition: If Y ~ Bin(n, p) and p̂ = Y/n, then σ_p̂ = √(pq/n), where q = 1 − p.
- Proof: Var(p̂) = Var(Y/n) = Var(Y)/n² = npq/n² = pq/n
24 Expected Values and Standard Errors of Some Common Point Estimators

  Target parameter θ | Sample size(s) | Point estimator θ̂ | E(θ̂)     | Standard error σ_θ̂
  μ                  | n              | Ȳ                 | μ         | σ/√n
  p                  | n              | p̂ = Y/n          | p         | √(pq/n)
  μ_1 − μ_2          | n_1 and n_2    | Ȳ_1 − Ȳ_2        | μ_1 − μ_2 | √(σ_1²/n_1 + σ_2²/n_2)
  p_1 − p_2          | n_1 and n_2    | p̂_1 − p̂_2       | p_1 − p_2 | √(p_1 q_1/n_1 + p_2 q_2/n_2)

(The two-sample standard errors assume the populations are independent.)
25 However, Unbiased Estimators Aren't Always to be Preferred
- Sometimes an estimator with a small bias can be preferred to an unbiased estimator
- More detailed discussion is beyond the scope of the course; just know unbiasedness isn't necessarily required for a good estimator
26 Mean Square Error
- Definition: The mean square error (MSE) of a point estimator θ̂ is MSE(θ̂) = E[(θ̂ − θ)²]
- The MSE of an estimator is a function of both its variance and its bias
- I.e., it can be shown (extra credit problem) that MSE(θ̂) = E[(θ̂ − θ)²] = Var(θ̂) + [B(θ̂)]²
- So, for unbiased estimators, MSE(θ̂) = Var(θ̂)
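A Monte Carlo sketch of the decomposition, using the biased variance estimator S'² from Example 8.1 with normal data (n and σ² chosen arbitrarily):

```python
import random
import statistics

# Check MSE(est) = Var(est) + B(est)^2 for S'^2 = (1/n) sum (Y_i - Ybar)^2,
# estimating sigma^2 = 1 from samples of n = 10 standard normals.
random.seed(11)
n, sigma2, reps = 10, 1.0, 50_000
ests = []
for _ in range(reps):
    y = [random.gauss(0, 1) for _ in range(n)]
    ybar = sum(y) / n
    ests.append(sum((yi - ybar) ** 2 for yi in y) / n)  # S'^2
mse  = sum((e - sigma2) ** 2 for e in ests) / reps      # E[(est - theta)^2]
var  = statistics.pvariance(ests)                       # Var(est)
bias = sum(ests) / reps - sigma2                        # ≈ -sigma^2/n = -0.1
# mse ≈ var + bias**2
```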
27 Error of Estimation
- Definition: The error of estimation ε is the distance between an estimator and its target parameter: ε = |θ̂ − θ|
- Since θ̂ is a random variable, so is the error of estimation ε
- But we can bound the error:
  Pr(|θ̂ − θ| < b) = Pr(−b < θ̂ − θ < b) = Pr(θ − b < θ̂ < θ + b)
28 Bounding the Error of Estimation
- Tchebysheff's Theorem: Let Y be a random variable with finite mean μ and variance σ². Then for any k > 0, Pr(|Y − μ| < kσ) ≥ 1 − 1/k²
- Note that this holds for any distribution, and it is a (generally conservative) bound
- E.g., for any distribution we're guaranteed that the probability Y is within 2 standard deviations of its mean is at least 0.75
- So, for unbiased estimators, a good bound to use on the error of estimation is b = 2σ_θ̂
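A quick empirical check of the 2-standard-deviation bound, using an exponential distribution as an arbitrary skewed example (Exp(1) has mean 1 and standard deviation 1):

```python
import random

random.seed(3)
mu, sigma, reps = 1.0, 1.0, 100_000
within = sum(abs(random.expovariate(1.0) - mu) < 2 * sigma
             for _ in range(reps)) / reps
# Tchebysheff guarantees at least 0.75; the actual probability here is
# P(X < 3) = 1 - e^-3 ≈ 0.95, showing how conservative the bound can be
```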
29 Example 8.2
- In a sample of n = 1,000 randomly selected voters, y = 560 are in favor of candidate Jones. Estimate p, the fraction of voters in the population favoring Jones, and put a 2-s.e. bound on the error of estimation.
- Solution: p̂ = y/n = 560/1,000 = 0.56, with estimated standard error √(p̂q̂/n) = √(0.56 × 0.44/1,000) ≈ 0.0157, so the 2-s.e. bound is b ≈ 0.031
30 Example 8.2 (continued)
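Example 8.2 worked through directly in code:

```python
import math

# n = 1,000 voters sampled, y = 560 favor Jones
n, y = 1000, 560
p_hat = y / n                             # estimate of p
se = math.sqrt(p_hat * (1 - p_hat) / n)   # estimated standard error of p-hat
bound = 2 * se                            # 2-s.e. bound on the error of estimation
print(p_hat, round(bound, 3))             # 0.56 0.031
```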
31 Example 8.3
- Car tire durability was measured on samples of two types of tires, n_1 = n_2 = 100. The number of miles until wear-out was recorded, with the following results:
  ȳ_1 = 26,400 miles, s_1² = 1,440,000 miles²
  ȳ_2 = 25,100 miles, s_2² = 1,960,000 miles²
- Estimate the difference in mean miles to wear-out and put a 2-s.e. bound on the error of estimation
32 Example 8.3
- Solution: ȳ_1 − ȳ_2 = 26,400 − 25,100 = 1,300 miles, with estimated standard error √(s_1²/n_1 + s_2²/n_2) = √(14,400 + 19,600) ≈ 184.4 miles, so the 2-s.e. bound is about 369 miles
33 Example 8.3 (continued)
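Example 8.3 worked through directly in code:

```python
import math

# Two independent samples of tires, with the slide's summary statistics
n1 = n2 = 100
ybar1, ybar2 = 26_400, 25_100
s1_sq, s2_sq = 1_440_000, 1_960_000
diff = ybar1 - ybar2                     # estimated difference in mean wear-out
se = math.sqrt(s1_sq / n1 + s2_sq / n2)  # estimated standard error of the difference
bound = 2 * se                           # 2-s.e. bound on the error of estimation
```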
34 Other Properties of Good Estimators
- An estimator is efficient if it has a small standard deviation compared to other unbiased estimators
- An estimator is robust if it is not sensitive to outliers, distributional assumptions, etc.; that is, robust estimators work reasonably well under a wide variety of conditions
- An estimator θ̂_n is consistent if Pr(|θ̂_n − θ| > ε) → 0 as n → ∞, for every ε > 0
- For more detail, see the chapter
35 A Useful Aside: Using the Bootstrap to Empirically Estimate Standard Errors
The Hard Way to Empirically Estimate Standard Errors
- Draw multiple (R) samples x_1, x_2, ..., x_R from the population ~F, where x_i = {x_1i, x_2i, ..., x_ni}
- Calculate multiple parameter estimates θ̂(x_1), θ̂(x_2), ..., θ̂(x_R)
- Estimate the s.e. of the parameter using the std. dev. of the estimates:
  s.e.[θ̂(X)] ≈ √( (1/(R−1)) Σ_{i=1}^{R} (θ̂(x_i) − θ̄)² ), where θ̄ = (1/R) Σ_{i=1}^{R} θ̂(x_i)
36 The Bootstrap
- The hard way is either not possible or is wasteful in practice
- The bootstrap is:
  - Useful when you don't know, or worse, simply cannot analytically derive, the sampling distribution
  - A computer-intensive method to empirically estimate the sampling distribution
  - Only recently feasible, with the widespread availability of significant computing power
37 Plug-in Principle
- We've been doing this throughout the class: if you need a parameter for a calculation, simply plug in the equivalent statistic
- For example, we defined Var(X) = E[(X − E(X))²], and then we sometimes did the calculation using X̄ for E(X)
- Relevant for the bootstrap, as we will plug in the empirical distribution in place of the population distribution
38 Empirically Estimating Standard Errors Using the Bootstrap
- Start with the data x = {x_1, x_2, ..., x_n}, whose empirical distribution F̂ estimates F
- Draw multiple (B) resamples x*_1, x*_2, ..., x*_B from the data, where x*_i = {x*_1i, x*_2i, ..., x*_ni}
- Calculate multiple bootstrap estimates θ̂(x*_1), θ̂(x*_2), ..., θ̂(x*_B)
- Estimate the s.e. from the bootstrap estimates:
  ŝ.e. = √( (1/(B−1)) Σ_{i=1}^{B} (θ̂(x*_i) − θ̄*)² ), where θ̄* = (1/B) Σ_{i=1}^{B} θ̂(x*_i)
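A minimal bootstrap sketch for the standard error of a sample mean (the data values are hypothetical); for the mean specifically, we can also compare against the analytic s/√n:

```python
import random
import statistics

random.seed(42)
x = [2.1, 3.4, 1.8, 4.2, 2.9, 3.7, 2.5, 3.1]  # hypothetical sample, n = 8
B = 5000
boot_means = []
for _ in range(B):
    resample = [random.choice(x) for _ in x]   # draw with replacement, size n
    boot_means.append(sum(resample) / len(resample))
se_boot = statistics.stdev(boot_means)          # bootstrap standard error

# For the mean we know the answer analytically, so we can sanity-check:
se_formula = statistics.stdev(x) / len(x) ** 0.5
```

The bootstrap shines when no formula like s/√n exists for the statistic of interest (medians, trimmed means, ratios, etc.): the procedure above is unchanged except for the estimate computed on each resample.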
39 Some Key Ideas
- Bootstrap samples are drawn with replacement from the empirical distribution
- So, observations can occur in a bootstrap sample more frequently than they occurred in the actual sample
- The empirical distribution substitutes for the actual population distribution
- Can draw lots of bootstrap samples from the empirical distribution to calculate the statistic of interest; make B as big as you can run in a reasonable timeframe
- Bootstrap resamples are of the same size as the original sample (n)
- Because this is all empirical, we don't need to analytically solve for the sampling distribution of the statistic of interest
40 What We Covered in this Module
- Defined and distinguished between point estimates vs. point estimators
- Discussed characteristics of good point estimators:
  - Unbiasedness and minimum variance
  - Mean square error
  - Consistency, efficiency, robustness
- Quantified and calculated the precision of an estimator via the standard error
- Discussed the bootstrap as a way to empirically estimate standard errors
41 Homework
- WM&S chapter: required exercises 2, 8, 21, 23, 27; extra credit 1, 6
- Useful hints:
  - Problem 8.2: Don't just give the obvious answer; show why it's true mathematically
  - Problem 8.8: Don't do the calculations for the estimator θ̂_4
  - Extra credit problem 8.6: The a term is a constant with 0 ≤ a ≤ 1
More informationNumerical Descriptive Measures. Measures of Center: Mean and Median
Steve Sawin Statistics Numerical Descriptive Measures Having seen the shape of a distribution by looking at the histogram, the two most obvious questions to ask about the specific distribution is where
More informationWeek 7 Quantitative Analysis of Financial Markets Simulation Methods
Week 7 Quantitative Analysis of Financial Markets Simulation Methods Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036 November
More informationThe Assumption(s) of Normality
The Assumption(s) of Normality Copyright 2000, 2011, 2016, J. Toby Mordkoff This is very complicated, so I ll provide two versions. At a minimum, you should know the short one. It would be great if you
More informationStatistical Intervals (One sample) (Chs )
7 Statistical Intervals (One sample) (Chs 8.1-8.3) Confidence Intervals The CLT tells us that as the sample size n increases, the sample mean X is close to normally distributed with expected value µ and
More informationDiscrete Random Variables and Probability Distributions. Stat 4570/5570 Based on Devore s book (Ed 8)
3 Discrete Random Variables and Probability Distributions Stat 4570/5570 Based on Devore s book (Ed 8) Random Variables We can associate each single outcome of an experiment with a real number: We refer
More informationUNIVERSITY OF VICTORIA Midterm June 2014 Solutions
UNIVERSITY OF VICTORIA Midterm June 04 Solutions NAME: STUDENT NUMBER: V00 Course Name & No. Inferential Statistics Economics 46 Section(s) A0 CRN: 375 Instructor: Betty Johnson Duration: hour 50 minutes
More informationWeek 2 Quantitative Analysis of Financial Markets Hypothesis Testing and Confidence Intervals
Week 2 Quantitative Analysis of Financial Markets Hypothesis Testing and Confidence Intervals Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg :
More informationExpected Value and Variance
Expected Value and Variance MATH 472 Financial Mathematics J Robert Buchanan 2018 Objectives In this lesson we will learn: the definition of expected value, how to calculate the expected value of a random
More informationChapter 7 Sampling Distributions and Point Estimation of Parameters
Chapter 7 Sampling Distributions and Point Estimation of Parameters Part 1: Sampling Distributions, the Central Limit Theorem, Point Estimation & Estimators Sections 7-1 to 7-2 1 / 25 Statistical Inferences
More informationBoth the quizzes and exams are closed book. However, For quizzes: Formulas will be provided with quiz papers if there is any need.
Both the quizzes and exams are closed book. However, For quizzes: Formulas will be provided with quiz papers if there is any need. For exams (MD1, MD2, and Final): You may bring one 8.5 by 11 sheet of
More informationLinear functions Increasing Linear Functions. Decreasing Linear Functions
3.5 Increasing, Decreasing, Max, and Min So far we have been describing graphs using quantitative information. That s just a fancy way to say that we ve been using numbers. Specifically, we have described
More informationSTAT 201 Chapter 6. Distribution
STAT 201 Chapter 6 Distribution 1 Random Variable We know variable Random Variable: a numerical measurement of the outcome of a random phenomena Capital letter refer to the random variable Lower case letters
More informationEcon 300: Quantitative Methods in Economics. 11th Class 10/19/09
Econ 300: Quantitative Methods in Economics 11th Class 10/19/09 Statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write. --H.G. Wells discuss test [do
More informationChapter 3 Discrete Random Variables and Probability Distributions
Chapter 3 Discrete Random Variables and Probability Distributions Part 4: Special Discrete Random Variable Distributions Sections 3.7 & 3.8 Geometric, Negative Binomial, Hypergeometric NOTE: The discrete
More informationSection 2: Estimation, Confidence Intervals and Testing Hypothesis
Section 2: Estimation, Confidence Intervals and Testing Hypothesis Carlos M. Carvalho The University of Texas at Austin McCombs School of Business http://faculty.mccombs.utexas.edu/carlos.carvalho/teaching/
More informationE509A: Principle of Biostatistics. GY Zou
E509A: Principle of Biostatistics (Week 2: Probability and Distributions) GY Zou gzou@robarts.ca Reporting of continuous data If approximately symmetric, use mean (SD), e.g., Antibody titers ranged from
More informationThe Simple Regression Model
Chapter 2 Wooldridge: Introductory Econometrics: A Modern Approach, 5e Definition of the simple linear regression model "Explains variable in terms of variable " Intercept Slope parameter Dependent var,
More informationChapter 16. Random Variables. Copyright 2010, 2007, 2004 Pearson Education, Inc.
Chapter 16 Random Variables Copyright 2010, 2007, 2004 Pearson Education, Inc. Expected Value: Center A random variable is a numeric value based on the outcome of a random event. We use a capital letter,
More informationA useful modeling tricks.
.7 Joint models for more than two outcomes We saw that we could write joint models for a pair of variables by specifying the joint probabilities over all pairs of outcomes. In principal, we could do this
More information12 The Bootstrap and why it works
12 he Bootstrap and why it works For a review of many applications of bootstrap see Efron and ibshirani (1994). For the theory behind the bootstrap see the books by Hall (1992), van der Waart (2000), Lahiri
More informationReview: Population, sample, and sampling distributions
Review: Population, sample, and sampling distributions A population with mean µ and standard deviation σ For instance, µ = 0, σ = 1 0 1 Sample 1, N=30 Sample 2, N=30 Sample 100000000000 InterquartileRange
More informationStat 139 Homework 2 Solutions, Fall 2016
Stat 139 Homework 2 Solutions, Fall 2016 Problem 1. The sum of squares of a sample of data is minimized when the sample mean, X = Xi /n, is used as the basis of the calculation. Define g(c) as a function
More information