Random Samples. Mathematics 47: Lecture 6. Dan Sloughter. Furman University. March 13, 2006



Random sampling

Definition. We call a sequence $X_1, X_2, \dots, X_n$ of random variables a sample. We call a sample $X_1, X_2, \dots, X_n$ a random sample if $X_1, X_2, \dots, X_n$ are independent and identically distributed. If $X_1, X_2, \dots, X_n$ is a random sample, $g : \mathbb{R}^n \to \mathbb{R}$, and $W = g(X_1, X_2, \dots, X_n)$, we call $W$ a statistic.

Note: A random sample corresponds to repeated independent experiments. When sampling from a finite population (such as balls in an urn, or voters in a state), this would mean sampling with replacement. In particular, sampling without replacement does not, by this definition, produce a random sample. However, sampling without replacement from a finite population is often referred to as random sampling, even though it does not produce a random sample.
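As a concrete illustration (not part of the original lecture), a random sample can be simulated as repeated independent draws from one distribution, and any function of those draws is a statistic. A minimal Python sketch, using the standard library's `random` module; the function names and the choice of an exponential distribution are illustrative assumptions:

```python
import random

def random_sample(n, draw):
    """Simulate a random sample: n independent, identically
    distributed draws from the same distribution."""
    return [draw() for _ in range(n)]

def sample_mean(xs):
    """A statistic W = g(X_1, ..., X_n): here, the sample mean."""
    return sum(xs) / len(xs)

random.seed(0)
# n = 1000 i.i.d. draws from an exponential distribution with rate 2
sample = random_sample(1000, lambda: random.expovariate(2.0))
w = sample_mean(sample)  # the statistic's value for this realized sample
```

Each call to `draw()` is an independent repetition of the same experiment, which is exactly what the definition of a random sample requires.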

Sampling

The question of how one actually obtains a random sample, or performs random sampling in the case of a finite population, is not a simple one. For scientific procedures, such as measuring the lifetime of a light bulb, the process is a matter of carefully replicating a given experiment. Sampling a human population is far more difficult. Those who must do this routinely, such as the Bureau of Labor Statistics, develop sophisticated multilayer sampling procedures, the properties of which may be analyzed mathematically.

Sample (cont'd)

Samples drawn from voluntary responses, such as call-in or mail-in surveys, are not examples of random sampling. Famous examples of large samples producing erroneous results abound, such as the 1936 Literary Digest prediction that Alf Landon would defeat Franklin Roosevelt in the presidential election. On the other hand, when done properly, random sampling can produce results that are even better than attempts at complete population enumeration; the ongoing difficulty of producing an accurate census of the population of the United States is an example.

Likelihood

Definition. Suppose $X_1, X_2, \dots, X_n$ is a sample with joint probability function $f$ having parameters $\theta_1, \theta_2, \dots, \theta_m$. We call $L$ a likelihood function for $X_1, X_2, \dots, X_n$ if

$$L(\theta_1, \theta_2, \dots, \theta_m) = k\, f(x_1, x_2, \dots, x_n \mid \theta_1, \theta_2, \dots, \theta_m)$$

for some constant $k$.

Note: given observations $x_1, x_2, \dots, x_n$, $L$ contains all the information about $\theta_1, \theta_2, \dots, \theta_m$. Moreover, if $X_1, X_2, \dots, X_n$ is a random sample and each $X_i$, $i = 1, 2, \dots, n$, has probability function $f$, then

$$L(\theta_1, \theta_2, \dots, \theta_m) \propto \prod_{i=1}^n f(x_i \mid \theta_1, \theta_2, \dots, \theta_m).$$

Example

Suppose $X_1, X_2, \dots, X_n$ is a random sample from an exponential distribution,

$$f(x \mid \lambda) = \begin{cases} \lambda e^{-\lambda x}, & \text{if } x > 0, \\ 0, & \text{otherwise.} \end{cases}$$

Then

$$\prod_{i=1}^n f(x_i \mid \lambda) = \begin{cases} \lambda^n e^{-\lambda \sum_{i=1}^n x_i}, & \text{if } x_i > 0,\ i = 1, 2, \dots, n, \\ 0, & \text{otherwise.} \end{cases}$$

Hence

$$L(\lambda) = \begin{cases} \lambda^n e^{-\lambda \sum_{i=1}^n x_i}, & \text{if } \lambda > 0, \\ 0, & \text{otherwise,} \end{cases}$$

is a likelihood function.
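A quick numerical check of this closed form (an illustration, not part of the original lecture): for any hypothetical positive data, $\lambda^n e^{-\lambda \sum x_i}$ should agree with the direct product of the densities. A Python sketch, with made-up data:

```python
import math

def exp_pdf(x, lam):
    """Exponential density f(x | lambda)."""
    return lam * math.exp(-lam * x) if x > 0 else 0.0

def exp_likelihood(lam, xs):
    """Closed form L(lambda) = lambda^n * exp(-lambda * sum(x_i))
    for lambda > 0; zero otherwise."""
    if lam <= 0:
        return 0.0
    return lam ** len(xs) * math.exp(-lam * sum(xs))

xs = [0.5, 1.2, 0.3, 2.0]   # hypothetical observed data
lam = 1.5
direct = math.prod(exp_pdf(x, lam) for x in xs)  # product of densities
closed = exp_likelihood(lam, xs)                 # closed-form likelihood
```

Here the constant $k$ is 1, so the two computations should agree exactly up to floating-point error.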

Example

Suppose $X_1, X_2, \dots, X_n$ is a random sample from $N(\mu, \sigma^2)$ (that is, a normal distribution with mean $\mu$ and variance $\sigma^2$). Then

$$\prod_{i=1}^n \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2\sigma^2}(x_i - \mu)^2} = \left(\frac{1}{2\pi}\right)^{\frac{n}{2}} \frac{1}{\sigma^n}\, e^{-\frac{1}{2\sigma^2}\sum_{i=1}^n (x_i - \mu)^2}.$$

Hence

$$L(\mu, \sigma^2) = \begin{cases} \left(\dfrac{1}{\sigma^2}\right)^{\frac{n}{2}} e^{-\frac{1}{2\sigma^2}\sum_{i=1}^n (x_i - \mu)^2}, & \text{if } \sigma^2 > 0,\ -\infty < \mu < \infty, \\ 0, & \text{otherwise,} \end{cases}$$

is a likelihood function.
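Here the constant $(2\pi)^{-n/2}$ has been dropped, which the definition permits since a likelihood is only specified up to a constant $k$. This can be checked numerically (an illustration with hypothetical data, not part of the original lecture): the ratio of the joint density to $L(\mu, \sigma^2)$ should equal $(2\pi)^{-n/2}$ regardless of the parameter values. A Python sketch:

```python
import math

def normal_pdf(x, mu, var):
    """Normal density with mean mu and variance var."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def normal_likelihood(mu, var, xs):
    """L(mu, sigma^2) = (1/sigma^2)^(n/2) * exp(-sum((x_i - mu)^2) / (2 sigma^2)),
    with the constant (2*pi)^(-n/2) dropped."""
    if var <= 0:
        return 0.0
    s = sum((x - mu) ** 2 for x in xs)
    return var ** (-len(xs) / 2) * math.exp(-s / (2 * var))

xs = [1.0, 2.5, 0.7]   # hypothetical observed data
n = len(xs)
# The ratio k = joint density / L is the same for any (mu, sigma^2):
ratio1 = math.prod(normal_pdf(x, 0.0, 1.0) for x in xs) / normal_likelihood(0.0, 1.0, xs)
ratio2 = math.prod(normal_pdf(x, 1.4, 2.0) for x in xs) / normal_likelihood(1.4, 2.0, xs)
```

Both ratios come out to the same constant $(2\pi)^{-n/2}$, confirming that dropping it leaves a valid likelihood.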

Example

Suppose $X_1, X_2, \dots, X_n$ is a random sample from a uniform distribution on the interval $(0, \theta)$. That is, $X_1, X_2, \dots, X_n$ is a random sample from a distribution with probability function

$$f(x \mid \theta) = \begin{cases} \dfrac{1}{\theta}, & \text{if } 0 < x < \theta, \\ 0, & \text{otherwise,} \end{cases}$$

for some $\theta > 0$. Then

$$\prod_{i=1}^n f(x_i \mid \theta) = \begin{cases} \dfrac{1}{\theta^n}, & \text{if } 0 < x_i < \theta,\ i = 1, 2, \dots, n, \\ 0, & \text{otherwise.} \end{cases}$$

Example (cont'd)

Hence

$$L(\theta) = \begin{cases} \dfrac{1}{\theta^n}, & \text{if } \theta > x_{(n)}, \\ 0, & \text{otherwise,} \end{cases}$$

where $x_{(n)} = \max\{x_1, x_2, \dots, x_n\}$ is the largest observation, is a likelihood function.
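To make the support condition concrete (an illustration with hypothetical data, not part of the original lecture): $L(\theta)$ vanishes for any $\theta$ at or below the largest observation $x_{(n)}$, and decreases as $\theta$ grows beyond it. A Python sketch:

```python
def uniform_likelihood(theta, xs):
    """L(theta) = theta^(-n) if theta exceeds every observation
    (i.e. theta > x_(n)); zero otherwise."""
    if theta > max(xs):
        return theta ** (-len(xs))
    return 0.0

xs = [0.8, 2.1, 1.4]   # hypothetical observed data; here x_(n) = 2.1

l_below = uniform_likelihood(2.0, xs)  # theta < x_(n): likelihood is zero
l_at_3 = uniform_likelihood(3.0, xs)   # theta > x_(n): equals 3^(-3)
l_at_2_2 = uniform_likelihood(2.2, xs) # larger than l_at_3, since 1/theta^n decreases
```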