MAREK GĄGOLEWSKI, KONSTANCJA BOBECKA-WESOŁOWSKA, PRZEMYSŁAW GRZEGORZEWSKI

Computer Statistics with R

5. Point Estimation

Faculty of Mathematics and Information Science, Warsaw University of Technology

Copyright Marek Gągolewski. This work is licensed under a Creative Commons Attribution 3.0 Unported License.
Contents

5.1 Examples
5.2 Exercises
5.3 Hints and Answers

Info: These tutorials are likely to contain bugs and typos. In case you find any, don't hesitate to contact us! Thanks in advance!
5.1. Examples

Remarks. MVUE = minimum-variance unbiased estimator, MME = method-of-moments estimator, MQE = method-of-quantiles estimator, MLE = maximum-likelihood estimator.

Ex. Generate a random sample Y = (Y_1, ..., Y_n), n = 500, from the standard Cauchy distribution C(a, b) with a = 0, b = 1.

1. Compute mean(Y_i) and median(Y_i) of each subsample Y_i = (Y_1, ..., Y_i), i = 1, ..., n, and plot the following sets of points: {(i, mean(Y_i)) : i = 1, ..., n} and {(i, median(Y_i)) : i = 1, ..., n}. Comment whether the sample mean and the sample median may be used as reasonable estimators of the parameter a.

2. Compute the standard deviation sd(Y_i) and the half interquartile range IQR(Y_i)/2. May these sample statistics be considered as well-behaving estimators of the parameter b?

Solution. First let us compare the two estimators of the location parameter a = 0.

n <- 500
X <- rcauchy(n)   # sample of size n from the standard Cauchy distribution
mn <- numeric(n)  # means of all subsamples
md <- numeric(n)  # medians of all subsamples
for (i in 1:n) {
    mn[i] <- mean(X[1:i])
    md[i] <- median(X[1:i])
}
plot(1:n, mn, type="l", col="blue", xlab="", ylab="", lty=1)
lines(1:n, md, col="red", lty=2)
abline(h=0, col="gray", lty=3)
legend("bottom", c("mean.i", "median.i", "a=0"), col=c("blue", "red", "gray"), lty=1:3)
[Figure: running values of mean.i and median.i, with the reference line a = 0]

Next we compare the two estimators of the scale parameter b = 1.

n <- 500
X <- rcauchy(n)
s <- numeric(n-1)  # standard deviations of all subsamples
r <- numeric(n-1)  # half interquartile ranges of all subsamples
for (i in 2:n) {
    s[i-1] <- sd(X[1:i])
    r[i-1] <- IQR(X[1:i])*0.5
}
plot(2:n, s, type="l", col="blue", xlab="", ylab="", ylim=c(0, max(s)))
lines(2:n, r, col="red", lty=2)
abline(h=1, col="gray", lty=3)
legend("topleft", c("s.i", "r.i", "b=1"), col=c("blue", "red", "gray"), lty=1:3)

[Figure: running values of s.i and r.i, with the reference line b = 1]

Draw conclusions.
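The behaviour we observe has a simple explanation, which we may sketch numerically (this check is an illustrative addition, not part of the original example): the Cauchy distribution has no finite mean or variance, so mean(Y_i) and sd(Y_i) do not settle down, whereas the quartiles of C(a, b) are a - b and a + b, so median(Y_i) and IQR(Y_i)/2 behave well.

```r
# Quartiles of the standard Cauchy distribution C(0, 1):
qcauchy(c(0.25, 0.5, 0.75))   # -1  0  1
# Hence IQR/2 = (1 - (-1))/2 = 1 = b, which is why r.i stabilizes near b = 1,
# while s.i keeps jumping: the theoretical variance of C(a, b) is infinite.
```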
Ex. We are given an i.i.d. random sample of size n = 20 from the uniform distribution U([0, θ]), where θ = 1. Determine the method-of-moments (MME) and maximum-likelihood (MLE) estimators of θ. Compare their bias and mean square errors.

Solution. Let X = (X_1, ..., X_n) be a sample from the uniform distribution U([0, θ]). It may be shown that the estimators of θ are of the form:

The MME estimator: θ̂_1 = 2 X̄,  (5.1)

The MLE estimator: θ̂_2 = X_{n:n},  (5.2)

The minimum-variance unbiased (MVUE) estimator: θ̂_3 = ((n+1)/n) X_{n:n} = ((n+1)/n) θ̂_2.  (5.3)

Moreover, the bias is

b(θ̂) = E(θ̂ - θ) = E(θ̂) - θ,  (5.4)

and the mean square error (MSE) is

MSE(θ̂) = E(θ̂ - θ)² = Var(θ̂) + [b(θ̂)]².  (5.5)

For the above estimators:

i   b(θ̂_i)        MSE(θ̂_i)
1   0              θ²/(3n)
2   -θ/(n+1)       2θ²/((n+1)(n+2))
3   0              θ²/(n(n+2))

Let θ = 1. We will illustrate how to approximate these values using MC simulation. Let us generate m random samples of size n = 20 from the uniform distribution U([0, 1]) and calculate the values of the three estimators of θ. The results will be stored in a 3 × m matrix (one column per experiment).

m <- 10000  # number of MC repetitions
n <- 20
theta <- 1
results <- replicate(m, {
    X <- runif(n, 0, theta)
    c(2 * mean(X), max(X), max(X) * (n + 1)/n)  # result of a single experiment
})
results[, 1:5]  # show the results of the first 5 experiments

Let us compute the empirical bias and mean square errors of our estimators.

The empirical bias:
mean(results[1, ]) - theta
mean(results[2, ]) - theta
mean(results[3, ]) - theta

The empirical MSEs:

var(results[1, ]) + (mean(results[1, ]) - theta)^2
var(results[2, ]) + (mean(results[2, ]) - theta)^2
var(results[3, ]) + (mean(results[3, ]) - theta)^2

Compare them to the theoretical (exact) bias and mean square errors of our estimators:

The bias b(θ̂_2):

-theta/(n + 1)
## [1] -0.04761905

MSE(θ̂_1):

theta^2/(3 * n)
## [1] 0.01666667

MSE(θ̂_2):

2 * theta^2/((n + 1) * (n + 2))
## [1] 0.004329004

MSE(θ̂_3):

theta^2/(n * (n + 2))
## [1] 0.002272727

Monte Carlo simulation methods are often used to examine the properties of estimators in cases where analytic solutions are unavailable. By the law of large numbers, we expect the approximation error to decrease as m increases.
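To see the law of large numbers at work, we may re-run the approximation for several values of m (an illustrative addition to the text; the helper function below is ours, not part of the original):

```r
# Approximate MSE(theta2) for increasing numbers of MC repetitions m.
n <- 20
theta <- 1
mse2.hat <- function(m) {
    est <- replicate(m, max(runif(n, 0, theta)))  # theta2 = X_{n:n}
    var(est) + (mean(est) - theta)^2
}
# The exact value is 2*theta^2/((n+1)*(n+2)) = 0.004329004;
# the approximations below should get closer to it as m grows.
sapply(c(100, 1000, 100000), mse2.hat)
```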
5.2. Exercises

Preliminaries

Ex. Generate a random sample Y = (Y_1, ..., Y_n), n = 500, from the standard normal distribution N(µ, σ) with µ = 0, σ = 1.

1. Compute mean(Y_i) and median(Y_i) of each subsample Y_i = (Y_1, ..., Y_i), i = 1, ..., n, and plot the following sets of points: {(i, mean(Y_i)) : i = 1, ..., n} and {(i, median(Y_i)) : i = 1, ..., n}. Comment whether we can use the sample mean and the sample median as reasonable estimators of µ.

2. Compute the scaled interquartile range IQR(Y_i)/1.35 and the standard deviation sd(Y_i). May these sample statistics be considered as well-behaving estimators of σ?

Ex. Given a statistical space (X, σ(X), {P_θ : θ ∈ Θ}), show that if θ̃ is an unbiased estimator of θ, then MSE_θ(θ̃) = Var_θ(θ̃).

Ex. Given a sequence (X_1, ..., X_n) of i.i.d. random variables from the exponential distribution Exp(λ), show that the statistic T(X_1, ..., X_n) = Σ_{i=1}^n X_i²/(2n) is an unbiased estimator of the distribution variance.

Ex. Given a sequence (X_1, ..., X_n) of i.i.d. random variables from the uniform distribution U[a, a+1], check whether the statistic T(X_1, ..., X_n) = Σ_{i=1}^n X_i/n - 0.5 is an unbiased estimator of the parameter a.

Ex. Let X = (X_1, X_2, ..., X_n) denote an i.i.d. random sample with finite (known) expectation µ and finite (unknown) variance σ². Let

s̃² = (1/n) Σ_{i=1}^n (X_i - µ)².  (5.6)

Show that s̃² is an unbiased estimator of the variance σ².

Ex. Let X = (X_1, X_2, ..., X_n) denote a sample of i.i.d. random variables with finite variance σ². Show that:

1. s_n² = (1/n) Σ_{i=1}^n (X_i - X̄)² is a biased estimator of σ², and
2. s² = (1/(n-1)) Σ_{i=1}^n (X_i - X̄)² is an unbiased estimator of σ².

Ex. Let T_1 = T_1(X_1, X_2, ..., X_n) and T_2 = T_2(X_1, X_2, ..., X_n) denote two independent unbiased estimators of a parameter θ.

1. Show that for each γ ∈ [0, 1], the statistic T = γ T_1 + (1 - γ) T_2 is also an unbiased estimator of θ.
2. Find the γ that leads to the smallest mean squared error of T.
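A numerical sanity check for part 1 of the last exercise (an illustrative addition; N(0, 1) data and γ = 0.3 are arbitrary choices, and unbiasedness of the combination does not even require independence): both the sample mean and the sample median are unbiased for µ, and so is their convex combination.

```r
gamma <- 0.3
est <- replicate(10000, {
    X <- rnorm(25)                              # sample with mu = 0
    gamma * mean(X) + (1 - gamma) * median(X)   # T = gamma*T1 + (1-gamma)*T2
})
mean(est)  # close to mu = 0
```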
Ex. Let X = (X_1, X_2, ..., X_n) denote a sample of independent Bern(θ)-distributed random variables, θ ∈ (0, 1). Show that:

1. θ̂(X) = Σ_i X_i/n is an unbiased estimator of θ,
2. T(X) = (n/(n-1)) (θ̂(X) - θ̂(X)²) is an unbiased estimator of g(θ) = θ(1 - θ).
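A quick simulation makes the second claim plausible (an illustrative addition; θ = 0.3 and n = 10 are arbitrary choices):

```r
theta <- 0.3
n <- 10
est <- replicate(100000, {
    X <- rbinom(n, 1, theta)   # n Bernoulli trials
    p <- mean(X)               # theta-hat
    n/(n - 1) * (p - p^2)      # T(X)
})
mean(est)  # close to g(theta) = theta*(1 - theta) = 0.21
```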
Ex. Let X = (X_1, X_2, ..., X_n) denote an i.i.d. sample from the Poisson distribution with a parameter λ > 0. Show that X̄ = (1/n) Σ_{i=1}^n X_i is a consistent MVUE estimator of λ.

Ex. Let X = (X_1, X_2, ..., X_n) denote an i.i.d. random sample from the normal distribution with finite (known) expectation µ and finite (unknown) variance σ². Show that the estimator s̃² given by (5.6) is an MVUE estimator of σ². Show that s² is an unbiased, but only asymptotically efficient, estimator of σ².

Ex. Let X = (X_1, X_2, ..., X_n) denote an i.i.d. random sample from the exponential distribution with a parameter λ > 0. Let g(λ) = 1/λ. Show that X̄ is a consistent UMVU estimator of g(λ).

Ex. Let X = (X_1, X_2, ..., X_n) denote an i.i.d. random sample from the exponential distribution with a parameter λ > 0. Let g(λ) = 1/λ. Show that although T(X) = n X_{1:n} is an unbiased estimator of g(λ), it is not consistent.

Ex. Let X = (X_1, X_2, ..., X_n) denote an i.i.d. random sample from the normal distribution N(µ, σ). Compare the efficiency of two estimators of µ: the sample average and the sample median.

MLE, MME, MQE

Ex. A terrorist tapped the line of a very important person (V.I.P.). He noted down the times between successive calls [in minutes] of his victim: 1.74, 21.26, 11.19, 6.64, 0.07, 8.67, 43.25, 55.03, 7.64, 12.54, 39.19, 3.42, 4.89, 2.32, ... The attacker suspects that the V.I.P. is going to make a very important call (V.I.C.). Unfortunately, he needs to go to the toilet!!! He calls you because he is truly in a state of emergency and you are famous for being a great statistician. You assume the data follow an exponential distribution with an unknown parameter λ > 0.

1. Find estimators of λ using the MME, MLE and MQE.
2. Calculate the probability that the V.I.P. will make a V.I.C. in the next 5 minutes.

Ex. Find the MME and MLE of the parameter λ > 0 of the Poisson distribution.

Ex. Find the MME and MLE of the probability of success θ in n independent Bernoulli trials.
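A sketch of the V.I.P. exercise in R (an addition to the original text; the last recorded time is cut off in our copy, so only the 14 listed values are used). For Exp(λ), E X = 1/λ yields the MME, the likelihood is maximized at the same value, and Med X = ln 2/λ yields an MQE:

```r
x <- c(1.74, 21.26, 11.19, 6.64, 0.07, 8.67, 43.25, 55.03,
       7.64, 12.54, 39.19, 3.42, 4.89, 2.32)
lambda.mme <- 1/mean(x)          # from E X = 1/lambda
lambda.mle <- 1/mean(x)          # the MLE coincides with the MME here
lambda.mqe <- log(2)/median(x)   # from Med X = log(2)/lambda
pexp(5, rate = lambda.mle)       # P(waiting time <= 5 minutes)
```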
Ex. Generate a random sample of size 25 from the uniform distribution U[0, θ], where θ = 1. Find the MME, MLE and an unbiased estimator of θ. Compare their bias, MSE and variance. Evaluate the estimators for the sample.

Ex. Find the MME, MLE and MQE of the mean and variance of the normal distribution.

Ex. Find the MME and MLE of the shape a and scale b parameters of the gamma distribution Γ(a, b).

Ex. Let X = (X_1, X_2, ..., X_n) denote an i.i.d. random sample from the Rayleigh distribution with unknown parameter σ > 0. The probability density function is f(x) = (x/σ²) exp(-x²/(2σ²)) for x ≥ 0, and E X_i = σ √(0.5π) for i = 1, ..., n. Find the MME and MLE of σ and compute their values for the following data: 4.93, 2.79, 7.19, 10.50, 9.19, 13.13, 0.47, 5.89, 9.90, 11.70, 6.83, 9.36, 2.04, ...

Ex. Let (X_1, ..., X_n) denote the sequence of observed proper operation times of n devices, each working independently. We assume that the time of proper work is exponentially distributed with an unknown parameter θ. The devices are not observed continuously; instead, the measurements are performed at discrete moments 1, 2, ..., k. Hence, we observe only Y_1, ..., Y_n, where

Y_j = i      if i-1 < X_j ≤ i for some i = 1, ..., k,
Y_j = k + 1  if X_j > k,
where j = 1, ..., n. Let N_i = #{j : Y_j = i}, i = 1, ..., k+1. Find the maximum likelihood estimator of the parameter θ. Perform the calculations for n = 10, k = 2 and N_1 = 5, N_2 = 2, N_3 = 3.

5.3. Hints and Answers

Hint to Ex.: n s̃²/σ² ~ χ²_n = Γ(n/2, 1/2).

Hint to Ex.: (n-1) s²/σ² ~ χ²_{n-1}.

Hint to Ex.: X ~ N(µ, σ) ⇒ Y = (X - µ)/σ ~ N(0, 1). Then E Y^{2k+1} = 0 and E Y^{2k} = (2k-1)!!.

Hint to Ex.: If X ~ N(µ, σ), then Med X is approximately N(µ, √(πσ²/(2n)))-distributed.
More informationHardy Weinberg Model- 6 Genotypes
Hardy Weinberg Model- 6 Genotypes Silvelyn Zwanzig Hardy -Weinberg with six genotypes. In a large population of plants (Mimulus guttatus there are possible alleles S, I, F at one locus resulting in six
More informationContinuous random variables
Continuous random variables probability density function (f(x)) the probability distribution function of a continuous random variable (analogous to the probability mass function for a discrete random variable),
More informationChapter 7: SAMPLING DISTRIBUTIONS & POINT ESTIMATION OF PARAMETERS
Chapter 7: SAMPLING DISTRIBUTIONS & POINT ESTIMATION OF PARAMETERS Part 1: Introduction Sampling Distributions & the Central Limit Theorem Point Estimation & Estimators Sections 7-1 to 7-2 Sample data
More informationNormal Distribution. Definition A continuous rv X is said to have a normal distribution with. the pdf of X is
Normal Distribution Normal Distribution Definition A continuous rv X is said to have a normal distribution with parameter µ and σ (µ and σ 2 ), where < µ < and σ > 0, if the pdf of X is f (x; µ, σ) = 1
More informationModule 2: Monte Carlo Methods
Module 2: Monte Carlo Methods Prof. Mike Giles mike.giles@maths.ox.ac.uk Oxford University Mathematical Institute MC Lecture 2 p. 1 Greeks In Monte Carlo applications we don t just want to know the expected
More information1/2 2. Mean & variance. Mean & standard deviation
Question # 1 of 10 ( Start time: 09:46:03 PM ) Total Marks: 1 The probability distribution of X is given below. x: 0 1 2 3 4 p(x): 0.73? 0.06 0.04 0.01 What is the value of missing probability? 0.54 0.16
More informationChapter 7. Sampling Distributions and the Central Limit Theorem
Chapter 7. Sampling Distributions and the Central Limit Theorem 1 Introduction 2 Sampling Distributions related to the normal distribution 3 The central limit theorem 4 The normal approximation to binomial
More informationBayesian Linear Model: Gory Details
Bayesian Linear Model: Gory Details Pubh7440 Notes By Sudipto Banerjee Let y y i ] n i be an n vector of independent observations on a dependent variable (or response) from n experimental units. Associated
More information12. Conditional heteroscedastic models (ARCH) MA6622, Ernesto Mordecki, CityU, HK, 2006.
12. Conditional heteroscedastic models (ARCH) MA6622, Ernesto Mordecki, CityU, HK, 2006. References for this Lecture: Robert F. Engle. Autoregressive Conditional Heteroscedasticity with Estimates of Variance
More information6. Genetics examples: Hardy-Weinberg Equilibrium
PBCB 206 (Fall 2006) Instructor: Fei Zou email: fzou@bios.unc.edu office: 3107D McGavran-Greenberg Hall Lecture 4 Topics for Lecture 4 1. Parametric models and estimating parameters from data 2. Method
More informationIEOR 3106: Introduction to OR: Stochastic Models. Fall 2013, Professor Whitt. Class Lecture Notes: Tuesday, September 10.
IEOR 3106: Introduction to OR: Stochastic Models Fall 2013, Professor Whitt Class Lecture Notes: Tuesday, September 10. The Central Limit Theorem and Stock Prices 1. The Central Limit Theorem (CLT See
More informationStrategies for Improving the Efficiency of Monte-Carlo Methods
Strategies for Improving the Efficiency of Monte-Carlo Methods Paul J. Atzberger General comments or corrections should be sent to: paulatz@cims.nyu.edu Introduction The Monte-Carlo method is a useful
More informationReview for Final Exam Spring 2014 Jeremy Orloff and Jonathan Bloom
Review for Final Exam 18.05 Spring 2014 Jeremy Orloff and Jonathan Bloom THANK YOU!!!! JON!! PETER!! RUTHI!! ERIKA!! ALL OF YOU!!!! Probability Counting Sets Inclusion-exclusion principle Rule of product
More information9. Statistics I. Mean and variance Expected value Models of probability events
9. Statistics I Mean and variance Expected value Models of probability events 18 Statistic(s) Consider a set of distributed data (values) E.g., age of first marriage and average salary of Japanese If we
More informationProbability Theory and Simulation Methods. April 9th, Lecture 20: Special distributions
April 9th, 2018 Lecture 20: Special distributions Week 1 Chapter 1: Axioms of probability Week 2 Chapter 3: Conditional probability and independence Week 4 Chapters 4, 6: Random variables Week 9 Chapter
More informationSTATISTICAL LABORATORY, May 18th, 2010 CENTRAL LIMIT THEOREM ILLUSTRATION
STATISTICAL LABORATORY, May 18th, 2010 CENTRAL LIMIT THEOREM ILLUSTRATION Mario Romanazzi 1 BINOMIAL DISTRIBUTION The binomial distribution Bi(n, p), being the sum of n independent Bernoulli distributions,
More information