QUESTION BOOKLET
EE 126 Spring 2006 Final Exam
Wednesday, May 17, 8am - 11am

DO NOT OPEN THIS QUESTION BOOKLET UNTIL YOU ARE TOLD TO DO SO

You have 180 minutes to complete the final. The final consists of five (FIVE) problems, provided in the question booklet (THIS BOOKLET), that are in no particular order of difficulty.

Write your solution to each problem in the space provided in the solution booklet (THE OTHER BOOKLET). Try to be neat! If we can't read it, we can't grade it.

You may give an answer in the form of an arithmetic expression (sums, products, ratios, factorials) that could be evaluated using a calculator. Expressions like (8 choose 3) or sum_{k=0}^{5} (.)^k are also fine.

A correct answer does not guarantee full credit, and a wrong answer does not guarantee loss of credit. You should concisely explain your reasoning and show all relevant work. The grade on each problem is based on our judgment of your understanding as reflected by what you have written.

This is a closed-book exam, except for three sheets of handwritten notes plus a calculator.

Problem 1: (22 points)

Suppose that a pair of random variables X and Y has a joint PDF that is uniform over the shaded region shown in the figure below:

[Figure 1: Joint PDF of random variables X and Y. The figure did not survive transcription; the surviving labels are the curve x = exp(y) and the y-axis values -1 and +1.]

(a) (7 pt) Compute the Bayes least squares estimator (BLSE) of X based on Y. (Note: You should evaluate the required integrals; however, your answer can be left in terms of quantities like 1/e or 2.)

(b) (7 pt) Compute the linear least squares estimator (LLSE) of X based on Y, as well as the associated error variance of this estimator. Is the LLSE the same as the BLSE in this case? Why or why not? (Note: You should evaluate the required integrals; however, your answer can be left in terms of quantities like 1/e or 2.)

(c) (8 pt) Now suppose that in addition to observing some value Y = y, we also know that X >= 1/e. Compute the BLSE and LLSE estimators of X based on both pieces of information. Are the estimators the same or different? Explain why in either case.
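Since the exam's figure is lost, the sketch below uses a *hypothetical* stand-in region (Y uniform on [-1, 1] and, given Y = y, X uniform on [0, exp(y)]); it is not the exam's actual joint PDF. It only illustrates the mechanics behind parts (a) and (b): estimating an LLSE of the form x_hat(y) = E[X] + (Cov(X,Y)/Var(Y))(y - E[Y]) from samples, and comparing it with the conditional mean E[X | Y = y], which is the BLSE.

```python
import math
import random

# Hypothetical stand-in for the lost figure: Y ~ Uniform[-1, 1],
# and given Y = y, X ~ Uniform[0, exp(y)].
rng = random.Random(3)
n = 200_000
ys = [rng.uniform(-1.0, 1.0) for _ in range(n)]
xs = [rng.uniform(0.0, math.exp(y)) for y in ys]

# Sample moments needed for the LLSE coefficients.
ex = sum(xs) / n
ey = sum(ys) / n
cov_xy = sum((x - ex) * (y - ey) for x, y in zip(xs, ys)) / n
var_y = sum((y - ey) ** 2 for y in ys) / n

def llse(y):
    # Linear least squares estimate of X from the observation Y = y.
    return ex + cov_xy / var_y * (y - ey)

# For this stand-in region the BLSE is E[X | Y = y] = exp(y) / 2; at y = 0
# the LLSE and BLSE differ, showing the two estimators need not coincide.
print(round(llse(0.0), 3), round(math.exp(0.0) / 2, 3))
```

The gap between the two printed values is the point: the LLSE is the best *linear* fit to the curved conditional mean, so they agree only when E[X | Y = y] happens to be linear in y.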

Problem 2: (18 points) Bob the Gambler

Bob is addicted to gambling, and does so frequently. The time between any two consecutive visits that Bob makes to the local casino can be modeled as an exponentially distributed random variable X with a mean of 1 day. The times between different pairs of consecutive visits are independent random variables. Every time i = 1, 2, 3, ... that Bob gambles, he wins/loses a random amount of money modeled as a Gaussian random variable Y_i with mean 0 and variance sigma^2. The amounts of money that he wins/loses on different occasions are independent random variables. Suppose that Bob starts off at time t = 0 with 0 dollars.

(a) (3 pt) What is the distribution of N(t), the number of times Bob goes to the casino before some fixed time t >= 0?

(b) (2 pt) What is the PDF of M(t), the amount of money Bob has won/lost by time t?

(c) (5 pt) Using the Chernoff bound, bound the probability that M(t) is greater than a dollars. (Note: You do NOT need to solve the minimization problem in the Chernoff bound.)

(d) (6 pt) For a given t > 0, define a random variable Z_t = M(t)/t, corresponding to the average amount of money that Bob has won/lost at time t. As t goes to infinity, does Z_t converge in probability? If so, prove it. If not, explain intuitively why not.
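The setup above is a compound Poisson process: visits arrive at rate 1 per day, so N(t) is Poisson(t), and M(t) is a sum of N(t) independent N(0, sigma^2) amounts, giving E[M(t)] = 0 and var(M(t)) = t sigma^2; hence var(Z_t) = sigma^2/t vanishes as t grows. A minimal simulation sketch (parameter values chosen arbitrarily for illustration) can check both facts:

```python
import random

def simulate_M(t, sigma, rng):
    # Number of casino visits by time t: sum exponential(mean 1 day)
    # interarrival times, so the count is Poisson(t).
    count, elapsed = 0, rng.expovariate(1.0)
    while elapsed <= t:
        count += 1
        elapsed += rng.expovariate(1.0)
    # Winnings: sum of `count` independent N(0, sigma^2) amounts.
    return sum(rng.gauss(0.0, sigma) for _ in range(count))

rng = random.Random(0)
t, sigma, trials = 50.0, 2.0, 20_000
samples = [simulate_M(t, sigma, rng) for _ in range(trials)]
mean = sum(samples) / trials
var = sum((m - mean) ** 2 for m in samples) / trials

print(round(mean / t, 3))                # average Z_t: close to 0
print(round(var / (t * sigma ** 2), 2))  # empirical var / (t sigma^2): close to 1
```

The second printed ratio being near 1 is the var(M(t)) = t sigma^2 identity behind part (d): Chebyshev with var(Z_t) = sigma^2/t then gives convergence of Z_t to 0 in probability.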

Problem 3: (20 points) True or false

For each of the following statements, either give a counterexample to show that it is false, or provide an argument to justify that it is true. (Note: You will receive no points for just guessing the correct answer; full points will be awarded only when an answer is justified with an example or argument.)

(a) (4 pt) If the Bayes least squares estimator of X given Y is equal to E[X], then X and Y are independent.

(b) (4 pt) If random variables X and Y are independent, then they are conditionally independent given any other random variable Z.

(c) (4 pt) Given a sequence of random variables {X_n} such that E[X_n] = n and E[X_n^2] = (n + 1)^2, the sequence Y_n := X_n/n converges in probability to some real number.

(d) (4 pt) If the linear least squares estimator X_LLSE and the Bayes least squares estimator X_BLSE (of X based on Y) are equal, then the random variables X and Y must be jointly Gaussian.

(e) (4 pt) There do not exist any pairs of random variables X and Y with E[X^4] = 4, E[Y^4] = 1, and E[X^2 Y^2] = 3.
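For statements like (a), a quick numerical sketch can make a candidate counterexample concrete. The construction below (our choice, not from the exam) takes Y standard normal and X = S*Y with S an independent fair sign: then E[X | Y] = Y E[S] = 0 = E[X], so the BLSE of X given Y is the constant E[X], yet |X| = |Y| makes the pair clearly dependent.

```python
import random

rng = random.Random(1)
n = 100_000
ys = [rng.gauss(0.0, 1.0) for _ in range(n)]
xs = [rng.choice([-1.0, 1.0]) * y for y in ys]  # X = S*Y, S an independent fair sign

# E[XY] = E[S] * E[Y^2] = 0: X and Y are uncorrelated, consistent with BLSE = E[X].
cov_xy = sum(x * y for x, y in zip(xs, ys)) / n

# But X^2 = Y^2, so Cov(X^2, Y^2) = Var(Y^2) = 2 > 0: X and Y are NOT independent.
cov_sq = (sum((x * x) * (y * y) for x, y in zip(xs, ys)) / n
          - (sum(x * x for x in xs) / n) * (sum(y * y for y in ys) / n))

print(abs(cov_xy) < 0.05, cov_sq > 0.5)
```

For (e), no simulation is needed: the Cauchy-Schwarz inequality gives E[X^2 Y^2] <= sqrt(E[X^4] E[Y^4]) = sqrt(4 * 1) = 2 < 3, which settles the statement.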

Problem 4: (20 points)

The position of a moving particle can be modeled by a Markov chain on the states {0, 1, 2}, with the state transition diagram shown below.

[State transition diagram on states 0, 1, 2. The diagram did not survive transcription; only the transition probabilities 1/3 and 2/3 are legible.]

(a) (3 pt) Classify the states in this Markov chain. Is the chain periodic or aperiodic?

(b) (3 pt) In the long run, what fraction of time does the particle spend in state 1?

(c) (4 pt) Suppose that the starting position X_0 is chosen according to the steady-state distribution. Conditioned on X_2 = 2, what is the probability that X_0 = 0?

(d) (5 pt) Suppose that X_0 = 0, and let T denote the first time by which the particle has visited all the states. Compute E[T]. [Hint: This problem is made easier by a suitable choice of conditioning.]

(e) (5 pt) Suppose now we have two particles, both independently moving according to the Markov chain. One particle starts in state 2, and the other particle starts in state 1. What is the average time before at least one of the particles is in state 0?
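Because the exam's transition diagram is lost (only the labels 1/3 and 2/3 survive), the sketch below uses a *hypothetical* 3-state chain to show how answers of the kind asked in (b) and (c) can be checked numerically: power-iterate pi <- pi P to approximate the stationary distribution, then apply Bayes' rule for P(X_0 = 0 | X_2 = 2).

```python
# Hypothetical transition matrix (NOT the exam's lost diagram); rows sum to 1.
P = [
    [0.0, 1 / 3, 2 / 3],  # from state 0
    [0.5, 0.5, 0.0],      # from state 1
    [0.5, 0.0, 0.5],      # from state 2
]

# Stationary distribution by power iteration; this chain is irreducible and
# aperiodic (state 1 has a self-loop), so the iteration converges.
pi = [1 / 3, 1 / 3, 1 / 3]
for _ in range(200):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# Two-step transition probabilities P^2, then Bayes' rule for the
# part (c)-style posterior P(X_0 = 0 | X_2 = 2) under the stationary start.
P2 = [[sum(P[i][k] * P[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
posterior = pi[0] * P2[0][2] / sum(pi[i] * P2[i][2] for i in range(3))

# For this hypothetical chain: pi = [1/3, 2/9, 4/9] and posterior = 1/4.
print([round(p, 4) for p in pi], round(posterior, 4))
```

The same two ingredients (stationary balance equations and conditioning through P^2) are what a pencil-and-paper solution to (b) and (c) uses on the exam's actual chain.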

Problem 5: (20 points) Action figure collector

Acme Brand Chocolate Eggs are hollow on the inside, and each egg contains an action figure, chosen uniformly from a set of four different action figures. The price of any given egg (in dollars) is an exponentially distributed random variable with parameter lambda, and the prices of different eggs are independent. For i = 1, ..., 4, let T_i be a random variable corresponding to the number of eggs that you purchase in order to have i different action figures. To be precise, after purchasing T_2 eggs (and not before), you have at least one copy of exactly 2 different action figures; and after purchasing T_4 eggs (and not before), you have at least one copy of all four action figures.

(a) (2 pt) What are the PMF, expected value, and variance of T_1?

(b) (4 pt) What are the PMF and expected value of T_2?

(c) (5 pt) Compute E[T_4] and var(T_4). (Hint: Find a representation of T_4 as a sum of independent RVs.)

(d) (5 pt) Compute the moment generating function of T_4.

(e) (4 pt) You keep buying eggs until you have collected all four action figures. Letting Z be a random variable representing the total amount of money (in dollars) that you spend, compute E[Z] and var(Z).