Chapter 1: Stochastic Processes

What are Stochastic Processes, and how do they fit in?

STATS 210: Foundations of Statistics and Probability. Tools for understanding randomness (random variables, distributions).
STATS 310: Statistics. Randomness in pattern.
STATS 325: Probability. Randomness in process.

Stats 210 laid the foundations of both Statistics and Probability: the tools for understanding randomness. Stats 310 develops the theory for understanding randomness in pattern: tools for estimating parameters (maximum likelihood), testing hypotheses, modelling patterns in data (regression models). Stats 325 develops the theory for understanding randomness in process. A process is a sequence of events where each step follows from the last after a random choice.

What sort of problems will we cover in Stats 325? Here are some examples of the sorts of problems that we study in this course.

Gambler's Ruin. You start with $30 and toss a fair coin repeatedly. Every time you throw a Head, you win $5. Every time you throw a Tail, you lose $5. You will stop when you reach $100 or when you lose everything. What is the probability that you lose everything? Answer: 70%.
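As a quick sanity check on the 70% figure, the game is easy to simulate. The sketch below is not part of the original notes: the function name and parameter values are just illustrative, and it estimates the ruin probability by Monte Carlo; for a fair coin the exact answer is 1 - 30/100 = 0.7.

    import random

    def ruin_probability(start=30, target=100, stake=5, p_win=0.5, n_sims=100_000):
        """Estimate the probability of losing everything by simulation."""
        ruined = 0
        for _ in range(n_sims):
            money = start
            while 0 < money < target:
                money += stake if random.random() < p_win else -stake
            if money == 0:
                ruined += 1
        return ruined / n_sims

    print(ruin_probability())   # roughly 0.70, matching the exact answer 1 - 30/100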

Winning at tennis. What is your probability of winning a game of tennis, starting from the even score Deuce (40-40), if your probability of winning each point is p = 0.3 and your opponent's is q = 0.7? Answer: 15%.

[Figure: state diagram with states DEUCE (D), VENUS AHEAD (A), VENUS BEHIND (B), VENUS WINS (W) and VENUS LOSES (L), with transitions labelled p and q.]

Winning a lottery. A million people have bought tickets for the weekly lottery draw. Each person has a probability of one-in-a-million of selecting the winning numbers. If more than one person selects the winning numbers, the winner will be chosen at random from all those with matching numbers. You watch the lottery draw on TV and your numbers match the winning numbers! Only a one-in-a-million chance, and there were only a million players, so surely you will win the prize? Not quite... What is the probability you will win? Answer: only 63%.

Drunkard's walk. A very drunk person staggers left and right as he walks along. With each step he takes, he staggers one pace to the left with probability 0.5, and one pace to the right with probability 0.5. What is the expected number of paces he must take before he ends up one pace to the left of his starting point? Answer: the expectation is infinite!
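For readers who want to see where such answers come from before the theory is developed, here is a sketch of one way to get the first two numbers (this derivation is not spelled out in the notes). For tennis, writing D, A and B for the probability of winning from Deuce, Ahead and Behind gives D = pA + qB, A = p + qD, B = pD, so D = p^2/(p^2 + q^2). For the lottery, the number of other matching tickets is approximately Poisson(1), and averaging the winning chance 1/(K+1) over K ~ Poisson(1) gives 1 - e^{-1}.

    import math

    # Tennis from deuce: D = p*A + q*B, A = p + q*D, B = p*D
    # gives D = p**2 / (p**2 + q**2).
    p, q = 0.3, 0.7
    print(p**2 / (p**2 + q**2))   # about 0.155, i.e. roughly 15%

    # Lottery: with K other matching tickets (K approximately Poisson(1)),
    # you win with probability 1/(K+1), which averages to 1 - e**-1.
    print(1 - math.exp(-1))       # about 0.632, i.e. roughly 63%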

Pyramid selling schemes. Have you received a chain letter like this one? Just send $10 to the person whose name comes at the top of the list, and add your own name to the bottom of the list. Send the letter to as many people as you can. Within a few months, the letter promises, you will have received $77,000 in $10 notes! Will you? Answer: it depends upon the response rate. However, with a fairly realistic assumption about the response rate, we can calculate an expected return of $76 with a 64% chance of getting nothing! Note: pyramid selling schemes like this are prohibited under the Fair Trading Act, and it is illegal to participate in them.

Spread of SARS. The figure to the right shows the spread of the disease SARS (Severe Acute Respiratory Syndrome) through Singapore in 2003. With this pattern of infections, what is the probability that the disease eventually dies out of its own accord? Answer: 0.997.
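The SARS question is a branching-process question: the epidemic dies out with probability equal to the smallest non-negative root of G(s) = s, where G is the probability generating function of the offspring (infection) distribution. The sketch below is illustrative only: it uses a hypothetical Poisson offspring distribution, not the Singapore SARS data behind the 0.997 quoted above.

    import math

    def extinction_probability(pgf, tol=1e-12, max_iter=10_000):
        """Smallest non-negative root of s = G(s), by iterating s -> G(s) from 0."""
        s = 0.0
        for _ in range(max_iter):
            s_new = pgf(s)
            if abs(s_new - s) < tol:
                return s_new
            s = s_new
        return s

    # Hypothetical offspring distribution: each case infects a Poisson(mu)
    # number of new cases, so G(s) = exp(mu * (s - 1)).
    mu = 1.5
    print(extinction_probability(lambda s: math.exp(mu * (s - 1))))   # about 0.417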

Markov's Marvellous Mystery Tours. Mr Markov's Marvellous Mystery Tours promises an All-Stochastic Tourist Experience for the town of Rotorua. Mr Markov has eight tourist attractions, to which he will take his clients completely at random with the probabilities shown below. He promises at least three exciting attractions per tour, ending at either the Lady Knox Geyser or the Tarawera Volcano. (Unfortunately he makes no mention of how the hapless tourist might get home from these places.) What is the expected number of activities for a tour starting from the museum? Answer: 4.2.

[Figure: transition diagram over the eight attractions 1. Museum, 2. Cruise, 3. Buried Village, 4. Flying Fox, 5. Hangi, 6. Geyser, 7. Helicopter, 8. Volcano, with the transition probabilities for each move.]

Structure of the course

Probability. Probability and random variables, with special focus on conditional probability. Finding hitting probabilities for stochastic processes.

Expectation. Expectation and variance. Introduction to conditional expectation, and its application in finding expected reaching times in stochastic processes.

Generating functions. Introduction to probability generating functions, and their applications to stochastic processes, especially the Random Walk.

Branching process. This process is a simple model for reproduction. Examples are the pyramid selling scheme and the spread of SARS above.
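The expected-number-of-activities question above is an expected hitting (absorption) time calculation, the kind of problem treated under Expectation in this outline. As a preview, with Q the matrix of transition probabilities among the non-final attractions, the expected numbers of steps t satisfy t = 1 + Q t. The sketch below solves this for a small made-up chain; it is not Mr Markov's actual tour, whose transition probabilities are given in the figure.

    import numpy as np

    # Toy absorbing chain: three transient states, with the leftover
    # probability in rows 2 and 3 leading to absorbing states (illustrative only).
    Q = np.array([[0.0, 0.5, 0.5],
                  [0.5, 0.0, 0.0],
                  [0.0, 0.5, 0.0]])

    # Expected steps to absorption: t = 1 + Q t  =>  (I - Q) t = 1.
    t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
    print(t)   # [3.6, 2.8, 2.4]: expected steps from each transient state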

Markov chains. Almost all the examples we look at throughout the course can be formulated as Markov chains. By developing a single unifying theory, we can easily tackle complex problems with many states and transitions, like Markov's Marvellous Mystery Tours above.

The rest of this chapter covers: quick revision of sample spaces and random variables; formal definition of stochastic processes.

1.1 Revision: Sample spaces and random variables

Definition: A random experiment is a physical situation whose outcome cannot be predicted until it is observed.

Definition: A sample space, Ω, is a set of possible outcomes of a random experiment.

Example: Random experiment: toss a coin once. Sample space: Ω = {head, tail}.

Definition: A random variable, X, is defined as a function from the sample space to the real numbers: X : Ω → R. That is, a random variable assigns a real number to every possible outcome of a random experiment.

Example: Random experiment: toss a coin once. Sample space: Ω = {head, tail}. An example of a random variable: X : Ω → R maps head → 1, tail → 0.

Essential point: A random variable is a way of producing random real numbers.
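A tiny illustration (not part of the notes) of the "random variable = function" idea: the randomness lives in the experiment that picks an outcome from Ω, while X itself is an ordinary deterministic function.

    import random

    sample_space = ["head", "tail"]   # Omega for one coin toss

    def X(outcome):
        """A random variable X: Omega -> R, mapping head -> 1, tail -> 0."""
        return 1 if outcome == "head" else 0

    outcome = random.choice(sample_space)   # perform the random experiment
    print(outcome, X(outcome))              # the outcome and the real number X assigns to it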

1.2 Stochastic Processes

Definition: A stochastic process is a family of random variables, {X(t) : t ∈ T}, where t usually denotes time. That is, at every time t in the set T, a random number X(t) is observed.

Definition: {X(t) : t ∈ T} is a discrete-time process if the set T is finite or countable. In practice, this generally means T = {0, 1, 2, 3, ...}. Thus a discrete-time process is {X(0), X(1), X(2), X(3), ...}: a new random number recorded at every time 0, 1, 2, 3, ...

Definition: {X(t) : t ∈ T} is a continuous-time process if T is not finite or countable. In practice, this generally means T = [0, ∞), or T = [0, K] for some K. Thus a continuous-time process {X(t) : t ∈ T} has a random number X(t) recorded at every instant in time. (Note that X(t) need not change at every instant in time, but it is allowed to change at any time, not just at t = 0, 1, 2, ... like a discrete-time process.)

Definition: The state space, S, is the set of real values that X(t) can take. Every X(t) takes a value in R, but S will often be a smaller set: S ⊆ R. For example, if X(t) is the outcome of a coin tossed at time t, then the state space is S = {0, 1}.

Definition: The state space S is discrete if it is finite or countable. Otherwise it is continuous. The state space S is the set of states that the stochastic process can be in.
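A short illustrative sketch (again, not from the notes) of two discrete-time processes: the coin process with state space S = {0, 1}, and a simple random walk, the drunkard's walk above, whose state space is the set of all integers.

    import random
    from itertools import accumulate

    T = range(10)   # observation times t = 0, 1, ..., 9

    # (a) X(t) = outcome of the coin tossed at time t; state space S = {0, 1}.
    coin = [random.randint(0, 1) for t in T]

    # (b) Simple random walk: steps of +1 or -1 with probability 1/2 each.
    steps = [random.choice([-1, 1]) for t in T]
    walk = [0] + list(accumulate(steps))

    print(coin)   # one sample path of the coin process
    print(walk)   # one sample path of the random walk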

For Reference: Discrete Random Variables

1. Binomial distribution

Notation: X ~ Binomial(n, p).
Description: number of successes in n independent trials, each with probability p of success.
Probability function: f_X(x) = P(X = x) = \binom{n}{x} p^x (1-p)^{n-x} for x = 0, 1, ..., n.
Mean: E(X) = np.
Variance: Var(X) = np(1-p) = npq, where q = 1-p.
Sum: if X ~ Binomial(n, p), Y ~ Binomial(m, p), and X and Y are independent, then X + Y ~ Binomial(n + m, p).

2. Poisson distribution

Notation: X ~ Poisson(λ).
Description: arises out of the Poisson process as the number of events in a fixed time or space, when events occur at a constant average rate. Also used in many other situations.
Probability function: f_X(x) = P(X = x) = \frac{λ^x}{x!} e^{-λ} for x = 0, 1, 2, ...
Mean: E(X) = λ.
Variance: Var(X) = λ.
Sum: if X ~ Poisson(λ), Y ~ Poisson(µ), and X and Y are independent, then X + Y ~ Poisson(λ + µ).
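As a quick check of the Binomial and Poisson formulas above (the values n = 10, p = 0.3 and λ = 2 are only illustrative), the probability functions can be evaluated directly and their means and variances compared with np, np(1-p) and λ:

    from math import comb, exp, factorial

    # Binomial(n, p): pmf, mean and variance computed from first principles.
    n, p = 10, 0.3
    pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]
    mean = sum(x * f for x, f in enumerate(pmf))
    var = sum((x - mean)**2 * f for x, f in enumerate(pmf))
    print(mean, n * p)            # both 3.0 (up to rounding)
    print(var, n * p * (1 - p))   # both 2.1 (up to rounding)

    # Poisson(lam): pmf truncated at a large x; the mean is lam.
    lam = 2.0
    pois = [lam**x / factorial(x) * exp(-lam) for x in range(50)]
    print(sum(x * f for x, f in enumerate(pois)))   # approximately 2.0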

3. Geometric distribution

Notation: X ~ Geometric(p).
Description: number of failures before the first success in a sequence of independent trials, each with P(success) = p.
Probability function: f_X(x) = P(X = x) = p(1-p)^x for x = 0, 1, 2, ...
Mean: E(X) = \frac{1-p}{p} = \frac{q}{p}, where q = 1-p.
Variance: Var(X) = \frac{1-p}{p^2} = \frac{q}{p^2}, where q = 1-p.
Sum: if X_1, ..., X_k are independent, and each X_i ~ Geometric(p), then X_1 + ... + X_k ~ Negative Binomial(k, p).

4. Negative Binomial distribution

Notation: X ~ NegBin(k, p).
Description: number of failures before the k-th success in a sequence of independent trials, each with P(success) = p.
Probability function: f_X(x) = P(X = x) = \binom{k+x-1}{x} p^k (1-p)^x for x = 0, 1, 2, ...
Mean: E(X) = \frac{k(1-p)}{p} = \frac{kq}{p}, where q = 1-p.
Variance: Var(X) = \frac{k(1-p)}{p^2} = \frac{kq}{p^2}, where q = 1-p.
Sum: if X ~ NegBin(k, p), Y ~ NegBin(m, p), and X and Y are independent, then X + Y ~ NegBin(k + m, p).
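Note that the Geometric distribution here counts failures before the first success (some texts count trials instead). A small simulation sketch, with illustrative values p = 0.3 and k = 4, checks the mean q/p and the stated fact that a sum of k independent Geometric(p) variables has the Negative Binomial mean kq/p:

    import random

    def geometric(p):
        """Number of failures before the first success, P(success) = p per trial."""
        failures = 0
        while random.random() >= p:
            failures += 1
        return failures

    p, k, n_sims = 0.3, 4, 100_000
    samples = [geometric(p) for _ in range(n_sims)]
    print(sum(samples) / n_sims, (1 - p) / p)    # both roughly 2.33

    sums = [sum(geometric(p) for _ in range(k)) for _ in range(n_sims)]
    print(sum(sums) / n_sims, k * (1 - p) / p)   # both roughly 9.33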

5. Hypergeometric distribution

Notation: X ~ Hypergeometric(N, M, n).
Description: sampling without replacement from a finite population. Given N objects, of which M are special, draw n objects without replacement. X is the number of the n objects that are special.
Probability function: f_X(x) = P(X = x) = \frac{\binom{M}{x}\binom{N-M}{n-x}}{\binom{N}{n}} for x = max(0, n+M-N) to x = min(n, M).
Mean: E(X) = np, where p = \frac{M}{N}.
Variance: Var(X) = np(1-p)\frac{N-n}{N-1}, where p = \frac{M}{N}.

6. Multinomial distribution

Notation: X = (X_1, ..., X_k) ~ Multinomial(n; p_1, p_2, ..., p_k).
Description: there are n independent trials, each with k possible outcomes. Let p_i = P(outcome i) for i = 1, ..., k. Then X = (X_1, ..., X_k), where X_i is the number of trials with outcome i, for i = 1, ..., k.
Probability function: f_X(x) = P(X_1 = x_1, ..., X_k = x_k) = \frac{n!}{x_1! \cdots x_k!} p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k} for x_i ∈ {0, ..., n} for all i with \sum_{i=1}^k x_i = n, and where p_i ≥ 0 for all i and \sum_{i=1}^k p_i = 1.
Marginal distributions: X_i ~ Binomial(n, p_i) for i = 1, ..., k.
Mean: E(X_i) = n p_i for i = 1, ..., k.
Variance: Var(X_i) = n p_i (1 - p_i), for i = 1, ..., k.
Covariance: cov(X_i, X_j) = -n p_i p_j, for all i ≠ j.
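A quick numerical check of the Hypergeometric formulas (the values N = 20, M = 7, n = 5 are illustrative, not from the notes): the probabilities should sum to 1 and the mean should equal nM/N.

    from math import comb

    N, M, n = 20, 7, 5
    lo, hi = max(0, n + M - N), min(n, M)
    pmf = {x: comb(M, x) * comb(N - M, n - x) / comb(N, n)
           for x in range(lo, hi + 1)}
    print(sum(pmf.values()))                   # 1.0
    print(sum(x * f for x, f in pmf.items()))  # n * M / N = 1.75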

Continuous Random Variables

1. Uniform distribution

Notation: X ~ Uniform(a, b).
Probability density function (pdf): f_X(x) = \frac{1}{b-a} for a < x < b.
Cumulative distribution function: F_X(x) = P(X ≤ x) = \frac{x-a}{b-a} for a < x < b; F_X(x) = 0 for x ≤ a, and F_X(x) = 1 for x ≥ b.
Mean: E(X) = \frac{a+b}{2}.
Variance: Var(X) = \frac{(b-a)^2}{12}.

2. Exponential distribution

Notation: X ~ Exponential(λ).
Probability density function (pdf): f_X(x) = λ e^{-λx} for 0 < x < ∞.
Cumulative distribution function: F_X(x) = P(X ≤ x) = 1 - e^{-λx} for 0 < x < ∞; F_X(x) = 0 for x ≤ 0.
Mean: E(X) = \frac{1}{λ}.
Variance: Var(X) = \frac{1}{λ^2}.
Sum: if X_1, ..., X_k are independent, and each X_i ~ Exponential(λ), then X_1 + ... + X_k ~ Gamma(k, λ).
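A simulation sketch (with illustrative values λ = 2 and k = 3, not taken from the notes): Exponential(λ) variables can be generated by inverting the cdf, and summing k of them should reproduce the Gamma(k, λ) mean k/λ and variance k/λ^2 stated below.

    import math, random

    lam, k, n_sims = 2.0, 3, 100_000

    def exponential(lam):
        # Inverse-cdf method: if U ~ Uniform(0, 1) then -log(1 - U)/lam ~ Exponential(lam).
        return -math.log(1.0 - random.random()) / lam

    samples = [exponential(lam) for _ in range(n_sims)]
    print(sum(samples) / n_sims, 1 / lam)    # both roughly 0.5

    sums = [sum(exponential(lam) for _ in range(k)) for _ in range(n_sims)]
    m = sum(sums) / n_sims
    v = sum((s - m)**2 for s in sums) / n_sims
    print(m, k / lam)       # both roughly 1.5
    print(v, k / lam**2)    # both roughly 0.75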

3. Gamma distribution

Notation: X ~ Gamma(k, λ).
Probability density function (pdf): f_X(x) = \frac{λ^k}{Γ(k)} x^{k-1} e^{-λx} for 0 < x < ∞, where Γ(k) = \int_0^∞ y^{k-1} e^{-y} dy (the Gamma function).
Cumulative distribution function: no closed form.
Mean: E(X) = \frac{k}{λ}.
Variance: Var(X) = \frac{k}{λ^2}.
Sum: if X_1, ..., X_n are independent, and X_i ~ Gamma(k_i, λ), then X_1 + ... + X_n ~ Gamma(k_1 + ... + k_n, λ).

4. Normal distribution

Notation: X ~ Normal(µ, σ^2).
Probability density function (pdf): f_X(x) = \frac{1}{\sqrt{2πσ^2}} e^{-(x-µ)^2 / (2σ^2)} for -∞ < x < ∞.
Cumulative distribution function: no closed form.
Mean: E(X) = µ.
Variance: Var(X) = σ^2.
Sum: if X_1, ..., X_n are independent, and X_i ~ Normal(µ_i, σ_i^2), then X_1 + ... + X_n ~ Normal(µ_1 + ... + µ_n, σ_1^2 + ... + σ_n^2).
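Finally, a small numerical check (illustrative parameter values, not from the notes) of the Normal sum rule: the sample mean and variance of X_1 + X_2 + X_3 should be close to µ_1 + µ_2 + µ_3 and σ_1^2 + σ_2^2 + σ_3^2.

    import random, statistics

    params = [(1.0, 2.0), (-0.5, 1.0), (3.0, 0.5)]   # (mu, sigma) pairs
    n_sims = 100_000
    sums = [sum(random.gauss(mu, sigma) for mu, sigma in params)
            for _ in range(n_sims)]
    print(statistics.mean(sums), sum(mu for mu, _ in params))        # roughly 3.5
    print(statistics.variance(sums), sum(s**2 for _, s in params))   # roughly 5.25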

[Figure: probability density functions for Uniform(a, b); Exponential(λ) with λ = 1 and λ = 2; Gamma(k, λ) with (k = 2, λ = 1) and (k = 2, λ = 0.3); and Normal(µ, σ^2) with σ = 2 and σ = 4.]