Remarks on Probability


In order to better understand theorems on average performance analyses, it is helpful to know a little about probability and random variables. This will also help in understanding how to design simulation experiments. The material is purposely simplified: no lies, but not the complete truth either. We restrict ourselves to discrete probability and discrete random variables.

Events

A probability space (for our limited purpose) is a set of events. To each event $E$ we assign a probability $P(E)$ ranging between 0 and 1. We can combine events in standard set-theoretic ways, viz., $E_1 \cup E_2$, $E_1 \cap E_2$, $\bar{E}$ (indeed we can even admit countable unions, etc.). So events form an algebra of sets. Here $E_1 \cup E_2$ is the event of either $E_1$ or $E_2$ occurring, $E_1 \cap E_2$ is that of both $E_1$ and $E_2$, and $\bar{E}$ is the event that some possibility other than $E$ occurs. Probability functions must obey some simple properties that are quite intuitive, e.g.,

$P(A \cup B) = P(A) + P(B) - P(A \cap B)$, and $P(\bar{A}) = 1 - P(A)$.
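These two properties are easy to check on a small finite space. Below is a minimal Python sketch (an illustration, not from the slides); the die sample space and the events A ("even") and B ("greater than 3") are illustrative choices.

```python
from fractions import Fraction

# Uniform sample space of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}

def prob(event):
    """P(E) = |E| / |omega| on a uniform finite space."""
    return Fraction(len(event), len(omega))

A = {2, 4, 6}   # illustrative event: "even outcome"
B = {4, 5, 6}   # illustrative event: "outcome greater than 3"

# P(A u B) = P(A) + P(B) - P(A n B)
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)
# P(complement of A) = 1 - P(A)
assert prob(omega - A) == 1 - prob(A)
print(prob(A | B))   # 2/3
```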

What is a probability?

This is a difficult question, fraught with philosophical controversy. We skirt around it by simply taking a view that is helpful in simulation contexts, without claiming that it is the truth. This can be called the frequentist view. To make the interpretation concrete, consider the event space of tossing a die. This has six outcomes, which are the events; we name them 1, 2, 3, 4, 5, 6. What do we mean when we say, e.g., $P(5) = 1/6$? It is that the chance of getting a 5 in a toss is 1/6. But what does that really mean?

A Gedanken-Experiment Answer

The frequentist answer is an operational one. Intuitively, the frequentist view is to imagine repeating the tossing experiment many times and define: $S_0 = 0$; if at the $i$-th toss we get a 5, set $S_i = S_{i-1} + 1$, otherwise $S_i = S_{i-1}$. This $S_n$ is a count of how many 5s have occurred up to the $n$-th toss. Then consider the ratio

$S_n / n$   (1)

The limit interpretation

As the number $n$ of tosses increases, we intuitively expect the ratio $S_n / n$ to gradually converge to some number. The frequentist interpretation is that this limit is the intended probability of the event "a 5 shows up in a toss"; i.e.,

$P(5) = \lim_{n \to \infty} S_n / n$   (2)

A fair die is one for which all the events 1, 2, 3, 4, 5 and 6 have this same limit, i.e., $P(i) = 1/6$ for each of them.
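The limit can be watched empirically. Here is a minimal Python sketch (an illustration, not from the lecture); the sample sizes and seed are arbitrary choices.

```python
import random

def estimate_p5(n, seed=0):
    """Estimate P(5) as S_n / n over n simulated tosses of a fair die."""
    rng = random.Random(seed)
    s_n = sum(1 for _ in range(n) if rng.randint(1, 6) == 5)
    return s_n / n

# The ratio S_n/n drifts towards 1/6 = 0.1666... as n grows.
for n in (100, 10_000, 1_000_000):
    print(n, estimate_p5(n))
```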

Probabilities are OK

The frequentist interpretation is consistent with the usual properties desired of a probability measure. Let us check just one, $P(\bar{A}) = 1 - P(A)$. For this, call each occurrence of the event $A$ a success in a long experiment of repeated trials. Define $S_i$ as before, but use it to count the number of successes; similarly, let $T_i$ count the number of failures (i.e., occurrences of $\bar{A}$).

Probabilities are OK, cont'd

Since $S_n + T_n = n$, we have

$P(\bar{A}) = \lim_{n \to \infty} \frac{T_n}{n} = \lim_{n \to \infty} \frac{n - S_n}{n} = 1 - \lim_{n \to \infty} \frac{S_n}{n} = 1 - P(A)$   (3)

Independence

Two events $A$ and $B$ are independent if the occurrence of one has no influence (positive or negative) on the other. It can be shown from the frequentist approach that this means $P(A \cap B) = P(A) \cdot P(B)$. (Comp 2711 people, reflect on this.) So, if I toss a fair coin and a fair die together, then since the outcomes are independent,

$P(\text{coin} = H \text{ and die} = 6) = \frac{1}{2} \cdot \frac{1}{6} = \frac{1}{12}$

If I toss the coin three times, each toss is independent of the others, so the probability of the outcome HHT is $P(H) \cdot P(H) \cdot P(T) = 1/8$.
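The product rule can likewise be checked by simulation. A minimal sketch, assuming a fair coin and a fair die simulated independently (the trial count and seed are arbitrary):

```python
import random

rng = random.Random(1)
trials = 1_000_000

# Count joint occurrences of (coin = H, die = 6) in independent tosses.
hits = sum(1 for _ in range(trials)
           if rng.random() < 0.5 and rng.randint(1, 6) == 6)

print(hits / trials)   # close to 1/2 * 1/6 = 1/12 = 0.0833...
```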

Random Variables

A random variable is a function that associates a number with each event. What use is this?

Example: A Payoff Problem. Suppose I play this game: a biased coin is tossed repeatedly. The probability of T is 1/3, so that of H is 2/3. I win $n$ dollars if (from the start) the following sequence of events is observed: T, T, T, ..., T, H, where the only H is on the $n$-th toss, all preceding tosses yielding T. The game stops when I win, so there are no sequences of the form TTHTTTH...; the game would already have stopped at TTH. What are my expected winnings?

Random Variables, cont'd

Now, each winning event for me is associated with a payoff (my winnings) of some dollar amount, which is a number. This association is the random variable. More formally: each winning event is some $n$-length sequence TTT...TTH, which I denote by $E(n)$. To say that my winnings for this event are $n$ dollars, we define the random variable $W : \text{Events} \to \text{Numbers}$ by

$W(E) = k$ if $E = E(k)$ for some $k$; $W(E) = 0$ otherwise.   (4)

Repeated Runs

The frequentist interpretation of the probability $P(E(n))$ of event $E(n)$ is that it is the proportion of times that $E(n)$ will occur if this game is played over and over again. So, if we play this game $N$ times, I will win roughly this dollar amount:

$S = 1 \cdot P(E(1)) \cdot N + 2 \cdot P(E(2)) \cdot N + \cdots + n \cdot P(E(n)) \cdot N + \cdots$   (5)

So the average amount I win over the $N$ games is

$S_A = \frac{1}{N}\left(1 \cdot P(E(1)) \cdot N + \cdots + n \cdot P(E(n)) \cdot N + \cdots\right) = 1 \cdot P(E(1)) + 2 \cdot P(E(2)) + \cdots + n \cdot P(E(n)) + \cdots$   (6)

This is the long-run average payoff for me.
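The long-run average is exactly what a simulation measures. A minimal sketch of the game (the function name play_once and the seed are my own choices, not from the slides):

```python
import random

def play_once(rng, p_heads=2/3):
    """Play one game: toss until the first H; winnings = number of tosses."""
    n = 1
    while rng.random() >= p_heads:   # tails, with probability 1/3
        n += 1
    return n

rng = random.Random(42)
N = 100_000
print(sum(play_once(rng) for _ in range(N)) / N)   # long-run average payoff
```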

Expectation = Long-Run Average

The last equation can be rewritten using the random variable $W$ as

$S_A = W(E(1)) \cdot P(E(1)) + W(E(2)) \cdot P(E(2)) + \cdots$   (7)

Generalizing this idea, suppose $E_1, E_2, \ldots$ are events on which we can define a random variable $X$; thus $X(E_n)$ is a number, the payoff for event $E_n$. The expectation (or expected value) of $X$ is the sum

$E(X) = \sum_i X(E_i) \cdot P(E_i)$   (8)

Often we get lazy and write this simply as $E(X) = \sum_i X(i) \cdot P(i)$.
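Equation (8) translates directly into code. A minimal sketch, representing a random variable by a dict mapping each value $X(E_i)$ to its probability $P(E_i)$ (a representation chosen here for illustration):

```python
from fractions import Fraction

def expectation(pmf):
    """E(X) = sum over i of X(E_i) * P(E_i), for pmf = {value: probability}."""
    return sum(x * p for x, p in pmf.items())

# The coin game, truncated at n = 30: winnings n with P(E(n)) = (1/3)**(n-1) * (2/3).
pmf = {n: Fraction(1, 3) ** (n - 1) * Fraction(2, 3) for n in range(1, 31)}
print(float(expectation(pmf)))   # approaches 3/2
```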

Back to the game

So, getting back to our motivating example: what are my expected winnings? Recall that we defined a random variable $W$ on the events $E(n)$ whose outcomes are the $n$-length sequences TTT...TTH:

$W(E) = k$ if $E = E(k)$ for some $k$; $W(E) = 0$ otherwise.   (9)

To use equation (8) to calculate the expectation of $W$ (which is the expected value of my winnings), all we need to do is find $P(E(n))$.

First Visit Problems

$P(E(n)) = \left(\frac{1}{3}\right)^{n-1} \cdot \frac{2}{3}$, since $P(T) = 1/3$ and $P(H) = 2/3$. Hence

$E(W) = \sum_{i=0}^{\infty} (i+1) \left(\frac{1}{3}\right)^{i} \cdot \frac{2}{3}$   (10)

We actually saw this before, when we considered the expected time to success in linear probing! This game is an abstract model of first visit problems: in each Bernoulli trial we have a probability of failure $q$ and of success $p$, and the expected number of trials to the first success is $\sum_{i=0}^{\infty} (i+1) \, q^i \, p$, which we showed how to evaluate in the lecture that analysed the naive model of linear probing.
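Numerically, the partial sums of this series settle quickly at the standard closed form $1/p$. A minimal sketch (the truncation at 200 terms is an arbitrary choice):

```python
def expected_trials(p, terms=200):
    """Partial sum of sum_{i>=0} (i+1) * q**i * p: expected trials to first success."""
    q = 1 - p
    return sum((i + 1) * q**i * p for i in range(terms))

# For the coin game, p = P(H) = 2/3, so E(W) = 1/p = 3/2.
print(expected_trials(2/3))   # ~1.5
```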

Gambling

Modern probability theory originated with Pascal, who worked out gambling odds for his friends. It was later axiomatized by Kolmogorov. Pascal was a religious mathematician, a rare breed today. He used the expectation of a random variable as an argument that one should be a theist.

Pascal's Wager: by the expectation of a random variable, you should believe that God exists.

Pascal's Wager Proof

Let
G be the event that you did not believe in God, but he exists;
H be the event that you did not believe in God, and he does not exist;
I be the event that you did believe in God, and he exists;
J be the event that you did believe in God, but he does not exist.

Event G leads to a big penalty, which you find out too late, when you die; say this payoff is $-2^{1,000,000,000,000,000,000,000}$. Event I leads to a big reward, with you landing in Heaven surrounded by cherubs and angels, basking in eternal happiness; say this payoff is $2^{1,000,000,000,000,000,000,000}$.

Pascal's Wager Proof, cont'd

Events H and J are neutral: you neither gain nor lose anything. What is your expected payoff, with different decisions leading to different probabilities for the events? The expected payoff is

$2^{1,000,000,000,000,000,000,000} \cdot P(I) - 2^{1,000,000,000,000,000,000,000} \cdot P(G) + 0 \cdot P(H) + 0 \cdot P(J)$

But I is actually the conjunction of two independent atomic events: Bel, that you believe in God, and GE, that God exists; i.e., $I = Bel \cap GE$, and correspondingly $G = \overline{Bel} \cap GE$. So $P(I) = P(Bel) \cdot P(GE)$ and $P(G) = P(\overline{Bel}) \cdot P(GE)$.

Pascal's Wager Proof, cont'd

By symmetry (or agnosticism), $P(GE) = P(\overline{GE}) = 0.5$. Suppose you believe, i.e., $P(Bel) = 1$ (so $P(\overline{Bel}) = 0$); then $P(I) = 0.5$ and $P(G) = 0$, and your expected payoff is $\frac{1}{2} \cdot 2^{1,000,000,000,000,000,000,000}$. On the contrary, if you do not believe, your expected payoff is $-\frac{1}{2} \cdot 2^{1,000,000,000,000,000,000,000}$. This choice is under your control! QED.
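For what it is worth, the arithmetic can be mechanized. A minimal sketch (REWARD is a small stand-in constant, since the slide's payoff is astronomically large; the function name is my own):

```python
REWARD = 2**100   # stand-in for 2**1,000,000,000,000,000,000,000

def expected_payoff(p_bel, p_ge=0.5):
    """E = REWARD * P(I) - REWARD * P(G), with I = Bel & GE and G = ~Bel & GE."""
    return REWARD * (p_bel * p_ge) - REWARD * ((1 - p_bel) * p_ge)

print(expected_payoff(p_bel=1.0))   # +REWARD/2 if you believe
print(expected_payoff(p_bel=0.0))   # -REWARD/2 if you do not
```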

Other Examples of Expectation of a RV

Pokies. You play the pokies. On each trial you win $P$ dollars with probability $p$ and lose $Q$ dollars with probability $q = 1 - p$. What is your expected payoff $W$?

$W = P \cdot p - Q \cdot q = (P + Q) \cdot p - Q$

The club will adjust $P$, $Q$ and $p$ so that your payoff is slightly negative!
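A quick check with made-up club settings (the numbers P = 10, Q = 1, p = 0.09 are purely illustrative):

```python
def pokies_payoff(P, Q, p):
    """Expected payoff per trial: win $P with probability p, lose $Q otherwise."""
    return (P + Q) * p - Q

print(pokies_payoff(P=10, Q=1, p=0.09))   # about -0.01: slightly negative
```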

St Petersburg Paradox

Expectations need not be finite. A fair coin is used in Bernoulli trials, and I pay $5 per trial to play. I win on the $n$-th trial if H is the result on that trial and it is the first time it has occurred. On such an event, call it $E_n$, I collect $2^n$ dollars; my net gain is then $G(n) = 2^n - 5n$. What are my expected winnings? Define a RV $X$ such that $X(E_n) = 2^n - 5n$. The probability of $E_n$ is $\left(\frac{1}{2}\right)^n$. Hence the expectation of $X$ is

$\sum_{n=1}^{\infty} G(n) \, P(E_n) = \sum_{n=1}^{\infty} \left(\frac{1}{2}\right)^{n} (2^n - 5n) = \sum_{n=1}^{\infty} \left(1 - \frac{5n}{2^n}\right)$,

which is unbounded, since the terms tend to 1.
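A numerical look at the divergence (a sketch; the cut-offs are arbitrary): since each term $1 - 5n/2^n$ tends to 1, the partial sums grow roughly like $N$.

```python
def st_petersburg_partial(N):
    """Partial sum of sum_{n>=1} (1/2)**n * (2**n - 5*n) = sum_{n>=1} (1 - 5*n/2**n)."""
    return sum(1 - 5 * n / 2**n for n in range(1, N + 1))

for N in (10, 100, 1000):
    print(N, st_petersburg_partial(N))   # ~0.06, ~90, ~990: unbounded growth
```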