August 18, 2006
Contents: Application: 1-d diffusion
Definition

Consider $M$ discrete events $x = \{x_i\}$, $i = 1, 2, \ldots, M$. The probability for the occurrence of $x_i$ is
$$P(x_i) = \frac{N_i}{N} \quad (N \to \infty),$$
where $N_i$ is the number of occurrences of $x_i$, and $N = \sum_{i=1}^{M} N_i$ is the total occurrence of all the events. $P(x_i)$ has the following properties:
$$P(x_i) \ge 0, \qquad \sum_{i=1}^{M} P(x_i) = 1.$$
Mean, standard deviation, and moments

Mean: $\langle f(x) \rangle = \sum_{i=1}^{M} f(x_i) P(x_i)$
1st moment: $\langle x \rangle = \sum_{i=1}^{M} x_i P(x_i)$
2nd moment: $\langle x^2 \rangle = \sum_{i=1}^{M} x_i^2 P(x_i)$
...
$n$th moment: $\langle x^n \rangle = \sum_{i=1}^{M} x_i^n P(x_i)$
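As a quick numerical sketch of these formulas (the fair six-sided die is our own example, not from the notes):

```python
# Moments of a discrete distribution: <x^n> = sum_i x_i^n P(x_i),
# illustrated with a fair six-sided die (an assumed example).
import numpy as np

x = np.arange(1, 7)           # events x_i = 1, ..., 6
P = np.full(6, 1 / 6)         # P(x_i) = 1/6 for a fair die

assert np.isclose(P.sum(), 1.0)       # normalization: sum_i P(x_i) = 1
mean = np.sum(x * P)                  # 1st moment <x>
second = np.sum(x**2 * P)             # 2nd moment <x^2>
variance = second - mean**2           # <x^2> - <x>^2

print(mean, second, variance)         # 3.5, 15.1666..., 2.9166...
```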
Definition

For a continuous variable $x \in [a, b]$, we can define the density of occurrence $n(x)$, so that $n(x)\,\Delta x$ denotes the number of occurrences in a small interval $[x, x + \Delta x]$. In the limit $\Delta x \to 0$, the total occurrence is $N = \int_a^b dx\, n(x)$. We then define the probability density function $\rho(x) = n(x)/N$. Then $\rho(x)\,\Delta x = n(x)\,\Delta x / N$ is the probability of $x$ falling into the small interval $[x, x + \Delta x]$: $P(x \in [x, x + \Delta x])$. When the interval is not small, $P(x \in [x_1, x_2]) = \int_{x_1}^{x_2} dx\, \rho(x)$. Obviously we have $\int_a^b dx\, \rho(x) = 1$.
Mean, variance, moments

Mean: $\langle f(x) \rangle = \int_a^b dx\, f(x) \rho(x)$
1st moment: $\langle x \rangle = \int_a^b dx\, x \rho(x)$
2nd moment: $\langle x^2 \rangle = \int_a^b dx\, x^2 \rho(x)$
...
$n$th moment: $\langle x^n \rangle = \int_a^b dx\, x^n \rho(x)$
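These integrals can be checked numerically; a minimal sketch, assuming (our choice) the uniform density $\rho(x) = 1$ on $[0, 1]$, whose exact $n$th moment is $1/(n+1)$:

```python
# Continuous moments <x^n> = integral of x^n rho(x) dx, evaluated with a
# plain trapezoidal sum for the assumed density rho(x) = 1 on [0, 1].
import numpy as np

a, b, m = 0.0, 1.0, 100_000
x = np.linspace(a, b, m)
rho = np.ones_like(x)                 # rho(x) = 1 on [0, 1]
dx = x[1] - x[0]

def moment(n):
    """n-th moment: trapezoidal approximation of the integral."""
    f = x**n * rho
    return dx * (f.sum() - 0.5 * (f[0] + f[-1]))

print(moment(0), moment(1), moment(2))   # ~1, ~0.5, ~0.3333
```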
Joint probability distribution

For two sets of events $x = \{x_i\}$, $i = 1, 2, \ldots$, and $y = \{y_k\}$, $k = 1, 2, \ldots$, the combined events form a set $(x, y) = \{(x_i, y_k)\}$, $i, k = 1, 2, \ldots$. The probability of the combined event $(x_i, y_k)$ is denoted by $P(x_i, y_k)$ and is called the joint probability. For a continuous distribution, the corresponding quantity is the joint probability density function $\rho(x, y)$. The joint probability is normalized. If $x$ and $y$ are independent, we have the following:

1. $P(x_i, y_k) = P(x_i) P(y_k)$; $\rho(x, y) = \rho(x) \rho(y)$.
2. $\langle xy \rangle = \langle x \rangle \langle y \rangle$; $\langle ((x+y) - \langle x+y \rangle)^2 \rangle = \langle (x - \langle x \rangle)^2 \rangle + \langle (y - \langle y \rangle)^2 \rangle$.
3. $\mathrm{cov}(x, y) = \int dx\, dy\, (x - \langle x \rangle)(y - \langle y \rangle) \rho(x, y) = 0$.
Addition and multiplication rules

The addition rule applies to mutually exclusive events: for a discrete distribution, $P(\text{either } x_i \text{ or } x_j) = P(x_i) + P(x_j)$. For a continuous distribution, $P(x \in [a, b] \text{ or } x \in [c, d]) = \int_a^b dx\, \rho(x) + \int_c^d dx\, \rho(x)$, provided $[a, b]$ and $[c, d]$ don't overlap.

The multiplication rule applies to independent events $x$ and $y$: for a discrete distribution, $P(x_i, y_j) = P(x_i) P(y_j)$. For a continuous distribution, $\rho(x, y) = \rho(x) \rho(y)$.
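The independence properties above can be illustrated by sampling; a sketch in which the two distributions and sample sizes are our own choices:

```python
# For independent x and y: <xy> ~= <x><y> and the covariance ~= 0.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 1_000_000)      # x: uniform on [0, 1]
y = rng.normal(3, 2, 1_000_000)       # y: Gaussian, drawn independently

cov = np.mean((x - x.mean()) * (y - y.mean()))   # sample covariance
print(cov)                                       # close to 0
print(np.mean(x * y), x.mean() * y.mean())       # nearly equal
```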
Binomial distribution

Experiment of throwing a coin: each throw has a fixed probability $p$ of landing face up and $1 - p$ of landing face down. If we throw $N$ times independently (meaning the previous throw does not affect the next), the probability of finding the coin face up $n$ times is easily derived to be
$$P(n) = \frac{N!}{n!(N-n)!} p^n (1-p)^{N-n},$$
where $p^n (1-p)^{N-n}$ is the probability of having $n$ specified coins face up, and $\frac{N!}{n!(N-n)!}$ is the number of ways we can choose $n$ coins out of the total number $N$.
Binomial distribution

Memorize the binomial theorem: $(a + b)^N = \sum_{k=0}^{N} \frac{N!}{k!(N-k)!} a^k b^{N-k}$.
Normalization: prove the binomial distribution is normalized, $\sum_{n=0}^{N} P(n) = 1$ (hint: binomial theorem).
1st moment: memorize $\langle n \rangle = pN$ (hint: partial derivative with respect to $p$).
2nd moment: $\langle n^2 \rangle = (pN)^2 + Np(1-p)$.
Variance: memorize $\sigma_N^2 = \langle (n - \langle n \rangle)^2 \rangle = \langle n^2 \rangle - \langle n \rangle^2 = Npq$, where $q = 1 - p$.
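These identities can be verified directly from the probabilities; a small check (the values of $N$ and $p$ are arbitrary choices for the demo):

```python
# Check normalization, <n> = pN, and variance = Npq for the binomial
# distribution P(n) = C(N, n) p^n (1-p)^(N-n).
from math import comb

N, p = 20, 0.3
P = [comb(N, n) * p**n * (1 - p)**(N - n) for n in range(N + 1)]

total = sum(P)                                            # -> 1
mean = sum(n * Pn for n, Pn in enumerate(P))              # -> pN = 6.0
var = sum(n**2 * Pn for n, Pn in enumerate(P)) - mean**2  # -> Npq = 4.2

print(total, mean, var)   # ~1.0, 6.0, 4.2
```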
Gaussian distribution

$$\rho(x) = \frac{1}{\sigma \sqrt{2\pi}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \quad x \in (-\infty, \infty).$$
Please memorize the famous Gaussian integral formula $I(b) = \int_{-\infty}^{\infty} dy\, e^{-by^2} = \sqrt{\pi/b}$, where $b > 0$. Please show that $-\frac{dI(b)}{db} = \int_{-\infty}^{\infty} dy\, y^2 e^{-by^2}$. Please prove the following: $\int_{-\infty}^{\infty} dx\, \rho(x) = 1$; $\langle x \rangle = \mu$. Please compute the following: $\langle x^2 \rangle$, $\langle (x - \langle x \rangle)^2 \rangle$, and $\langle x^2 \rangle - \langle x \rangle^2$. Please show that $\langle (x - \langle x \rangle)^2 \rangle = \langle x^2 \rangle - \langle x \rangle^2$.
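Before working these out analytically, it can help to confirm them numerically; a sketch with arbitrarily chosen $\mu$ and $\sigma$, using a plain trapezoidal sum:

```python
# Numerical check of the Gaussian normalization, mean, and variance.
import numpy as np

mu, sigma = 1.5, 0.7
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)
rho = np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def integrate(f):
    """Trapezoidal approximation of the integral of f over x."""
    dx = x[1] - x[0]
    return dx * (f.sum() - 0.5 * (f[0] + f[-1]))

norm = integrate(rho)                       # -> 1
mean = integrate(x * rho)                   # -> mu = 1.5
var = integrate(x**2 * rho) - mean**2       # -> sigma^2 = 0.49
print(norm, mean, var)
```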
More on the Gaussian distribution

It is a good approximation to the binomial distribution at large $N$ and $Np$. In statistical physics, $\rho(x) \propto e^{-E(x)/k_B T}$. In many cases we are dealing with a harmonic potential $E(x) = \frac{1}{2} k x^2$, for which $\rho(x)$ is Gaussian.

Central limit theorem: the sample mean $\bar{x}_N = \frac{1}{N} \sum_{i=1}^{N} x_i$ itself satisfies a Gaussian distribution with $\mu = \langle x \rangle$ and $\sigma = \sigma_x / \sqrt{N}$. The central limit theorem says that data which are influenced by many small and unrelated random effects are approximately normally distributed.
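The central limit theorem can be seen directly by averaging samples; a sketch in which the uniform parent distribution and sample sizes are our own choices:

```python
# CLT sketch: means of N uniform(0, 1) samples cluster around mu = 0.5
# with spread sigma_x / sqrt(N), where sigma_x = sqrt(1/12).
import numpy as np

rng = np.random.default_rng(1)
N, trials = 100, 50_000
samples = rng.uniform(0, 1, (trials, N))
xbar = samples.mean(axis=1)          # one sample mean per trial

sigma_x = np.sqrt(1 / 12)            # std of a single uniform(0, 1) draw
print(xbar.mean())                   # ~0.5        (= <x>)
print(xbar.std())                    # ~sigma_x / sqrt(N) ~= 0.0289
```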
Binomial-Gaussian approximation
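A quick numerical comparison of the two distributions (the values of $N$ and $p$ are our choices, large enough for the approximation to hold):

```python
# Compare the binomial P(n) with the Gaussian of the same mean pN and
# variance Npq: at large N the two nearly coincide.
from math import comb, exp, pi, sqrt

N, p = 100, 0.5
mu, var = p * N, N * p * (1 - p)

worst = 0.0
for n in range(N + 1):
    binom = comb(N, n) * p**n * (1 - p)**(N - n)
    gauss = exp(-(n - mu)**2 / (2 * var)) / sqrt(2 * pi * var)
    worst = max(worst, abs(binom - gauss))

print(worst)   # small: the largest pointwise difference over all n
```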
Application: 1-d random walk

A particle moves along a line with a step size $b$ for each movement. The probabilities of moving to the left or to the right are equal ($p = 0.5$). The net displacement from its original position after $N$ steps (we can assume a large $N$) is $s = (n_r - n_l) b$, where $n_r$ and $n_l$ are the numbers of right and left steps. What is the distribution function of $s$?
1-d random walk

Since $n_r$ follows a binomial distribution with $p = 1/2$ and $n_r + n_l = N$, we have $s = (2 n_r - N) b$, so $\langle s \rangle = 0$ and $\langle s^2 \rangle = 4 b^2 \langle (n_r - N/2)^2 \rangle = N b^2$. For large $N$, $\rho(s)$ is Gaussian with $\sigma = b \sqrt{N}$.
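A simulation sketch of this result (the number of walkers is our choice):

```python
# Simulate many N-step random walks with step size b and check
# <s> = 0 and <s^2> = N * b^2.
import numpy as np

rng = np.random.default_rng(2)
N, b, walkers = 1_000, 1.0, 20_000
steps = rng.choice([-b, b], size=(walkers, N))   # left/right with p = 0.5
s = steps.sum(axis=1)                            # net displacement per walker

print(s.mean())        # ~0
print((s**2).mean())   # ~N * b^2 = 1000
```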
1-d random walk: diffusion

If the walker takes one step per time interval $\tau$, then after time $t$ it has made $N = t/\tau$ steps, so $\langle s^2 \rangle = (b^2/\tau)\, t \equiv 2 D t$, which identifies the diffusion constant $D = b^2 / 2\tau$.
Diffusion in 2-d