
Lecture 10 – Introduction to Probability

Date post: 25-Feb-2016
Upload: kalli
Transcript
Page 1: Lecture 10 – Introduction to Probability

Lecture 10 – Introduction to Probability

Topics
• Events, sample space, random variables
• Examples
• Probability distribution function
• Conditional probabilities
• Exponential distribution
• Poisson distribution

Page 2: Lecture 10 – Introduction to Probability

The Monty Hall Problem

There is $1,000,000 behind one door & $0.00 behind the other two.

You “reserve” a door (say #1) but it remains closed. Then, Monty opens one of the other two doors (say #2).

The door Monty opens will be empty!

(Monty wants to keep the money.)

#1 #2 #3

Page 3: Lecture 10 – Introduction to Probability

What’s the best strategy?

1. Stay with the original door we chose?

2. Switch to the other unopened door?

We are interested in probability as it relates to helping us make good or optimal decisions.

Page 4: Lecture 10 – Introduction to Probability

Coins & Drawers

One drawer has 2 gold coins
One drawer has 1 gold & 1 silver coin
One drawer has 2 silver coins

You select a drawer at random and then randomly select one coin

It turns out that your coin is gold

What is the probability that the other coin is gold?

Another Example

Page 5: Lecture 10 – Introduction to Probability

Probability Overview

The sample space, S = the set of all possible outcomes of an experiment.

Events are subsets of the sample space. An event occurs if any of its elements occur when the experiment is run.

Two events A & B are mutually exclusive if their intersection is empty. Elements of S are called realizations, outcomes, sample points, or scenarios.

The choice of the sample space depends on the question that you are trying to answer.

Page 6: Lecture 10 – Introduction to Probability

Two possible sample spaces are:
S1 = { (1,1), (1,2), (1,3), …, (6,4), (6,5), (6,6) }
S2 = { 2, 3, 4, . . . , 11, 12 } (sum of the two values)

Examples of Events:

A1 = “the sum of the face values is 3”
Under S1: A1 = { (1,2), (2,1) }; Under S2: A1 = { 3 }

A2 = “one die has value 3 & the other has value 1”

Roll Two Dice

Under S1 : A2 = { (1,3), (3,1) } ; Under S2 : not an event

Page 7: Lecture 10 – Introduction to Probability

Probability

Intuitive: the proportion of time that an event occurs when the same experiment is repeated.

Mathematical: A probability measure P is defined on the set of all events and satisfies the following:

(1) 0 ≤ P(A) ≤ 1 for all A ⊆ S
(2) P(S) = P(A1 ∪ A2 ∪ · · · ) = 1
(3) Mutually exclusive events Ai imply that P(A1 ∪ A2 ∪ · · · ) = P(A1) + P(A2) + · · ·
(4) P(Ā) = 1 − P(A), where Ā = the complement of A
(5) P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
(6) If A & B are independent, then P(A ∩ B) = P(A)P(B)

Page 8: Lecture 10 – Introduction to Probability

(7) The conditional probability of event A given B is

P(A|B) = P(A ∩ B) / P(B)

(8) If A & B are independent events, then

P(A|B) = P(A ∩ B) / P(B) = P(A)P(B) / P(B) = P(A).

In this case B gives “no information” as to whether A will occur.

Page 9: Lecture 10 – Introduction to Probability

Probability Calculations

Example: toss a fair coin.
Sample space: S = { H, T }
Event H: a head appears; Event T: a tail appears

Intuitively, P(H) = P(T) = ½. How do we prove it mathematically?

Proof: S = H ∪ T and H ∩ T = ∅, so

1 = P(S) = P(H ∪ T)   [by (2)]
  = P(H) + P(T)   [by (3)]

Since the coin is fair, P(H) = P(T), so P(H) = P(T) = 1/2.

Page 10: Lecture 10 – Introduction to Probability

Basic Conditional Probabilities

The conditional probability of event A given B is

P(A|B) = P(A ∩ B) / P(B)

Example: You roll 2 dice and are told that the sum is 7. What is the probability that the first die is 2?

Define Event B: the sum of the two dice is 7, and Event A: the first die is 2.

B = { (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) } and A ∩ B = { (2,5) }

P(A|B) = P(A ∩ B) / P(B) = (1/36) / (6/36) = 1/6
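The counting argument on this slide can be checked by brute-force enumeration. A minimal sketch in Python (not part of the original lecture; the variable names are my own):

```python
# P(A | B) for two fair dice, with B = "sum is 7" and A = "first die is 2".
# Enumerates the 36 equally likely sample points and counts directly.
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), repeat=2))        # the sample space S1
B = [(i, j) for i, j in S if i + j == 7]        # sum of the faces is 7
A_and_B = [(i, j) for (i, j) in B if i == 2]    # ...and the first die is 2

p = Fraction(len(A_and_B), len(B))              # P(A|B) = |A ∩ B| / |B|
print(p)                                        # 1/6
```

Using `Fraction` keeps the answer exact, matching the slide's 1/6 rather than a float approximation.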

Page 11: Lecture 10 – Introduction to Probability

You roll 2 dice and double your money if the sum is ≥ 8 and lose if the sum is ≤ 7.

However, the rolls occur in sequence and you do not get to observe the first roll. You are simply told either “the value is ≥ 4” or “the value is ≤ 3”.

After being told this you get to place your bet.

A = event that you win = { (2,6), (3,5), (3,6), (4,4), (4,5), (4,6), (5,3), (5,4), (5,5), (5,6), (6,2), (6,3), (6,4), (6,5), (6,6) }

B = you are told “≥ 4” after the first roll = { (4,1), (4,2), (4,3), (4,4), (4,5), (4,6), (5,1), (5,2), (5,3), (5,4), (5,5), (5,6), (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }

Computing Conditional Probabilities

Page 12: Lecture 10 – Introduction to Probability

The goal is to calculate win = P(A|B) and lose = P(Ā|B), where Ā is the complement of A.

P(A|B) = P(A ∩ B) / P(B)

A ∩ B = { (4,4), (4,5), (4,6), (5,3), (5,4), (5,5), (5,6), (6,2), (6,3), (6,4), (6,5), (6,6) }

Each realization (i, j) is equally likely, so

P(A|B) = |A ∩ B| / |B| = (12/36) / (18/36) = 2/3

In a similar manner, we can show that P(Ā|B) = 6/18 = 1/3.

Note that, as it must, P(Ā|B) = 1 − P(A|B).
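The same enumeration idea handles the betting game. This sketch is my own code, assuming the “win if sum ≥ 8, told whether the first roll is ≥ 4” reading of the slide:

```python
# Conditional win/lose probabilities for the dice betting game:
# A = "sum >= 8" (you win), B = "you are told the first roll is >= 4".
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))
A = {s for s in S if sum(s) >= 8}       # winning outcomes
B = {s for s in S if s[0] >= 4}         # first roll shows 4 or more

p_win = Fraction(len(A & B), len(B))    # P(A|B)  = |A ∩ B| / |B|
p_lose = Fraction(len(B - A), len(B))   # P(Ā|B) = |Ā ∩ B| / |B|
print(p_win, p_lose)                    # 2/3 1/3
```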

Page 13: Lecture 10 – Introduction to Probability

More Conditional Probability Calculations

P(A|B) = P(A ∩ B) / P(B)  and  P(B|A) = P(B ∩ A) / P(A)

so

P(A ∩ B) = P(A|B) P(B)  and  P(B ∩ A) = P(B|A) P(A)

This leads to Bayes’ Theorem:

P(A|B) = P(B|A) P(A) / P(B)

Page 14: Lecture 10 – Introduction to Probability

Example: Coins & Drawers

Drawer 1 has 2 gold coins
Drawer 2 has 1 gold & 1 silver coin
Drawer 3 has 2 silver coins

D1 = event that we pick drawer 1
G1 = event that the first coin we select is gold

P(D1|G1) = probability that both coins are gold, given that the first is gold

Page 15: Lecture 10 – Introduction to Probability

Coin & Drawer Computations

P(Di) = 1/3, i = 1, 2, 3

P(G1) = P(G1|D1) P(D1) + P(G1|D2) P(D2) + P(G1|D3) P(D3) = (1)(1/3) + (1/2)(1/3) + (0)(1/3)

P(D1|G1) = P(G1|D1) P(D1) / P(G1) = (1)(1/3) / [(1)(1/3) + (1/2)(1/3) + (0)(1/3)] = 2/3
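A Monte Carlo check of the Bayes calculation, under the drawer contents stated on the slide (the simulation code is mine, not the lecture’s):

```python
# Simulate: pick a drawer at random, then a coin at random from it.
# Among trials where the first coin is gold, estimate how often the
# other coin is also gold (i.e., we picked drawer 1).
import random

random.seed(1)
drawers = [("G", "G"), ("G", "S"), ("S", "S")]

gold_first = 0
other_also_gold = 0
for _ in range(100_000):
    drawer = random.choice(drawers)            # P(Di) = 1/3
    first, second = random.sample(drawer, 2)   # random coin order
    if first == "G":
        gold_first += 1
        if second == "G":
            other_also_gold += 1

print(other_also_gold / gold_first)            # close to 2/3
```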

Page 16: Lecture 10 – Introduction to Probability

Example: The Monty Hall Problem

There is $1,000,000 behind one of the doors & $0.00 behind the others.

#1 #2 #3

Say you pick door #1 and then Monty opens one of the other two doors.

If you don’t switch, P(win) = 1/3.

Optimal Strategy: Switch to the door that remains closed; P(win) = 2/3.

Page 17: Lecture 10 – Introduction to Probability

Events:
D1 = you end up choosing door 1
D2 = you end up choosing door 2
D3 = you end up choosing door 3
L = prize is behind door 1
M = prize is behind door 2
R = prize is behind door 3

Event that you win: W = (D1 ∩ L) ∪ (D2 ∩ M) ∪ (D3 ∩ R)

These are mutually exclusive, so

P(W) = P(D1 ∩ L) + P(D2 ∩ M) + P(D3 ∩ R)

= P(D1|L) P(L) + P(D2|M) P(M) + P(D3|R) P(R) = (0)(1/3) + (1)(1/3) + (1)(1/3) = 2/3
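The 2/3 answer can also be confirmed by simulating the always-switch strategy. A small sketch (my own code; the helper name `play_switch` is illustrative):

```python
# Simulate one Monty Hall game under the "always switch" strategy.
import random

random.seed(1)

def play_switch() -> bool:
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens an empty door that is neither the prize nor your pick.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    # Switch to the one remaining closed door.
    pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

wins = sum(play_switch() for _ in range(100_000))
print(wins / 100_000)                          # close to 2/3
```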

Page 18: Lecture 10 – Introduction to Probability

Random Variables

An R.V. is a real-valued function defined on the sample space S.

Example: toss two dice.
Sample space: (1,1), . . . , (6,6)
Quantity of interest: the sum of the two dice, 2, 3, . . . , 12
RV: a function that maps the sample space to the quantity of interest

Define X as the RV given by the sum of 2 fair dice:
P{X = 2} = P{ (1,1) } = 1/36
P{X = 3} = P{ (1,2), (2,1) } = 2/36
P{X = 4} = P{ (1,3), (2,2), (3,1) } = 3/36
...
P{X = 7} = P{ (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) } = 6/36
...
P{X = 12} = P{ (6,6) } = 1/36
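The pmf above can be generated mechanically by counting sample points. A short Python sketch (mine, not from the slides):

```python
# Build the pmf of X = sum of two fair dice by counting sample points.
from collections import Counter
from fractions import Fraction
from itertools import product

counts = Counter(i + j for i, j in product(range(1, 7), repeat=2))
pmf = {x: Fraction(n, 36) for x, n in sorted(counts.items())}

print(pmf[2], pmf[7], pmf[12])   # 1/36 1/6 1/36  (6/36 reduces to 1/6)
```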

Page 19: Lecture 10 – Introduction to Probability

Classification of Random Variables

Discrete random variables: take on a finite or countable number of possible values.

Probability mass function (pmf): the probability of an outcome, p(a) = P{X = a}

Examples: Bernoulli, Binomial, Geometric, and Poisson

Continuous random variables: take on an uncountable number of possible values.

Probability density function (pdf): f, where F(a) = P{X ≤ a} = ∫_{−∞}^{a} f(x) dx

Examples: Uniform, Exponential, Gamma, and Normal

Page 20: Lecture 10 – Introduction to Probability

Expected Value

Discrete random variable, X:  E[X] = Σ_{x ∈ S} x p(x)

Continuous RV, X:  E[X] = ∫_{−∞}^{∞} x f(x) dx

Page 21: Lecture 10 – Introduction to Probability

This is the most frequently used distribution for modeling interarrival and service times in queueing systems. In many applications, it is regarded as doing a good job of balancing realism and mathematical tractability.

The Exponential Distribution

• Time between customer arrivals at an ATM• Time until a machine in a workcenter breaks• Time between calls to a reservation system

Page 22: Lecture 10 – Introduction to Probability

Exponential Random Variable, T

Let λ be the parameter (rate) of the exponential distribution.

Probability density function (pdf):

f(t) = λ e^(−λt) for t ≥ 0, and f(t) = 0 for t < 0

F(t) = Pr{T ≤ t} = ∫_{0}^{t} λ e^(−λu) du = 1 − e^(−λt)   (t ≥ 0)

Expected value of T:

E[T] = ∫_{0}^{∞} t f(t) dt = ∫_{0}^{∞} λ t e^(−λt) dt = 1/λ

Variance of T:

Var[T] = E[(T − 1/λ)²] = ∫_{0}^{∞} (t − 1/λ)² λ e^(−λt) dt = 1/λ²
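A quick numerical sanity check of E[T] = 1/λ and Var[T] = 1/λ², using Python’s standard-library exponential generator (the sampling code is my own; λ = 3 is an arbitrary choice):

```python
# Sample an exponential RV and compare the empirical mean and variance
# to the closed-form values 1/lam and 1/lam**2.
import random

random.seed(1)
lam = 3.0
n = 200_000
sample = [random.expovariate(lam) for _ in range(n)]

mean = sum(sample) / n                            # should be near 1/3
var = sum((t - mean) ** 2 for t in sample) / n    # should be near 1/9
print(round(mean, 3), round(var, 3))
```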

Page 23: Lecture 10 – Introduction to Probability

Memoryless Property of an Exponential Random Variable

Pr{T > a + b | T > b} = Pr{T > a}   for all a, b ≥ 0

This says that if we’ve already waited b units of time and the event has not occurred (e.g., the next customer hasn’t arrived) then the probability distribution governing the remaining time until its occurrence is the same as it would be if the system were restarted.

That is, the process “forgets” that the event has not yet occurred. This is often a good assumption when the time of the next arrival is not “influenced” by the last arrival.

Page 24: Lecture 10 – Introduction to Probability

Proof: Since P(A|B) = P(A ∩ B) / P(B),

P(T > a + b | T > b) = P(T > a + b and T > b) / P(T > b) = P(T > a + b) / P(T > b)

= e^(−λ(a+b)) / e^(−λb) = e^(−λa) e^(−λb) / e^(−λb) = e^(−λa) = P(T > a)
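The memoryless property can also be observed empirically. This sketch (my own, with arbitrary choices of a, b, and λ) compares the conditional survival probability with the unconditional one:

```python
# Empirical check that P(T > a+b | T > b) ≈ P(T > a) = e^(-lam*a)
# for an exponential random variable T.
import math
import random

random.seed(1)
lam, a, b = 3.0, 0.2, 0.5
sample = [random.expovariate(lam) for _ in range(500_000)]

survived = [t for t in sample if t > b]                  # condition on T > b
lhs = sum(t > a + b for t in survived) / len(survived)   # P(T > a+b | T > b)
rhs = math.exp(-lam * a)                                 # P(T > a)
print(round(lhs, 3), round(rhs, 3))                      # both near 0.549
```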

Page 25: Lecture 10 – Introduction to Probability

Using the Exponential Distribution Calls arrive at an emergency hotline switchboard at a rate of 3 per hour. It is now 8 AM.

(a) What is the probability that the first call arrives after 8:30 AM?

(b) What is the probability that the first call arrives between 8:15 and 8:45 AM?

(c) Given that no calls have arrived by 8:30 AM, what is the probability that one or more calls arrive by 9 AM?

Page 26: Lecture 10 – Introduction to Probability

Solutions

Let T = the interarrival time random variable. Assume T is exponentially distributed with λ = 3, so
F(t) = P(T ≤ t) = 1 − e^(−3t) and f(t) = 3 e^(−3t)

(a) P(T > ½) = e^(−3(1/2)) = 0.223

(b) P(¼ < T < ¾) = ∫_{1/4}^{3/4} 3 e^(−3u) du = F(¾) − F(¼)
= [1 − e^(−9/4)] − [1 − e^(−3/4)] = e^(−3/4) − e^(−9/4) = 0.367

(c) P(T ≤ 1 | T > ½) = 1 − P(T > 1 | T > ½)
= 1 − P(T > ½)   (memoryless property)
= 1 − e^(−3(1/2)) = 0.777
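The three answers can be reproduced directly from the cdf. A short check in Python (my own code):

```python
# Hotline example with lam = 3 calls/hour: evaluate (a), (b), (c)
# from the exponential cdf F(t) = 1 - e^(-lam*t).
import math

lam = 3.0

def F(t: float) -> float:
    """cdf of the exponential interarrival time T."""
    return 1 - math.exp(-lam * t)

a = 1 - F(0.5)                                 # (a) P(T > 1/2)
b = F(0.75) - F(0.25)                          # (b) P(1/4 < T < 3/4)
c = F(0.5)                                     # (c) memoryless: 1 - P(T > 1/2)
print(round(a, 3), round(b, 3), round(c, 3))   # 0.223 0.367 0.777
```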

Page 27: Lecture 10 – Introduction to Probability

Relationship between the Exponential Distribution and the Poisson “Counting” Distribution

The exponential distribution is a continuous (time) distribution that, in our setting, models the waiting time until the next event (e.g., the next customer arrival).

The Poisson distribution is a discrete (counting) distribution which governs the total number of events (e.g., arrivals) in the time interval (0, t ).

Fact: If inter-event times are independent exponential RVs with rate λ, then the number of events that occur in the interval (0, t) is governed by a Poisson distribution with rate λ. And vice versa.

Page 28: Lecture 10 – Introduction to Probability

Poisson Distribution

Xt = # of events (e.g., arrivals) in the interval (0, t) is governed by the following probability mass function:

Pr{Xt = n} = (λt)^n e^(−λt) / n!,   n = 0, 1, . . .

E[Xt] = λt   (arrival rate × length of interval)

Note: Pr{Xt = 0} = Pr{T > t} = e^(−λt)
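The pmf is straightforward to code. A sketch (the helper name `poisson_pmf` is my own) that also checks the Pr{Xt = 0} = Pr{T > t} identity:

```python
# Poisson pmf: Pr{X_t = n} = (rate*t)^n e^(-rate*t) / n!
import math

def poisson_pmf(n: int, rate: float, t: float) -> float:
    mu = rate * t
    return mu ** n * math.exp(-mu) / math.factorial(n)

lam, t = 3.0, 0.5
p0 = poisson_pmf(0, lam, t)          # probability of zero arrivals in (0, t)
print(round(p0, 3))                  # 0.223, matches P(T > 1/2) = e^(-3/2)
```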

Page 29: Lecture 10 – Introduction to Probability

Example (revisited)

Calls arrive at an emergency hotline switchboard once every 20 minutes on average. It is now 8:00 AM.

Use the Poisson distribution to answer the following questions.

(a)What is the probability that the first call arrives after 8:30 AM?

Solution: Xt ~ Poisson(3t); X½ ~ Poisson(3/2), where X½ = # of arrivals in the first ½ hour.

P(X½ = 0) = (3/2)^0 e^(−3/2) / 0! = e^(−3/2) = 0.223

Page 30: Lecture 10 – Introduction to Probability

(b) What is the probability that the first call arrives between 8:15 AM and 8:45 AM?

Solution: Let Y1 = # of arrivals in the first 15 minutes
Y2 = # of arrivals between 8:15 and 8:45 AM
We must determine Pr{Y1 = 0 and Y2 ≥ 1}.

However, Y1 ~ Poisson(3/4), Y2 ~ Poisson(3/2), and the events {Y1 = 0} and {Y2 ≥ 1} are independent.

P(Y1 = 0 and Y2 ≥ 1) = P(Y1 = 0) P(Y2 ≥ 1) = P(Y1 = 0) [1 − P(Y2 = 0)]

= e^(−3/4) [1 − e^(−3/2)] = e^(−3/4) − e^(−9/4) = 0.367

Page 31: Lecture 10 – Introduction to Probability

(c) Given that no calls have arrived by 8:30 AM, what’s the probability that one or more calls arrive by 9:00 AM?

Solution: We want Pr{Y2 ≥ 1 | Y1 = 0}, where
Y1 = # of arrivals between 8:00 and 8:30 AM
Y2 = # of arrivals between 8:30 and 9:00 AM
and Y1 ~ Poisson(3/2), Y2 ~ Poisson(3/2).

Y1 and Y2 are independent because they are defined on non-overlapping time intervals, so

P(Y2 ≥ 1 | Y1 = 0) = P(Y2 ≥ 1) = 1 − P(Y2 = 0) = 1 − e^(−3/2) = 0.777

Page 32: Lecture 10 – Introduction to Probability

Generalization: Let Ti ~ exp(λi), i = 1, …, n, and define T = min{ T1, …, Tn }.

Then T ~ exp(λ), where λ = λ1 + λ2 + · · · + λn.

Multiple Arrival Streams: If customers of type 1 arrive according to a Poisson process with rate λ1, and customers of type 2 arrive according to an independent Poisson process with rate λ2, then customers arrive, regardless of type, according to a Poisson process with rate λ = λ1 + λ2.

Restated: The minimum of several independent exponential rv’s is an exponential random variable with a rate that is the sum of the individual rates.
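A simulation check of the minimum-of-exponentials fact (the code and the rate choices are mine):

```python
# min(T1, T2) with T1 ~ exp(2), T2 ~ exp(3) should be exp(5),
# so its empirical mean should be near 1/(2 + 3) = 0.2.
import random

random.seed(1)
l1, l2 = 2.0, 3.0
n = 200_000
mins = [min(random.expovariate(l1), random.expovariate(l2)) for _ in range(n)]

mean = sum(mins) / n
print(round(mean, 3))                  # near 1/(l1 + l2) = 0.2
```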

Page 33: Lecture 10 – Introduction to Probability

What You Should Know About Probability

• How to identify events and the corresponding sample space.

• How to work with conditional probabilities.
• How to work with probability functions (e.g., normal, exponential, uniform, discrete).

• How to work with the Poisson distribution.

