Page 1: Chap 003

Chapter 3: Random Variables

McGraw-Hill/Irwin Copyright © 2009 by The McGraw-Hill Companies, Inc. All rights reserved.

COMPLETE BUSINESS STATISTICS

by AMIR D. ACZEL

& JAYAVEL SOUNDERPANDIAN

7th edition.

Prepared by Lloyd Jaisingh, Morehead State University

Page 2: Chap 003

• Using Statistics
• Expected Values of Discrete Random Variables
• Sum and Linear Composites of Random Variables
• Bernoulli Random Variable
• The Binomial Random Variable
• Continuous Random Variables
• Uniform Distribution

Random Variables  3-2

Page 3: Chap 003

After studying this chapter you should be able to:
• Distinguish between discrete and continuous random variables
• Explain how a random variable is characterized by its probability distribution
• Compute statistics about a random variable
• Compute statistics about a function of a random variable
• Compute statistics about the sum or a linear composite of random variables
• Identify which type of distribution a given random variable is most likely to follow
• Solve problems involving standard distributions manually, using formulas
• Solve business problems involving standard distributions, using spreadsheet templates

LEARNING OBJECTIVES  3-3

Page 4: Chap 003

Consider the different possible orderings of boy (B) and girl (G) in four sequential births. There are 2*2*2*2 = 2^4 = 16 possibilities, so the sample space is:

BBBB  BGBB  GBBB  GGBB
BBBG  BGBG  GBBG  GGBG
BBGB  BGGB  GBGB  GGGB
BBGG  BGGG  GBGG  GGGG

If girl and boy are each equally likely [P(G) = P(B) = 1/2], and the gender of each child is independent of that of the previous child, then the probability of each of these 16 possibilities is: (1/2)(1/2)(1/2)(1/2) = 1/16.

3-1 Using Statistics  3-4

Page 5: Chap 003

Now count the number of girls in each set of four sequential births:

BBBB (0)  BGBB (1)  GBBB (1)  GGBB (2)
BBBG (1)  BGBG (2)  GBBG (2)  GGBG (3)
BBGB (1)  BGGB (2)  GBGB (2)  GGGB (3)
BBGG (2)  BGGG (3)  GBGG (3)  GGGG (4)

Notice that:
• each possible outcome is assigned a single numeric value,
• all outcomes are assigned a numeric value, and
• the value assigned varies over the outcomes.

The count of the number of girls is a random variable:

A random variable, say X, is an uncertain quantity whose value depends on chance.

Random Variables 3-5
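To make the counting concrete, here is a minimal Python sketch (not part of the original slides) that enumerates the 16 equally likely birth orderings and tabulates the distribution of X, the number of girls; it assumes independent births with P(G) = P(B) = 1/2, as stated above.

from itertools import product
from fractions import Fraction
from collections import Counter

# Enumerate all 2^4 = 16 equally likely orderings of B and G.
outcomes = ["".join(seq) for seq in product("BG", repeat=4)]

# The random variable X assigns to each outcome its number of girls.
girl_counts = Counter(outcome.count("G") for outcome in outcomes)

# Each outcome has probability 1/16, so P(X = x) = (# outcomes with x girls) / 16.
for x in sorted(girl_counts):
    print(x, Fraction(girl_counts[x], 16))  # 1/16, 1/4, 3/8, 1/4, 1/16

Running it reproduces the probability distribution derived on the following slides, including P(X = 3) = 4/16.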

Page 6: Chap 003

Random Variables (Continued)

[Figure: the 16 outcomes of the sample space, grouped by number of girls, mapped by X to the points 0, 1, 2, 3, 4 on the real line.]

3-6

Page 7: Chap 003

Since the random variable X = 3 when any of the four outcomes BGGG, GBGG, GGBG, or GGGB occurs,

P(X = 3) = P(BGGG) + P(GBGG) + P(GGBG) + P(GGGB) = 4/16

The probability distribution of a random variable is a table that lists the possible values of the random variable and their associated probabilities.

x    P(x)
0    1/16
1    4/16
2    6/16
3    4/16
4    1/16
     16/16 = 1

Random Variables (Continued)

The graphical display for this probability distribution is shown on the next slide.

3-7

Page 8: Chap 003

Random Variables (Continued)

[Figure: Probability Distribution of the Number of Girls in Four Births — bar chart of P(X) against the number of girls X = 0, 1, 2, 3, 4, with bar heights 1/16, 4/16, 6/16, 4/16, 1/16.]

3-8

Page 9: Chap 003

Consider the experiment of tossing two six-sided dice. There are 36 possible outcomes. Let the random variable X represent the sum of the numbers on the two dice:

1,1  1,2  1,3  1,4  1,5  1,6
2,1  2,2  2,3  2,4  2,5  2,6
3,1  3,2  3,3  3,4  3,5  3,6
4,1  4,2  4,3  4,4  4,5  4,6
5,1  5,2  5,3  5,4  5,5  5,6
6,1  6,2  6,3  6,4  6,5  6,6

(Outcomes along each anti-diagonal share the same sum; the sums range from 2 at the upper-left corner to 12 at the lower-right corner.)

x     P(x)*
2     1/36
3     2/36
4     3/36
5     4/36
6     5/36
7     6/36
8     5/36
9     4/36
10    3/36
11    2/36
12    1/36
      1

[Figure: Probability Distribution of Sum of Two Dice — bar chart of p(x) against x = 2, ..., 12, peaking at x = 7.]

* Note that P(x) = (6 - |7 - x|)/36.

Example 3-1  3-9
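As a quick check on Example 3-1, the following short Python sketch (ours, not the book's) builds the same distribution by brute force and verifies the footnote formula P(x) = (6 - |7 - x|)/36.

from itertools import product
from fractions import Fraction
from collections import Counter

# All 36 equally likely outcomes of rolling two fair six-sided dice.
sums = Counter(i + j for i, j in product(range(1, 7), repeat=2))

for x in range(2, 13):
    p = Fraction(sums[x], 36)
    assert p == Fraction(6 - abs(7 - x), 36)  # matches the footnote formula
    print(x, p)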

Page 10: Chap 003

Probability of at least 1 switch: P(X ≥ 1) = 1 - P(0) = 1 - 0.1 = 0.9

Probability Distribution of the Number of Switches

x    P(x)
0    0.1
1    0.2
2    0.3
3    0.2
4    0.1
5    0.1
     1.0

Probability of more than 2 switches: P(X > 2) = P(3) + P(4) + P(5) = 0.2 + 0.1 + 0.1 = 0.4

[Figure: The Probability Distribution of the Number of Switches — bar chart of P(x) against x = 0, 1, 2, 3, 4, 5.]

Example 3-2  3-10
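A minimal Python sketch (not from the slides) that reproduces these two event probabilities directly from the table of Example 3-2:

# Probability distribution of the number of switches (Example 3-2).
P = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.2, 4: 0.1, 5: 0.1}

p_at_least_1 = 1 - P[0]                        # P(X >= 1) = 0.9
p_more_than_2 = sum(P[x] for x in P if x > 2)  # P(X > 2)  = 0.4
print(p_at_least_1, p_more_than_2)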

Page 11: Chap 003

A discrete random variable:
• has a countable number of possible values
• has discrete jumps (or gaps) between successive values
• has measurable probability associated with individual values
• counts

A continuous random variable:
• has an uncountably infinite number of possible values
• moves continuously from value to value
• has no measurable probability associated with each value
• measures (e.g.: height, weight, speed, value, duration, length)

Discrete and Continuous Random Variables

3-11

Page 12: Chap 003

The probability distribution of a discrete random variable X must satisfy the following two conditions:

1. P(x) ≥ 0 for all values of x.
2. Σ_x P(x) = 1, where the sum is taken over all values of x.

Corollary: 0 ≤ P(X) ≤ 1.

Rules of Discrete Probability Distributions

3-12

Page 13: Chap 003

The cumulative distribution function, F(x), of a discrete random variable X is:

F(x) = P(X ≤ x) = Σ_{i ≤ x} P(i)

x    P(x)   F(x)
0    0.1    0.1
1    0.2    0.3
2    0.3    0.6
3    0.2    0.8
4    0.1    0.9
5    0.1    1.0
     1.00

[Figure: Cumulative Probability Distribution of the Number of Switches — step plot of F(x) rising from 0.1 at x = 0 to 1.0 at x = 5.]

Cumulative Distribution Function  3-13
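Since F(x) is just a running total of P(x), it can be computed with a cumulative sum. A small Python sketch (ours), assuming the switches distribution above:

from itertools import accumulate

xs = [0, 1, 2, 3, 4, 5]
ps = [0.1, 0.2, 0.3, 0.2, 0.1, 0.1]  # P(x) for the number of switches

F = list(accumulate(ps))  # 0.1, 0.3, 0.6, 0.8, 0.9, 1.0 (up to floating-point rounding)
for x, p, f in zip(xs, ps, F):
    print(x, p, round(f, 1))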

Page 14: Chap 003

x    P(x)   F(x)
0    0.1    0.1
1    0.2    0.3
2    0.3    0.6
3    0.2    0.8
4    0.1    0.9
5    0.1    1.0
     1

The probability that at most three switches will occur:

Cumulative Distribution Function

Note: P(X ≤ 3) = F(3) = 0.8 = P(0) + P(1) + P(2) + P(3)

3-14

Page 15: Chap 003

x    P(x)   F(x)
0    0.1    0.1
1    0.2    0.3
2    0.3    0.6
3    0.2    0.8
4    0.1    0.9
5    0.1    1.0
     1

The probability that more than one switch will occur:

Using Cumulative Probability Distributions

Note: P(X > 1) = P(X ≥ 2) = 1 - P(X ≤ 1) = 1 - F(1) = 1 - 0.3 = 0.7

3-15

Page 16: Chap 003

x    P(x)   F(x)
0    0.1    0.1
1    0.2    0.3
2    0.3    0.6
3    0.2    0.8
4    0.1    0.9
5    0.1    1.0
     1

The probability that anywhere from one to three switches will occur:

Using Cumulative Probability Distributions

Note: P(1 ≤ X ≤ 3) = P(X ≤ 3) - P(X ≤ 0) = F(3) - F(0) = 0.8 - 0.1 = 0.7

3-16

Page 17: Chap 003

The mean of a probability distribution is a measure of its centrality or location, as is the mean or average of a frequency distribution. It is a weighted average, with the values of the random variable weighted by their probabilities.

The mean is also known as the expected value (or expectation) of a random variable, because it is the value that is expected to occur, on average.

The expected value of a discrete random variable X is equal to the sum of each value of the random variable multiplied by its probability.

E(X) = Σ_x x·P(x)   (the sum is over all values of x)

x    P(x)   xP(x)
0    0.1    0.0
1    0.2    0.2
2    0.3    0.6
3    0.2    0.6
4    0.1    0.4
5    0.1    0.5
     1.0    2.3 = E(X) = μ


3-2 Expected Values of Discrete Random Variables

3-17
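The same weighted average is a one-line computation; a minimal Python sketch (ours) using the switches distribution:

# E(X) = sum of x * P(x) over all x.
P = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.2, 4: 0.1, 5: 0.1}
mean = sum(x * p for x, p in P.items())
print(round(mean, 1))  # 2.3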

Page 18: Chap 003

Suppose you are playing a coin toss game in which you are paid $1 if the coin turns up heads and you lose $1 when the coin turns up tails. The expected value of this game is E(X) = 0. A game of chance with an expected payoff of 0 is called a fair game.

x     P(x)   xP(x)
-1    0.5    -0.50
 1    0.5     0.50
      1.0     0.00 = E(X)

A Fair Game  3-18

Page 19: Chap 003

Number of items, x   P(x)   xP(x)   h(x)    h(x)P(x)
5000                 0.2    1000    2000     400
6000                 0.3    1800    4000    1200
7000                 0.2    1400    6000    1200
8000                 0.2    1600    8000    1600
9000                 0.1     900   10000    1000
                     1.0    6700            5400

Example 3-3: Monthly sales of a certain product are believed to follow the given probability distribution. Suppose the company has a fixed monthly production cost of $8000 and that each item brings in $2. Find the expected monthly profit, h(X), from product sales.

E[h(X)] = Σ_x h(x)·P(x) = 5400

The expected value of a function of a discrete random variable X is:

E[h(X)] = Σ_x h(x)·P(x)

The expected value of a linear function of a random variable is: E(aX+b)=aE(X)+b

In this case: E(2X - 8000) = 2E(X) - 8000 = (2)(6700) - 8000 = 5400

Expected Value of a Function of a Discrete Random Variable

Note: h (X) = 2X – 8000 where X = # of items sold

3-19
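A short Python sketch of Example 3-3 (ours, not the text's spreadsheet template), showing that summing h(x)P(x) and applying E(aX + b) = aE(X) + b give the same answer:

# Monthly sales distribution and profit function h(X) = 2X - 8000.
P = {5000: 0.2, 6000: 0.3, 7000: 0.2, 8000: 0.2, 9000: 0.1}

def h(x):
    return 2 * x - 8000

E_X = sum(x * p for x, p in P.items())      # 6700
E_hX = sum(h(x) * p for x, p in P.items())  # 5400
print(round(E_hX), round(2 * E_X - 8000))   # both 5400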

Page 20: Chap 003

The variance of a random variable is the expected squared deviation from the mean:

σ² = V(X) = E[(X - μ)²] = Σ_x (x - μ)²·P(x)
          = E(X²) - [E(X)]² = Σ_x x²·P(x) - [Σ_x x·P(x)]²

The standard deviation of a random variable is the square root of its variance:  σ = SD(X) = √V(X)

Variance and Standard Deviation of a Random Variable

3-20

Page 21: Chap 003

Number of Switches, x   P(x)   xP(x)   (x-μ)   (x-μ)²   P(x)(x-μ)²   x²P(x)
0                       0.1    0.0     -2.3    5.29     0.529        0.0
1                       0.2    0.2     -1.3    1.69     0.338        0.2
2                       0.3    0.6     -0.3    0.09     0.027        1.2
3                       0.2    0.6      0.7    0.49     0.098        1.8
4                       0.1    0.4      1.7    2.89     0.289        1.6
5                       0.1    0.5      2.7    7.29     0.729        2.5
                               2.3                      2.010        7.3

σ² = V(X) = E[(X - μ)²] = Σ_x (x - μ)²·P(x) = 2.01

σ² = V(X) = E(X²) - [E(X)]² = Σ_x x²·P(x) - [Σ_x x·P(x)]² = 7.3 - 2.3² = 7.3 - 5.29 = 2.01

Table 3-8

Variance and Standard Deviation of a Random Variable – using Example 3-2

Recall: μ = 2.3.

3-21
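Both routes to the variance are easy to check numerically; a minimal Python sketch (ours) using the switches distribution of Example 3-2:

P = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.2, 4: 0.1, 5: 0.1}

mu = sum(x * p for x, p in P.items())                       # 2.3
var_def = sum((x - mu) ** 2 * p for x, p in P.items())      # E[(X - mu)^2]
var_short = sum(x * x * p for x, p in P.items()) - mu ** 2  # E(X^2) - mu^2
sd = var_def ** 0.5
print(round(var_def, 2), round(var_short, 2), round(sd, 3))  # 2.01 2.01 1.418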

Page 22: Chap 003

The variance of a linear function of a random variable is:

V(aX + b) = a²·V(X) = a²σ²

Number of items, x   P(x)   xP(x)   x²P(x)
5000                 0.2    1000     5,000,000
6000                 0.3    1800    10,800,000
7000                 0.2    1400     9,800,000
8000                 0.2    1600    12,800,000
9000                 0.1     900     8,100,000
                     1.0    6700    46,500,000

Example 3-3 (continued):

σ² = V(X) = E(X²) - [E(X)]² = Σ_x x²·P(x) - [Σ_x x·P(x)]²
          = 46,500,000 - 6700² = 46,500,000 - 44,890,000 = 1,610,000

σ = SD(X) = √V(X) = √1,610,000 = 1268.86

V(2X - 8000) = 2²·V(X) = (4)(1,610,000) = 6,440,000

SD(2X - 8000) = 2·SD(X) = (2)(1268.86) = 2537.72

Variance of a Linear Function of a Random Variable

3-22
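The same numbers fall out of a few lines of Python (a sketch under the Example 3-3 distribution, not the book's template):

P = {5000: 0.2, 6000: 0.3, 7000: 0.2, 8000: 0.2, 9000: 0.1}

mu = sum(x * p for x, p in P.items())                   # 6700
var_X = sum(x * x * p for x, p in P.items()) - mu ** 2  # 1,610,000
var_profit = 2 ** 2 * var_X                             # V(2X - 8000) = 6,440,000
sd_profit = var_profit ** 0.5                           # about 2537.7 (2 x 1268.86)
print(round(var_X), round(var_profit), round(sd_profit, 2))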

Page 23: Chap 003

The mean or expected value of the sum of random variables is the sum of their means or expected values:

μ(X+Y) = E(X + Y) = E(X) + E(Y)

For example: E(X) = $350 and E(Y) = $200

E(X+Y) = $350 + $200 = $550

The variance of the sum of mutually independent random variables is the sum of their variances:

σ²(X+Y) = V(X + Y) = V(X) + V(Y)

provided X and Y are independent.

For example: if V(X) = 84 and V(Y) = 60, then V(X+Y) = 144

3-3 Sum and Linear Composites of Random Variables

3-23

Page 24: Chap 003

The variance of the sum of k mutually independent random variables is the sum of their variances:

3-3 Sum and Linear Composites of Random Variables (Continued)

NOTE:

E(X1 + X2 + ... + Xk) = E(X1) + E(X2) + ... + E(Xk)

E(a1X1 + a2X2 + ... + akXk) = a1E(X1) + a2E(X2) + ... + akE(Xk)

and

V(X1 + X2 + ... + Xk) = V(X1) + V(X2) + ... + V(Xk)

V(a1X1 + a2X2 + ... + akXk) = a1²V(X1) + a2²V(X2) + ... + ak²V(Xk)

3-24

Page 25: Chap 003

Example 3-4: A portfolio includes stocks in three industries: financial, energy, and consumer goods. Assume that the three sectors are independent of each other. The expected annual returns and standard deviations are as follows: financial – 1,000 and 700; energy – 1,200 and 1,100; consumer goods – 600 and 300. What are the mean and standard deviation of the annual return on this portfolio?

The mean of the sum of the three random variables is: 1,000 + 1,200 + 600 = $2,800.

The variance of the sum of the three random variables for the three sectors, assuming independence, is: 700² + 1,100² + 300² = 1,790,000. Thus the standard deviation is √1,790,000 = $1,337.9.

3-3 Sum and Linear Composites of Random Variables (Continued)

3-25
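A minimal Python sketch of Example 3-4 (ours; sector names and figures taken from the example above, independence assumed as stated):

from math import sqrt

# (expected annual return, standard deviation) for each independent sector.
sectors = {"financial": (1000, 700), "energy": (1200, 1100), "consumer goods": (600, 300)}

mean_total = sum(m for m, s in sectors.values())      # 1000 + 1200 + 600 = 2800
var_total = sum(s ** 2 for m, s in sectors.values())  # 700^2 + 1100^2 + 300^2 = 1,790,000
print(mean_total, round(sqrt(var_total), 1))          # 2800 1337.9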

Page 26: Chap 003

Chebyshev’s Theorem applies to probability distributions just as it applies to frequency distributions.

For a random variable X with mean μ and standard deviation σ, and for any number k > 1:

P(|X - μ| < kσ) ≥ 1 - 1/k²

At least                            Lie within
1 - 1/2² = 1 - 1/4 = 3/4 = 75%      2 standard deviations of the mean
1 - 1/3² = 1 - 1/9 = 8/9 ≈ 89%      3 standard deviations of the mean
1 - 1/4² = 1 - 1/16 = 15/16 ≈ 94%   4 standard deviations of the mean

Chebyshev’s Theorem Applied to Probability Distributions

3-26
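The bound 1 - 1/k² is easy to tabulate; a short Python sketch (ours) reproducing the three rows above:

from fractions import Fraction

# Chebyshev: at least 1 - 1/k^2 of the probability lies within k standard deviations of the mean.
for k in (2, 3, 4):
    bound = 1 - Fraction(1, k ** 2)
    print(k, bound, f"{float(bound):.0%}")  # 3/4 (75%), 8/9 (89%), 15/16 (94%)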

Page 27: Chap 003

Using the Template to Calculate statistics of h(x)

3-27

Page 28: Chap 003

Using the Template to Calculate Mean and Variance for the Sum of Independent Random Variables

Output for Example 3-4

3-28

Page 29: Chap 003

• If an experiment consists of a single trial and the outcome of the trial can only be either a success* or a failure, then the trial is called a Bernoulli trial.

• The number of successes, X, in one Bernoulli trial, which can be 1 or 0, is a Bernoulli random variable.

• Note: If p is the probability of success in a Bernoulli experiment, then P(1) = p, P(0) = 1 – p, E(X) = p and V(X) = p(1 – p).

* The terms success and failure are simply statistical terms, and do not have positive or negative implications. In a production setting, finding a defective product may be termed a “success,” although it is not a positive result.

3-4 Bernoulli Random Variable  3-29

Page 30: Chap 003

Consider a Bernoulli Process in which we have a sequence of n identical trials satisfying the following conditions:

1. Each trial has two possible outcomes, called success* and failure. The two outcomes are mutually exclusive and exhaustive.

2. The probability of success, denoted by p, remains constant from trial to trial. The probability of failure is denoted by q, where q = 1-p.

3. The n trials are independent. That is, the outcome of any trial does not affect the outcomes of the other trials.

A random variable, X, that counts the number of successes in n Bernoulli trials, where p is the probability of success* in any given trial, is said to follow the binomial probability distribution with parameters n (number of trials) and p (probability of success). We call X the binomial random variable.

* The terms success and failure are simply statistical terms, and do not have positive or negative implications. In a production setting, finding a defective product may be termed a “success,” although it is not a positive result.

3-5 The Binomial Random Variable 3-30

Page 31: Chap 003

Suppose we toss a single fair and balanced coin five times in succession, and let X represent the number of heads.

There are 2^5 = 32 possible sequences of H and T (S and F) in the sample space for this experiment. Of these, there are 10 in which there are exactly 2 heads (X=2):

HHTTT HTHTT HTTHT HTTTH THHTT THTHT THTTH TTHHT TTHTH TTTHH

The probability of each of these 10 outcomes is p²q³ = (1/2)²(1/2)³ = 1/32, so the probability of 2 heads in 5 tosses of a fair and balanced coin is:

P(X = 2) = 10 * (1/32) = (10/32) = 0.3125

10 × (1/32): 10 is the number of outcomes with 2 heads; 1/32 is the probability of each outcome with 2 heads.

Binomial Probabilities (Introduction)  3-31

Page 32: Chap 003

P(X=2) = 10 × (1/32) = 10/32 = 0.3125, where 10 is the number of outcomes with 2 heads and 1/32 is the probability of each outcome with 2 heads.

Notice that this probability has two parts:

In general:

1. The probability of a given sequence of x successes out of n trials, with probability of success p and probability of failure q, is equal to:

   p^x · q^(n-x)

2. The number of different sequences of n trials that result in exactly x successes is equal to the number of choices of x elements out of a total of n elements. This number is denoted:

   nCx = n! / (x!(n - x)!)

Binomial Probabilities (continued)  3-32
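Putting the two parts together is a one-liner with the standard library's math.comb; a minimal Python sketch for the five-toss example (our code, not the text's):

from math import comb

# P(X = 2) for n = 5 tosses of a fair coin: C(5, 2) * p^2 * q^3.
n, x, p = 5, 2, 0.5
prob = comb(n, x) * p ** x * (1 - p) ** (n - x)
print(comb(n, x), prob)  # 10 outcomes, probability 0.3125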

Page 33: Chap 003

Number of successes, x    Probability, P(x)
0                         [n!/(0!(n-0)!)] · p^0 · q^(n-0)
1                         [n!/(1!(n-1)!)] · p^1 · q^(n-1)
2                         [n!/(2!(n-2)!)] · p^2 · q^(n-2)
3                         [n!/(3!(n-3)!)] · p^3 · q^(n-3)
...                       ...
n                         [n!/(n!(n-n)!)] · p^n · q^(n-n)
                          1.00

The binomial probability distribution:

where: p is the probability of success in a single trial, q = 1 - p, n is the number of trials, and x is the number of successes.

P(x) = nCx · p^x · q^(n-x) = [n! / (x!(n - x)!)] · p^x · q^(n-x)

The Binomial Probability Distribution  3-33

Page 34: Chap 003

n=5

p

x 0.01 0.05 0.10 0.20 0.30 0.40 0.50 0.60 0.70 0.80 0.90 0.95 0.99

0 .951 .774 .590 .328 .168 .078 .031 .010 .002 .000 .000 .000 .000

1 .999 .977 .919 .737 .528 .337 .187 .087 .031 .007 .000 .000 .000

2 1.000 .999 .991 .942 .837 .683 .500 .317 .163 .058 .009 .001 .000

3 1.000 1.000 1.000 .993 .969 .913 .813 .663 .472 .263 .081 .023 .001

4 1.000 1.000 1.000 1.000 .998 .990 .969 .922 .832 .672 .410 .226 .049

h F(h) P(h)

0 0.031 0.031

1 0.187 0.156

2 0.500 0.313

3 0.813 0.313

4 0.969 0.156

5 1.000 0.031

  1.000

Cumulative Binomial Probability Distribution and Binomial Probability Distribution of H, the Number of Heads Appearing in Five Tosses of a Fair Coin

F(x) = P(X ≤ x) = Σ_{i ≤ x} P(i)

P(x) = F(x) - F(x - 1)

For example:

P(3) = F(3) - F(2) = 0.813 - 0.500 = 0.313

Deriving Individual Probabilities from Cumulative Probabilities

The Cumulative Binomial Probability Table (Table 1, Appendix C)

3-34

Page 35: Chap 003

60% of Brooke shares are owned by LeBow. A random sample of 15 shares is chosen. What is the probability that at most three of them will be found to be owned by LeBow?

Cumulative binomial probabilities for n = 15 (excerpt):

x    p = .50   p = .60   p = .70
0    .000      .000      .000
1    .000      .000      .000
2    .004      .000      .000
3    .018      .002      .000
4    .059      .009      .001
...  ...       ...       ...

F(3) = P(X ≤ 3) = 0.002

Calculating Binomial Probabilities - Example

3-35
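Instead of the printed table, the cumulative probability can be summed directly; a short Python sketch for this example (assuming, as stated, n = 15 and p = 0.6):

from math import comb

# P(X <= 3): sum the binomial probabilities for x = 0, 1, 2, 3.
n, p = 15, 0.6
F3 = sum(comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(4))
print(round(F3, 3))  # 0.002, matching the table entry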

Page 36: Chap 003

Mean of a binomial distribution:
μ = E(X) = np

Variance of a binomial distribution:
σ² = V(X) = npq

Standard deviation of a binomial distribution:
σ = SD(X) = √(npq)

For example, if H counts the number of heads in five tosses of a fair coin:

E(H) = (5)(.5) = 2.5
V(H) = (5)(.5)(.5) = 1.25
SD(H) = √1.25 = 1.118

Mean, Variance, and Standard Deviation of the Binomial Distribution

3-36
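A tiny Python sketch of these formulas (the helper name binomial_summary is ours, purely illustrative):

from math import sqrt

def binomial_summary(n, p):
    # Mean, variance, and standard deviation of a binomial(n, p) random variable.
    q = 1 - p
    return n * p, n * p * q, sqrt(n * p * q)

print(binomial_summary(5, 0.5))  # (2.5, 1.25, 1.118...) -- heads in five fair-coin tosses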

Page 37: Chap 003

Calculating Binomial Probabilities using the Template

3-37

Page 38: Chap 003

Calculating Binomial Probabilities using Minitab

3-38

Page 39: Chap 003

[Figure: a 3 × 3 grid of binomial probability histograms, P(x) against x, for n = 4, 10, 20 (rows) and p = 0.1, 0.3, 0.5 (columns).]

Binomial distributions become more symmetric as n increases and as p approaches 0.5.

Shape of the Binomial Distribution  3-39

Page 40: Chap 003

• A discrete random variable:
  counts occurrences
  has a countable number of possible values
  has discrete jumps between successive values
  has measurable probability associated with individual values
  probability is height

• A continuous random variable:
  measures (e.g.: height, weight, speed, value, duration, length)
  has an uncountably infinite number of possible values
  moves continuously from value to value
  has no measurable probability associated with individual values
  probability is area

For example: Binomial, n=3, p=.5

x    P(x)
0    0.125
1    0.375
2    0.375
3    0.125
     1.000

[Figure: bar chart of the Binomial (n=3, p=.5) probability distribution.]

For example: in this case, the shaded area represents the probability that the task takes between 2 and 3 minutes.

[Figure: Minutes to Complete Task — density of P(x) over 1 to 6 minutes, with the area between 2 and 3 minutes shaded.]

Discrete and Continuous Random Variables - Revisited

3-40

Page 41: Chap 003

[Figure: histograms of Minutes to Complete Task using half-minute, quarter-minute, and eighth-minute intervals, over 0 to 7 minutes.]

The time it takes to complete a task can be subdivided into:

Half-Minute Intervals Quarter-Minute Intervals Eighth-Minute Intervals

Or even infinitesimally small intervals: when a continuous random variable has been subdivided into infinitesimally small intervals, a measurable probability can only be associated with an interval of values, and the probability is given by the area beneath the probability density function corresponding to that interval. In this example, the shaded area represents P(2 ≤ X ≤ 3).

Minutes to Complete Task: Probability Density Function


From a Discrete to a Continuous Distribution

3-41

Page 42: Chap 003

A continuous random variable is a random variable that can take on any value in an interval of numbers.

The probabilities associated with a continuous random variable X are determined by the probability density function of the random variable. The function, denoted f(x), has the following properties:

1. f(x) ≥ 0 for all x.
2. The probability that X will be between two numbers a and b is equal to the area under f(x) between a and b.
3. The total area under the curve of f(x) is equal to 1.00.

The cumulative distribution function of a continuous random variable:

F(x) = P(X ≤ x) = area under f(x) between the smallest possible value of X (often -∞) and the point x.

3-6 Continuous Random Variables  3-42

Page 43: Chap 003

[Figure: the cumulative distribution function F(x), rising from 0 to 1, and the density f(x), with points a and b marked; F(a) and F(b) are the areas under f(x) up to a and b, and the shaded area between a and b is P(a ≤ X ≤ b).]

P(a ≤ X ≤ b) = area under f(x) between a and b = F(b) - F(a)

Probability Density Function and Cumulative Distribution Function

3-43

Page 44: Chap 003

3-7 Uniform Distribution

The uniform [a,b] density:

f(x) = 1/(b - a)  for a ≤ x ≤ b
f(x) = 0          otherwise

E(X) = (a + b)/2;  V(X) = (b - a)²/12

[Figure: Uniform [a, b] density — f(x) is constant at 1/(b - a) between a and b, with a subinterval from a1 to b1 shaded.]

The entire area under f(x) = 1/(b - a) × (b - a) = 1.00

The area under f(x) from a1 to b1 = P(a1 ≤ X ≤ b1) = (b1 - a1)/(b - a)

Uniform [a, b] Distribution

3-44
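Because uniform probabilities are just proportional interval lengths, they are straightforward to code; a minimal Python sketch (the helper names uniform_prob and uniform_mean_var are ours):

def uniform_prob(a, b, a1, b1):
    # P(a1 <= X <= b1) for X ~ Uniform[a, b], assuming a <= a1 <= b1 <= b.
    return (b1 - a1) / (b - a)

def uniform_mean_var(a, b):
    # E(X) = (a + b)/2 and V(X) = (b - a)^2 / 12.
    return (a + b) / 2, (b - a) ** 2 / 12

print(uniform_prob(0, 5, 1, 3))  # 0.4 = 2/5, as in the example on the next slide
print(uniform_mean_var(0, 5))    # (2.5, 2.0833...)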

Page 45: Chap 003

The uniform [0,5] density:

f(x) = 1/5  for 0 ≤ x ≤ 5
f(x) = 0    otherwise

E(X) = 2.5


Uniform [0,5] Distribution

The entire area under f(x) = 1/5 * 5 = 1.00

The area under f(x) from 1 to 3 = P(1 ≤ X ≤ 3) = (1/5)(2) = 2/5

Uniform Distribution (continued)  3-45

Page 46: Chap 003

Calculating Uniform Distribution Probabilities using the Template

3-46

