
Chapter III

Date post: 23-Nov-2014
Upload: ahmad-assadi
Page 1: Chapter III

1

Random Variables

Page 2: Chapter III

2

Today

• Sequential Experiments
• Binomial, Multinomial and Geometric Probability Laws
• Random Variables
• Expected Value

Page 3: Chapter III

3

• Some random experiments can be viewed as a sequence of subexperiments. Note that the subexperiments may or may not be independent.

• The outcome of the sequence of subexperiments A1, A2, …, An is the tuple s = (s1, s2, …, sn), where sk is the outcome of subexperiment Ak. The sample space of the sequential experiment is the Cartesian product of the individual sample spaces, S1 × S2 × … × Sn.

• If the subexperiments Ai are independent, then the probability of a sequence of outcomes factors into the product of the individual probabilities:

P[(s1, s2, …, sn)] = P[s1] P[s2] … P[sn]

Sequential Experiments
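The product rule for independent subexperiments can be checked by enumerating a small product sample space. A minimal Python sketch (the two subexperiments and their outcome probabilities are made-up toy values):

```python
from itertools import product

# Two independent subexperiments with made-up outcome probabilities.
P1 = {"a": 0.3, "b": 0.7}
P2 = {"x": 0.6, "y": 0.4}

# Sample space of the sequential experiment: Cartesian product S1 x S2,
# with P[(s1, s2)] = P1[s1] * P2[s2] under independence.
joint = {(s1, s2): P1[s1] * P2[s2] for s1, s2 in product(P1, P2)}

# The joint probabilities still sum to 1.
assert abs(sum(joint.values()) - 1) < 1e-12
assert abs(joint[("a", "x")] - 0.18) < 1e-12  # 0.3 * 0.6
```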

Page 4: Chapter III

4

Bernoulli Trial

• The outcome of a Bernoulli trial is said to be a “success” if event A occurs and is called a “failure” otherwise.
• Example: let A = head be the successful outcome of a coin toss, and consider 3 independent trials with p = P[A].

P[{HHH}] = P[{H}] P[{H}] P[{H}] = p^3
P[{HHT}] = P[{H}] P[{H}] P[{T}] = p^2(1-p)
…
P[{TTH}] = P[{T}] P[{T}] P[{H}] = (1-p)^2 p
…
P[{TTT}] = P[{T}] P[{T}] P[{T}] = (1-p)^3

P[k=0] = P[{TTT}] = (1-p)^3, 0 successes in 3 trials
P[k=1] = P[{TTH, THT, HTT}] = 3p(1-p)^2, 1 success in 3 trials
P[k=2] = P[{HHT, HTH, THH}] = 3p^2(1-p), 2 successes in 3 trials
P[k=3] = P[{HHH}] = p^3, 3 successes in 3 trials
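The 3-toss probabilities above can be verified by brute-force enumeration. A minimal Python sketch (p = 0.5 is an assumed value; the identities hold for any p):

```python
from itertools import product
from math import comb

p = 0.5  # assumed value of P[A]; any p in (0, 1) works

# Enumerate all 2^3 outcomes of three independent tosses and group
# their probabilities by the number of heads k.
prob_k = {k: 0.0 for k in range(4)}
for seq in product("HT", repeat=3):
    prob = 1.0
    for s in seq:
        prob *= p if s == "H" else (1 - p)
    prob_k[seq.count("H")] += prob

# Matches P[k=0]=(1-p)^3, P[k=1]=3p(1-p)^2, P[k=2]=3p^2(1-p), P[k=3]=p^3
for k in range(4):
    assert abs(prob_k[k] - comb(3, k) * p**k * (1 - p)**(3 - k)) < 1e-12
```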

Page 5: Chapter III

5

Binomial Probability Law

Let k be the number of successes in n independent Bernoulli trials; then the probability of k successes is given by:

pn(k) = C(n, k) p^k (1-p)^(n-k), k = 0, 1, …, n

where pn(k) is the probability of k successes in n Bernoulli trials and

C(n, k) = n! / (k!(n-k)!)

is called the binomial coefficient.

Page 6: Chapter III

6

Binomial Probability Law

Example: Let k be the number of active speakers in a group of 8 noninteracting (i.e. independent) speakers. Suppose a speaker is active with probability 1/3. Find the probability that the number of active speakers is greater than 6.
Solution: Let Ai be the event “ith speaker is active”. The number of active speakers is the number of successes (7 or 8) in 8 Bernoulli trials with p = 1/3, so
P[k > 6] = p8(7) + p8(8) = C(8,7)(1/3)^7(2/3) + (1/3)^8 = 17/6561 ≈ 0.0026.
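The arithmetic in this solution can be checked with a short Python sketch (binom_pmf is a hypothetical helper, not part of the text):

```python
from math import comb

n, p = 8, 1/3   # 8 independent speakers, each active with probability 1/3

def binom_pmf(k):
    # hypothetical helper: probability of exactly k active speakers
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P[X > 6] = p8(7) + p8(8)
prob = binom_pmf(7) + binom_pmf(8)
assert abs(prob - 17/6561) < 1e-12   # ≈ 0.0026
```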

Page 7: Chapter III

7

Multinomial Probability Law

Let n be the number of independent repetitions of an experiment, and let B1, B2, …, BM be a partition of the sample space with P[Bj] = pj. Let kj be the number of times event Bj occurs in the n repetitions, so that k1 + k2 + … + kM = n. The probability of the vector (k1, k2, …, kM) satisfies the multinomial probability law:

P[(k1, k2, …, kM)] = n! / (k1! k2! … kM!) · p1^k1 p2^k2 … pM^kM

The binomial probability law is the M = 2 case of the multinomial.

Page 8: Chapter III

8

Multinomial Probability Law

Example: A dart is thrown 9 times at a target with 3 areas. Each throw has probability 0.2, 0.3, and 0.5 of landing in areas 1, 2, 3 respectively. Find the probability that the dart lands exactly 3 times in each area.
Solution: P[(3, 3, 3)] = 9! / (3! 3! 3!) · (0.2)^3 (0.3)^3 (0.5)^3 = 1680 · 0.000027 ≈ 0.04536.
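The multinomial computation can be sketched in Python with the standard library only:

```python
from math import factorial

probs = [0.2, 0.3, 0.5]   # landing probabilities for areas 1, 2, 3
counts = [3, 3, 3]        # exactly 3 landings in each area (n = 9 throws)

# Multinomial coefficient 9!/(3! 3! 3!) = 1680
n = sum(counts)
coef = factorial(n)
for k in counts:
    coef //= factorial(k)

# Multiply in p1^k1 * p2^k2 * p3^k3
prob = float(coef)
for pj, kj in zip(probs, counts):
    prob *= pj**kj

print(round(prob, 5))  # ≈ 0.04536
```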

Page 9: Chapter III

9

Geometric Probability Law

Let m be the number of trials in a sequential experiment where independent Bernoulli trials are repeated until the first “success”. The probability p(m) that m trials are needed is:

p(m) = P[A'1 A'2 … A'(m-1) Am] = (1-p)^(m-1) p, m = 1, 2, …

where A'i is the event “failure” at trial i.

The probabilities sum to 1: Σ_{m=1..∞} (1-p)^(m-1) p = p · 1/(1-(1-p)) = 1.

Page 10: Chapter III

10

Geometric Probability Law

The probability that more than K trials are needed before a success occurs is:

P[m > K] = Σ_{m=K+1..∞} (1-p)^(m-1) p = (1-p)^K
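The tail identity P[m > K] = (1-p)^K can be checked numerically. A Python sketch with assumed values p = 0.25 and K = 5:

```python
p = 0.25   # assumed success probability for this sketch
K = 5

# P[m > K] = sum over m = K+1, K+2, ... of (1-p)^(m-1) * p,
# truncated far enough out that the remainder is negligible.
tail = sum((1 - p)**(m - 1) * p for m in range(K + 1, 2000))

assert abs(tail - (1 - p)**K) < 1e-12
```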

Page 11: Chapter III

11

Geometric Probability Law

Example: Computer A sends a message to computer B over an unreliable radio link. The message is encoded so that B can detect errors. If B detects an error, it requests A to retransmit the message. What is the probability that the message needs to be transmitted more than twice, if the probability of a message transmission error is q = 0.1?

Solution:

P[m > 2] = q^2 = (0.1)^2 = 10^-2
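This follows from the geometric law: a successful transmission has probability p = 1 - q, and more than 2 transmissions are needed exactly when the first two fail. A Python sketch:

```python
q = 0.1        # probability a single transmission is received in error
p = 1 - q      # probability a transmission succeeds

# The number of transmissions m is geometric with parameter p, so
# P[m > 2] = 1 - (P[m=1] + P[m=2]) = 1 - (p + q*p) = q^2
prob = 1 - (p + q * p)

assert abs(prob - q**2) < 1e-12   # = 0.01
```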

Page 12: Chapter III

12

Random Variables

• Random variables are used to model phenomena in which the experimental outcomes are numbers instead of labels.

• For example, we are interested in knowing the number of heads rather than the specific sequence of heads and tails that may occur in an experiment.

• X denotes the random number that we observe; it is called a random variable.

Page 13: Chapter III

13

Random Variables

Definition:

A random variable X is a function that assigns a real number X(ζ) to each outcome ζ in the sample space (S) of a random experiment.

A function is a rule for assigning a numerical value to each element of a set.

Page 14: Chapter III

14

Random Variables

Example: A coin toss can be characterized by a random variable X with SX = {0, 1}, where X(head) = 1 and X(tail) = 0.

Page 15: Chapter III

15

Random Variables

Example: The experiment consists of tossing a coin twice. X denotes the number of heads observed

Outcome ζ    Value of X(ζ)
HH           2
HT           1
TH           1
TT           0

X is then a random variable that may take values in the set SX = {0, 1, 2}.
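The outcome-to-value mapping in the table can be written directly as a dictionary; a minimal Python sketch (assuming a fair coin, so each outcome has probability 1/4):

```python
from itertools import product
from collections import Counter

# Map each outcome of two coin tosses to X = number of heads.
X = {"".join(t): t.count("H") for t in product("HT", repeat=2)}
# X == {"HH": 2, "HT": 1, "TH": 1, "TT": 0}

# Each of the 4 outcomes has probability 1/4; collect P[X = x] over SX.
pmf = Counter()
for outcome, x in X.items():
    pmf[x] += 0.25

assert pmf == {0: 0.25, 1: 0.5, 2: 0.25}
```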

Note that a statement about a random variable defines an event. In this example the statement “the number of heads equals 2” defines the event {X = 2} = {HH}.

Page 16: Chapter III

16

Random Variables

Example: A player pays $1.50 to play the following game: A coin is tossed 3 times and the number of heads is counted. The player receives $1 if X = 2 and $8 if X = 3. Let Y be the reward to the player.

Y is a function of random variable X and its outcomes can be related back to the sample space.

Y(ζ) = f(X(ζ)). In this example Y is a random variable taking values in the set SY = {0, 1, 8}.

Page 17: Chapter III

17

Discrete and Continuous Random Variables

• Discrete random variables assume values from a countable set SX = {x1, x2, x3, …}.

• A discrete random variable is finite if its range is finite, i.e. SX = {x1, x2, x3, …, xn}.

• Continuous random variables assume values from an uncountably infinite set of possible values.

Page 18: Chapter III

18

Probability Mass Function

The Probability Mass Function (PMF) of a discrete random variable X is defined as:

PX(x) = P[X = x] for x a real number

X denotes the random variable.
x denotes a possible value of the random variable.

PX(x) is a function of x over the real line and can be nonzero only at the values x1, x2, ….

Note that {X = x} is an event consisting of all outcomes ζ of the underlying experiment for which X(ζ) = x.

Page 19: Chapter III

19

Discrete Random Variables and PMF

The PMF satisfies 3 properties:

(i) pX(x) ≥ 0 for all x
(ii) Σ_{x∈SX} pX(x) = 1
(iii) P[X ∈ B] = Σ_{x∈B} pX(x), for any B ⊂ SX

PMF values are still probabilities.

Page 20: Chapter III

20

Discrete Random Variables and PMF

Example: Let X be the number of heads in 3 independent tosses of a coin. Find the PMF of X.

Solution:
p0 = P[X=0] = P[{TTT}] = (1-p)^3
p1 = P[X=1] = P[{HTT}] + P[{THT}] + P[{TTH}] = 3(1-p)^2 p
p2 = P[X=2] = P[{THH}] + P[{HHT}] + P[{HTH}] = 3p^2(1-p)
p3 = P[X=3] = P[{HHH}] = p^3

This is the same as in the Bernoulli trials experiment.
Note that, as expected, pX(0) + pX(1) + pX(2) + pX(3) = 1.

Page 21: Chapter III

21

Discrete Random Variables and PMF

Example: A player receives $1 if the number of heads in 3 coin tosses is 2, $8 if number is 3, $0 otherwise. Find the PMF of reward Y.

Solution:
pY(0) = P[ζ ∈ {TTT, TTH, THT, HTT}] = 4/8 = 1/2
pY(1) = P[ζ ∈ {THH, HTH, HHT}] = 3/8
pY(8) = P[ζ ∈ {HHH}] = 1/8
Note that pY(0) + pY(1) + pY(8) = 1.
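The reward PMF can be derived by enumerating the 8 equally likely outcomes; a short Python sketch (the reward helper encodes the rule stated in the example):

```python
from itertools import product

# Reward rule from the example: $1 if 2 heads in 3 tosses, $8 if 3 heads.
def reward(seq):
    heads = seq.count("H")
    return 8 if heads == 3 else (1 if heads == 2 else 0)

# 8 equally likely outcomes, each with probability 1/8.
pY = {}
for seq in product("HT", repeat=3):
    y = reward(seq)
    pY[y] = pY.get(y, 0) + 1/8

assert pY == {0: 0.5, 1: 0.375, 8: 0.125}
assert abs(sum(pY.values()) - 1) < 1e-12
```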

Page 22: Chapter III

22

Uniform Random Variable

X is a uniform random variable on the set SX = {0, 1, 2, …, M-1} if pX(k) = 1/M for each k in SX.

pX(0) + pX(1) + pX(2) + … + pX(M-1) = 1

Example: the experiment of tossing a fair die, with X the number of dots facing up:
SX = {1, 2, 3, 4, 5, 6}

pX(1) = pX(2) = pX(3) = pX(4) = pX(5) = pX(6) = 1/6

Page 23: Chapter III

23

Here is Matlab/Octave code to generate a uniform RV and plot its relative frequencies:

NUM = 10000;                 % number of samples
Y = floor(rand(NUM,1)*8);    % uniform on {0, 1, ..., 7}
X = [0:7];
H = hist(Y, X)/NUM;          % relative frequencies
bar(X, H)

The graph of relative frequencies approaches the PMF as the number of repetitions becomes very large.

PMF of Uniform RV

Page 24: Chapter III

24

PMF of Bernoulli RV

The Bernoulli RV IA is 1 if event A occurs and 0 otherwise. The PMF of IA is:

pI(0) = 1 - p, pI(1) = p, where p = P[A]

Page 25: Chapter III

25

PMF of Geometric RV

The geometric RV X takes values from SX={1,2,3,4..}. The event {X=k} occurs if the underlying experiment finds k-1 consecutive ”failures”, followed by one ”success”. The PMF of X is:

pX(k) = P[X=k] = P[00…01] = (1-p)^(k-1) p = q^(k-1) p, k = 1, 2, …

Page 26: Chapter III

26

Here is Matlab/Octave code to generate an approximate PMF of a geometric RV by using relative frequencies (samples are drawn with the inverse transform method):

NUM = 10000;                   % number of samples
p = 0.5;                       % success probability
n = 10;                        % number of PMF values to plot
% inverse-transform sampling of a geometric RV on {1, 2, 3, ...}
Y = ceil(log(rand(NUM,1)) / log(1-p));
PMF = hist(Y, [1:n]) / NUM;    % relative frequencies
bar([1:n], PMF)

PMF of Geometric RV

Page 27: Chapter III

27

PMF of Binomial RV

The binomial RV X takes values in the set SX = {0, 1, 2, …, n}. The event {X=k} occurs if the underlying experiment finds k ”successes” and n-k ”failures”. The PMF of X is:

pX(k) = P[X=k] = C(n, k) p^k (1-p)^(n-k), k = 0, 1, …, n

where

C(n, k) = n! / (k!(n-k)!)
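The binomial PMF can be sketched in Python using the standard-library math.comb (the check reuses the n = 8, p = 1/3 values from the speakers example):

```python
from math import comb

def binomial_pmf(k, n, p):
    """C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# A PMF must sum to 1 over its range SX = {0, 1, ..., n}.
n, p = 8, 1/3
total = sum(binomial_pmf(k, n, p) for k in range(n + 1))
assert abs(total - 1) < 1e-12
```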

Page 28: Chapter III

28

Important Discrete RV

• Certain RVs arise in many diverse, unrelated applications. These RVs model the fundamental mechanisms that underlie the random behavior of an experiment. Among them:

• Bernoulli
• Binomial
• Geometric
• Negative Binomial
• Poisson
• Uniform
• Zipf

• We have discussed some of them. Look in the textbook for the properties of the rest.

Page 29: Chapter III

29

Generation of Random Numbers

Octave provides generators for several types of random numbers:

Beta                  betarnd
Binomial              binornd
Cauchy                cauchy_rnd
Chi-Square            chi2rnd
Univariate Discrete   discrete_rnd
Empirical             empirical_rnd
Exponential           exprnd
F                     frnd
Gamma                 gamrnd
Geometric             geornd
Hypergeometric        hygernd
Laplace               laplace_rnd
Logistic              logistic_rnd
Log-Normal            lognrnd
Pascal                nbinrnd
Univariate Normal     normrnd
Poisson               poissrnd
t (Student)           trnd
Univariate Discrete   unidrnd
Uniform               unifrnd
Weibull               wblrnd
Wiener Process        wienrnd

Page 30: Chapter III

30

Expected Value (Mean)

• To describe completely the behavior of a discrete RV, an entire function such as the PMF pX(x) must be given.

• In some cases we are interested in a few parameters that characterize the RV. One of these parameters is the expected value.

• The expected value or mean of a discrete RV X is defined as:

E[X] = Σ_{x∈SX} x pX(x)

Page 31: Chapter III

31

Expected Value

• Bernoulli discrete RV:
E[IA] = 0·(1-p) + 1·p = p

• Uniform discrete RV:
E[X] = Σ_{k=0..M-1} k·(1/M) = (M-1)/2

Example: For M = 8, E[X] = (8-1)/2 = 3.5
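The uniform-RV mean can be checked directly from the definition of expected value; a one-screen Python sketch:

```python
# Mean of a uniform discrete RV on SX = {0, 1, ..., M-1}: E[X] = (M-1)/2
M = 8
mean = sum(k * (1 / M) for k in range(M))

assert mean == (M - 1) / 2   # 3.5 for M = 8
```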

Page 32: Chapter III

32

Expected Value

Example: A player pays $1.50 to toss a coin 3 times. The player receives $1 if the number of heads is 2, $8 if the number of heads is 3, and nothing otherwise. Find the expected value of the reward Y. What is the expected value of the gain?

Solution: Y is a discrete RV taking values in {0, 1, 8} with PMF pY(0) = 1/2, pY(1) = 3/8, pY(8) = 1/8. The expected value of the reward Y is:
E[Y] = 0·(1/2) + 1·(3/8) + 8·(1/8) = 11/8 = $1.375
The expected gain is:
E[Y] - 1.50 = -$0.125
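The expected reward and gain follow directly from the PMF of Y found earlier; a Python sketch:

```python
# PMF of the reward Y: P[Y=0] = 1/2, P[Y=1] = 3/8, P[Y=8] = 1/8
pY = {0: 1/2, 1: 3/8, 8: 1/8}

expected_reward = sum(y * prob for y, prob in pY.items())
expected_gain = expected_reward - 1.50   # $1.50 entry fee

print(expected_reward, expected_gain)  # 1.375 -0.125
```

Note the expected gain is negative: on average the player loses 12.5 cents per game.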

Page 33: Chapter III

33

Expected Value of Functions of a RV

Let Z = g(X). Then:

E[Z] = E[g(X)] = Σ_{x∈SX} g(x) pX(x)

In particular, expectation is linear: E[aX + b] = aE[X] + b.

Page 34: Chapter III

34

Calculation of Expected Value

Example: Noise voltage X is amplified and shifted by a device to obtain Y = 2X + 10, and then squared to produce Z = Y^2 = (2X + 10)^2. Find E[Z] if E[X^2] = 5 and E[X] = 0.

Solution: E[Z] = E[4X^2 + 40X + 100] = 4E[X^2] + 40E[X] + 100 = 4·5 + 0 + 100 = 120.
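The computation E[Z] = 120 depends only on E[X] and E[X^2], not on the full distribution of X. It can be checked against a hypothetical two-point distribution satisfying the given moments (X = ±√5 with probability 1/2 each, chosen so that E[X] = 0 and E[X^2] = 5):

```python
from math import sqrt

# Hypothetical two-point distribution with E[X] = 0 and E[X^2] = 5.
xs = [sqrt(5), -sqrt(5)]   # each with probability 1/2

EX = sum(0.5 * x for x in xs)
EX2 = sum(0.5 * x**2 for x in xs)
assert abs(EX) < 1e-12 and abs(EX2 - 5) < 1e-12

# E[Z] = E[(2X+10)^2] = 4*E[X^2] + 40*E[X] + 100 = 120
EZ = sum(0.5 * (2 * x + 10)**2 for x in xs)
print(EZ)  # ≈ 120
```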

Page 35: Chapter III

35

Exercises

2.97 - A block of 100 bits is transmitted over a binary communication channel with bit error probability p = 10^-2.

A - If the block has 1 or fewer errors, then the receiver accepts the block. Find the probability that the block is accepted.

B - If the block has more than 1 error, then it is retransmitted. Find the probability that M retransmissions are needed.

Page 36: Chapter III

36

2.99 - A student needs 8 chips of a certain type to build a circuit. It is known that 5% of these chips are defective. How many chips should he buy for there to be a greater than 90% probability of having enough chips for the circuit?

3.10 An m-bit password is required to access a system. A hacker systematically works through all possible m-bit patterns. Let X be the number of patterns tested until a correct password is found.

a) Describe the sample space S.
b) Show the mapping from S to SX, the range of X.
c) Find the probabilities for the various values of X.

Exercises

Page 37: Chapter III

37

Exercises

3.17 - A modem transmits a +2 V signal into a channel. The channel adds to this signal a noise term drawn from the set {0, -1, -2, -3} with probabilities {4/10, 3/10, 2/10, 1/10}. Find the PMF of the output Y of the channel.

3.20 - Two dice are tossed and we let X be the difference in the number of dots facing up. a) Find and plot the PMF of X. b) Find the probability that X ≤ k for all k.

3.48 - a) Plot the PMF of the binomial RV with n = 4 and p = 0.1. b) Plot the PMF of the binomial RV in a) using Octave.

