EXERCISES 5.2 X1, X2 - KSU | Faculty Web (facultyweb.kennesaw.edu/jfowle60/Mar21_0828_v1.pdf)

Date post: 08-Jul-2020
Category:
Upload: others
View: 3 times
Download: 0 times
Share this document with a friend
15

248 Chapter 5: Jointly Distributed Random Variables

EXERCISES 5.2 1. Assume that X1, X2 is uniform on the square with corners

(−1, 0), (0, −1), (0, 1), (1, 0).

a. Find the marginal pdf for X1. b. Find the conditional pdf for X2, given X1 = x1.

2. You select one integer at random from R_Y = {1, 2, 3}; call the selected integer Y. Then you flip Y fair coins. What is the probability you observe no heads?

3. Assume the joint pdf for X1, X2 is

f_{X1,X2}(x1, x2) = x1 e^{−x1(1 + x2)}, for x1 > 0, x2 > 0.

a. Are X1 and X2 independent?
b. Evaluate the conditional pdf for X1, given X2 = x2, and the conditional pdf for X2, given X1 = x1.

4. The game of Lotto 6/53 involves drawing 6 numbers from the integers 1, 2, ..., 53. Assume that the draw is completely fair and let X1, X2 be the first two numbers drawn in one play (without replacement).
a. What is the joint probability law for X1, X2?
b. Evaluate the conditional probability function for X2, given X1 = x1.
c. Evaluate the conditional probability function for X1, given X2 = x2.

5. A fair die is rolled three times; let X1 be the number of 1's to occur on the first two rolls, and let X2 be the number of 2's to occur on the last two rolls. What is the joint probability function for X1, X2?

6. To generate a "continuous" uniform random number on the interval (0, 1) with three-decimal-place accuracy (three digits to the right of the decimal), a computer program actually generates a sequence of three independent digits, each equally likely to take on the values 0, 1, ..., 9. The number generated then consists of the decimal point followed by these three digits.
a. Letting D1, D2, D3 represent the three digits generated, what is their joint probability function?
b. Let X = .D1D2D3 be the "continuous" uniform random number generated. What would you take as the exact probability law for X?
c. Evaluate P(X < 1) with this procedure.
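The generation scheme in Exercise 6 is easy to simulate. The sketch below (illustrative only, not a solution to the exercise) draws the three digits and assembles X; since each of the 1000 values .000, .001, ..., .999 is equally likely, the sample mean should sit near the mean of that discrete law, 0.4995.

```python
import random

random.seed(1)

def three_digit_uniform():
    # Draw three independent digits, each uniform on {0, 1, ..., 9},
    # and place them after the decimal point: X = .D1 D2 D3
    d1, d2, d3 = (random.randrange(10) for _ in range(3))
    return (100 * d1 + 10 * d2 + d3) / 1000

sample = [three_digit_uniform() for _ in range(100_000)]
# Sample mean should be close to the mean of the discrete law, 0.4995
print(sum(sample) / len(sample))
```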

7. Suppose a person buys a ticket for her state lottery game on a sporadic basis. Assume she flips a fair coin each Saturday; if a head occurs, she buys a ticket, and if a tail occurs she does not. Further assume that the probability a purchased ticket will win some prize is 1. Over a 10-week period, what is the probability distribution for the number of prizes she will win in the lottery?

8. Assume the number of people to enter a store, on a given business day, is a random variable N that is binomial with parameters n1, p1. Also assume the number of cash sales made by the store on any given business day, given that N = n people entered the store that day, is a binomial random variable X with parameters n, p. What is the marginal probability law for X, the number of sales made that day?


5.2 Conditional Distributions and Independence 249

9. Sampling inspection A small manufactured item is sold to a retailer in lots of 1000. Assume that each item is a Bernoulli trial (and that the trials are independent) with p being the probability of the item being defective in some manner. The retailer selects a random sample of 10 items from the lot of 1000 and inspects each one. What is the marginal probability law for Y, the number of defectives found in the sample of 10? (Hint: Use conditional probability, letting X be the number of defectives in the lot of 1000.)

10. A Geiger counter is used to record the number of radioactive particles emitted by a source; unfortunately, the counter is defective and occasionally particles do not register. More specifically, assume each particle to enter the counter is a Bernoulli trial, with p being the probability the particle will register, and assume these trials are independent. What is the probability law for the number Y of particles counted in an interval of time of length t, granted the particles are emitted like events in a Poisson process with parameter λ? (Hint: Use conditional probability, and let X be the number of particles emitted in the given time period.)

11. Suppose a single point in the (x,y)-plane is chosen in the following way: First the x-coordinate is determined as the observed value for a random variable X, which is uniform on (0, 1). Then the y-coordinate is determined by the observed value for a random variable Y, which is uniform on the interval (0, x), where x is the observed value for X.
a. Find the joint pdf for X, Y.
b. Evaluate the marginal pdf for Y; then find the conditional pdf for X given Y = y.

12. Suppose n independent Bernoulli trials are performed, each with probability of success equal to p. If there are x successes in the first n1 trials, what is the probability law for the number of successes in the remaining n − n1 trials?

13. Suppose n independent Bernoulli trials are performed, each with probability of success equal to p. Assuming x successes are observed, what is the probability law for the number of successes in the first n1 < n trials?

14. Let T1, T2 have the joint pdf f_{T1,T2}(t1, t2) = e^{−t2} for 0 < t1 < t2, and evaluate P(T1 < 1 | T2 > 2).

15. Granted a sequence of independent Bernoulli trials, each with parameter p, let X1 be the trial number of the first success and let X2 be the trial number of the second success. Evaluate the joint probability function for X1, X2. (Hint: Consider the conditional probability function for X2 given X1 = x1.)

16. Independent Bernoulli trials with parameter p are performed until the occurrence of the second success. If the second success occurs on trial number x2, what is the probability function for the trial number of the first success?


5.3 Multinomial and Bivariate Normal Probability Laws 257

Example 5.13 The bivariate normal probability law given by Eq. (4) is frequently employed in describing scores made on national standardized tests, like the Scholastic Aptitude Test, which you may have taken at some point in time. Such a standardized test will typically have at least two parts. For concreteness, let us assume a particular test has two parts, labeled "verbal" and "mathematics." Furthermore, we shall assume that a randomly selected student who takes this test will make scores X1 and X2 on these two parts, and that X1, X2 are bivariate normal.

The parameters for this bivariate normal distribution will be taken as μ1 = μ2 = 500, σ1 = σ2 = 100, and ρ = .4. The marginal probability a student will score better than 550 on the verbal part (with no information about his mathematics score) is P(X1 > 550) = P(Z > .5) = .3085, where Z is standard normal. The marginal probability he scores less than 400 on the mathematics portion (with no information about his verbal score) is P(X2 < 400) = P(Z < −1) = .1587.

If we are given that a student scored x2 = 600 on the mathematics portion, then the conditional distribution of her verbal score is normal with mean μ1 + ρ(σ1/σ2)(600 − μ2) = 500 + (.4)(100/100)(600 − 500) = 540 and standard deviation 100√(1 − .4²) = 91.7. The conditional probability that her verbal score exceeds 550 is P(X1 > 550 | x2 = 600) = P(Z > (550 − 540)/91.7) = P(Z > .109) = .4598, greater than the marginal probability for this same event. Similarly, if it is given that a student scored x1 = 450 on the verbal portion, the conditional pdf for his mathematics score is normal with mean 500 + (.4)(100/100)(450 − 500) = 480 and standard deviation 91.7. Thus the probability his mathematics score is less than 400 is P(X2 < 400 | x1 = 450) = P(Z < (400 − 480)/91.7) = P(Z < −.872) = .1906.

EXERCISES 5.3 1. Let X1, X2, X3 be multinomial with parameters n = 9 and p1, p2, p3. Evaluate the probability that the three types of outcomes occurred equally frequently (i.e., X1 = X2 = X3 = 3). What is the conditional probability law for X1, X2 given X3 = 3?

2. A five-dimensional random variable Y1, Y2, ..., Y5 has probability function

p_{Y1,...,Y5}(y1, y2, ..., y5) = (15 choose y1, y2, y3, y4, y5)(1/5)^15.

a. What is the marginal probability law for Y3?
b. What is the marginal probability law for Y2, Y4?
c. Evaluate P(Y2 = Y4).
d. Evaluate P(Y1 = Y2 = Y3 = Y4 = Y5).

3. A moderate-size university has 8000 full-time registered undergraduate students, of whom 2600 are classified freshmen, 2100 are sophomores, 1900 are juniors, and the remaining 1400 are seniors. The student council


258 Chapter 5: Jointly Distributed Random Variables

selects 100 of these students at random. If X1, X2, X3, X4 represent the numbers of freshmen, sophomores, juniors, and seniors, respectively, in the sample, what is their probability law? What is the probability the sample includes 33 freshmen, 26 sophomores, 24 juniors, and 17 seniors? Approximate this probability by the appropriate multinomial.

4. What are the marginal probability laws for X1, X2, X3, X4 in Exercise 3?

5. If X1, X2, ..., Xk is multi-hypergeometric with parameters m, r1, r2, ..., rk, n, what is the marginal probability law for Xi?

6. Granted 33 freshmen occur in the sample of 100 students discussed in Exercise 3, what is the conditional probability function for X2, X3, X4? Use this to evaluate the conditional probability of getting 26 sophomores, 24 juniors, and 17 seniors in the sample of 100.

7. A bookstore receives 100 copies of a best-selling novel; of these, 50 have a dust jacket with a red background, 30 have a green background, and the remainder have a yellow background.
a. If 20 of these books are sold in the first day, and X1, X2, X3 are the numbers of these with the different colored jackets in the order given, what is the probability law for X1, X2, X3? Assume those sold were selected at random from the 100.
b. Evaluate p_{X1,X2,X3}(10, 6, 4).
c. What is the probability that all 20 have a red dust jacket?

8. A town contains 125 fast-food restaurants, of which 30 belong to chain V, 20 belong to chain B, 40 belong to chain W, and the rest belong to chain M. A total of 400 tourists eat lunch at a fast-food restaurant in this town on a given day. If each of these tourists selects his or her restaurant at random, and X1, X2, X3, X4 count the numbers to select restaurants from chains V, B, W, and M, respectively, what is the probability law for X1, X2, X3, X4?

9. The game of bridge requires four players, each of whom is dealt 13 cards from a regular 52-card deck. Suppose you are playing this game, and assume that the 13 cards you receive are selected at random from the 52. Define X1 to be the number of hearts you receive, X2 the number of diamonds you receive, X3 the number of spades you receive, and X4 the number of clubs you receive. What is the probability law for X1,X2,X3,X4? What is the probability law for X1?

10. Many different computer algorithms exist for generating observed values from any desired probability law; consider the probability integral transform (Theorem 4.6) as one example. Suppose one of these algorithms is used to generate 200 independent standard normal observations (observed values of standard normal random variables). Let C1 be the number of these generated values that are smaller than −2, let C2 count those between −2 and −1, C3 those between −1 and 0, and so forth up to C6 counting those that exceed 2, so Σ_{i=1}^{6} Ci = 200. What is the probability law for C1, C2, ..., C6?

11. The bivariate normal probability law is frequently assumed to describe the impact points of rounds fired at a target in a two-dimensional plane.


278 Chapter 6: Expectation, Moments

If X1, X2 are independent random variables, then E[X1X2] = E[X1]E[X2], and the covariance (and correlation) between X1 and X2 is necessarily 0. It is important to recognize that independence implies a covariance of zero, but that zero covariance for two random variables does not by itself imply they must be independent.
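The caveat that zero covariance does not imply independence has a standard counterexample (not taken from the text): let X be uniform on {−1, 0, 1} and Y = X². This sketch verifies both claims by exact enumeration with rational arithmetic.

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}; Y = X^2. The pair is clearly dependent
# (knowing Y = 0 forces X = 0), yet Cov[X, Y] = 0.
support = [-1, 0, 1]
p = Fraction(1, 3)

EX = sum(p * x for x in support)            # E[X]   = 0
EY = sum(p * x**2 for x in support)         # E[Y]   = 2/3
EXY = sum(p * x * x**2 for x in support)    # E[XY]  = E[X^3] = 0
cov = EXY - EX * EY
print(cov)                                  # 0

# Dependence: P(X = 1, Y = 0) = 0, while P(X = 1) P(Y = 0) = (1/3)(1/3)
print(Fraction(0) == p * p)                 # False -> not independent
```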

EXERCISES 6.1 1. Suppose a rectangle is constructed in the (x1, x2)-plane with base length x1 and height x2, where x1 is the observed value of a random variable X1 that is uniform on the interval (1, 2), while x2 is the observed value of a random variable X2 that, given X1 = x1, is uniform on the interval (0, x1). What is the expected area of the rectangle?

2. A fair die is rolled twice; let X1, X2 be the numbers of spots on the top face for these two rolls. Evaluate E[X1 + X2], E[X1X2], and E[X1/X2].

3. Let Y1, Y2, ..., Yk be independent binomial random variables with parameters ni, p. (Note that the ni parameters may be different, but the p parameters are equal.) Evaluate the mean and variance of Σ Yi.

4. Let Y1, Y2, ..., Yn be independent Poisson random variables, with μj the parameter for Yj; evaluate the mean and variance of Σ Yi.

5. The width of a rectangle is given by the observed value of a random variable X1 whose pdf is

f_{X1}(x1) = 2x1, for 0 < x1 < 1.

The height of the rectangle is given by X2, which is uniform on the interval (x1 + , x1 + ). What is the expected area of this rectangle?

6. The expected values for the random variables X and Y are 5 and —5, respectively, while their variances are 4 and 9, and their covariance is —5. Define W = X + Y and V = X - Y, and evaluate the means, variances, and covariance for W, V.

7. Suppose X1, X2 are random variables with the same mean μ and the same variance σ², while their covariance is ρσ². What is the covariance of U = X1 + X2 and V = X1 − X2?

8. Let X1, X2, ..., Xn be independent random variables, each with mean μ and variance σ², and define U = Σ ai Xi, where a1, a2, ..., an are constants satisfying Σ_{i=1}^{n} ai = 1.
a. Show that the expected value for U is μ, the common expected value for the Xi.
b. Show that the variance of U is σ² Σ_{i=1}^{n} ai², and that this variance is minimized with ai = 1/n, for i = 1, 2, ..., n.

9. Suppose X1, X2 are independent random variables, each with the same mean μ. If a is any constant, then E[aX1 + (1 − a)X2] = μ; also assume that the variance for X1 is σ1² = kσ2², where k is a known constant, and Var[X2] = σ2². With U = aX1 + (1 − a)X2, find the value of a that minimizes σU² (which will be a function of k).

10. What is the expected position number of the last ace in a well-shuffled 52-card deck?

11. Show that the magnitude of the correlation between two random variables never exceeds 1.


292 Chapter 6: Expectation, Moments

where the parameter k = 1,2,3,... and is called the "degrees of freedom" for W. Much of this discussion is summarized in the following theorem.

THEOREM 6.8 Let Z1, Z2, ..., Zk be independent standard normal random variables; then W = Σ_{i=1}^{k} Zi² has the χ² distribution with k degrees of freedom and E[W] = k, Var[W] = 2k.
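Theorem 6.8 is easy to spot-check by simulation: sum k squared standard normals and compare the sample mean and variance with k and 2k. A quick sketch (k = 3 is chosen arbitrarily for illustration):

```python
import random

random.seed(0)
k = 3
n = 200_000

# W = Z1^2 + ... + Zk^2 for independent standard normal Zi
w = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(k)) for _ in range(n)]

mean = sum(w) / n
var = sum((wi - mean) ** 2 for wi in w) / n
print(round(mean, 2), round(var, 2))  # should be near k = 3 and 2k = 6
```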

Example 6.19 The χ² distribution with k = 2 degrees of freedom is identical to the exponential probability law with λ = 1/2 (the gamma with n = 1 and λ = 1/2). This allows easy evaluation of probability statements for this special case. For example, if X1, X2 are independent normal random variables, each with mean 0 and variance 4, then X1/2, X2/2 are independent standard normal. W = (X1² + X2²)/4 then has the χ² distribution with two degrees of freedom; that is, W is exponential with λ = 1/2, and

P(X1² + X2² < 2) = P(W < 1/2) = 1 − e^{−1/4} = .221.

Indeed, for any even number of degrees of freedom k, W is an Erlang random variable with r = k/2, λ = 1/2, so the cdf for W can be written as a sum of Poisson probabilities:

F_W(t) = 1 − Σ_{j=0}^{r−1} P(X = j), for t > 0,

where X is Poisson with parameter t/2. For odd degrees of freedom, numerical integration is called for to evaluate F_W(t).
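The Erlang/Poisson identity for even degrees of freedom gives a closed form for F_W(t). This sketch evaluates it under the λ = 1/2 parameterization above and reproduces the probability computed in Example 6.19.

```python
from math import exp, factorial

def chi2_cdf_even_df(t, k):
    # For even k, W ~ Erlang(r = k/2, lambda = 1/2), so
    # F_W(t) = 1 - sum_{j=0}^{r-1} P(Npois = j), Npois ~ Poisson(t/2)
    assert k % 2 == 0
    r = k // 2
    lam = t / 2.0
    return 1.0 - sum(exp(-lam) * lam ** j / factorial(j) for j in range(r))

# Example 6.19: P(X1^2 + X2^2 < 2) = P(W < 1/2) with k = 2 degrees of freedom
p = chi2_cdf_even_df(0.5, 2)
print(round(p, 3))  # 1 - e^{-1/4} = 0.221
```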

EXERCISES 6.2 1. Evaluate the moment generating function for a binomial random variable with n = 2 and p. Use this function to evaluate the full series of moments for k = 1, 2, 3, ....

2. Let X be discrete uniform with parameter m; evaluate m1 and m2, the first two moments for X.

3. A random variable U has moment generating function m_U(t) = (7/(7 − t))³. Evaluate the mean and variance for U. What is the probability law for U?

4. Let Y be a geometric random variable with parameter p; find the moment generating function and the factorial moment generating function for Y.

5. A random variable Y has moment generating function m_Y(t); find the mean and variance for Y. What is the probability law for Y?

6. The moment generating function for a discrete random variable G is

m_G(t) = cosh t = (e^t + e^{−t})/2.


6.2 Moments and Generating Functions 293

What is the probability law for G?

7. A discrete random variable W, whose range is a subset of the non-negative integers, has probability generating function

ψ_W(t) = (t² + t⁴ + t⁸)/3.

What is the probability function for W?

8. Find the cumulant generating function for the negative binomial probability law with parameters r, p, and use it to re-evaluate the mean and variance of the probability law.

9. If X is a random variable with cumulant generating function C(t), find the cumulant generating function for Y = a + bX, where a, b are constants. Use this to show (again) that μY = a + bμX and that σY² = b²σX².

10. Suppose X is a discrete random variable with factorial moment generating function ψX(t). Let Y = a + bX, where a, b are constants, and express the factorial moment generating function for Y in terms of ψX(t).

11. If the first three moments of a random variable V are 5, 27, and 155, evaluate the second and third moments of V about its mean.

12. If X is a random variable whose mean is μX and whose second and third moments about the mean are μ2 and μ3, express the second and third moments of X (about 0) in terms of μX, μ2, μ3 (the moments about the mean). (Hint: x³ = ((x − μX) + μX)³.)

13. A random variable W has mean 5, while its second and third moments about the mean are 2 and 4; what are its first three moments m1, m2, m3?

14. Granted ψ(t) = t^100 is the probability generating function for X, what is the probability law for X?

15. Suppose for a random variable X the first two cumulants are, say, 1 and 5, and all successive cumulants are 0. What must be the probability law for X?

16. For the mathematically inclined The kth moment of a random variable exists if and only if E[|X|^k] < ∞. Show that if the kth moment exists, then so does the mth moment, where m < k. (Hint: If |X| ≤ 1, then |X|^m ≤ 1, so |X|^m ≤ |X|^k + 1. If |X| > 1, then |X|^m < |X|^k for any m < k. Thus |X|^m ≤ |X|^k + 1.)

17. If X has mean μX and variance σX², express the moment generating function for Y = (X − μX)/σX, the standard form for X, in terms of mX(t).

18. Let Y1, Y2, ..., Yr be independent, geometric random variables, each with the same parameter p. Find the moment generating function for X = Σ Yi. What is the probability law for X?

19. If X1, X2, ..., Xn are independent Poisson random variables, with parameters μ1, μ2, ..., μn, respectively, find the moment generating function for W = Σ_{i=1}^{n} ai Xi, where a1, a2, ..., an are constants, and identify its probability law.

20. Suppose Y1, Y2, ..., Yn are independent normal random variables, where the mean and variance for Yi are μi, σi²; find the moment generating function for V = Σ_{i=1}^{n} ai Yi, where a1, a2, ..., an are constants, and identify its probability law.


294 Chapter 6: Expectation, Moments

21. Let X1, X2, ..., Xn be independent Erlang random variables, where the parameters for Xi are ri, λi. What must be true of the parameters for Σ Xi to follow the Erlang probability law?

22. Suppose X1, X2, ..., Xn are independent negative binomial random variables, where the parameters for Xi are ri, pi. What must be true of these parameters for Σ Xi to follow the negative binomial probability law?

23. If U1, U2, ..., Un are independent χ² random variables, where Ui has νi degrees of freedom, show that V = Σ Ui has the χ² distribution with Σ νi degrees of freedom.

24. Let U1, U2, ..., Un be independent uniform (0, 1) random variables, and find the probability law for V = ln(1/(U1²U2²···Un²)). (Hint: Write V = −2 Σ ln Ui. What is the probability law for −2 ln Ui?)

6.3 Conditional Expectation

Conditional probability is a very useful tool in modeling many real-world phenomena; closely allied with conditional probability is the idea of condi-tional expectation, averages taken with respect to a conditional probability law. Conditional expectation and some of its applications will be discussed in this section. The following example introduces the concept.

Example 6.20 Suppose we observe events whose occurrences through time satisfy the assumptions for a Poisson process with parameter λ. If we begin our observation at time t = 0 and let T1 be the time the first event occurs and T2 the time the second event occurs, then the marginal probability law for T1 is exponential with mean 1/λ. The conditional probability law for T1, given T2 = t2, is uniform on (0, t2), as we saw earlier. The mean (or balance point) of this conditional pdf is t2/2. This latter value is called the conditional expectation of T1, given that T2 = t2; we shall use the vertical stroke (|) with our expected value notation to indicate conditional expectation, as opposed to marginal, or unconditional, expectation. Thus for this case we shall write E[T1 | t2] = μ_{T1|t2} = t2/2, to stress the fact that conditional probability is involved. For this same case we know that the marginal probability law for T2 is Erlang with parameters r = 2 and λ, so the (unconditional) expected value for T2 is E[T2] = 2/λ. Recall also that the conditional probability law for T2, given T1 = t1, is the shifted exponential with pdf f_{T2|T1}(t2 | t1) = λe^{−λ(t2−t1)} for t2 > t1; the mean of this pdf is t1 + 1/λ. Thus E[T2 | t1] = μ_{T2|t1} = t1 + 1/λ using our conditional expectation notation. Notice that these conditional expectations do not equal the marginal expectations; this happens because T1, T2 are not independent random variables.
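The claim E[T1 | T2 = t2] = t2/2 can be checked by simulation: generate interarrival times, keep the pairs whose T2 falls in a narrow window around a target t2, and average the retained T1 values. The sketch below (λ = 1, t2 ≈ 2, and the window width are arbitrary illustrative choices) should give a conditional mean near t2/2 = 1.

```python
import random

random.seed(2)
lam = 1.0
n = 400_000

# In a Poisson process, T1 ~ Exp(lam) and T2 = T1 + an independent Exp(lam).
# Conditional on T2 = t2, T1 is uniform on (0, t2), so E[T1 | t2] = t2 / 2.
t2_target, eps = 2.0, 0.05
kept = []
for _ in range(n):
    t1 = random.expovariate(lam)
    t2 = t1 + random.expovariate(lam)
    if abs(t2 - t2_target) < eps:
        kept.append(t1)

print(len(kept), round(sum(kept) / len(kept), 2))  # mean near t2/2 = 1.0
```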


6.3 Conditional Expectation 301

of such sales is clearly not a constant from day to day. Let Xi represent the dollar amount of sale i on this day. Granted N sales are made, the total dollar amount of the sales for this day is then Y = Σ_{i=1}^{N} Xi, which gives the total of a random number of random variables. To keep this model simple, let us assume that the individual dollar amounts, the Xi's, while random, each have the same mean μ and the same variance σ², and that the covariance between any two Xi's is 0 no matter what the value of N. The expected dollar total for the sales made on this day is then

E[Y] = E[Σ_{i=1}^{N} Xi].

If N = n, then the conditional mean of the total is

E[Σ_{i=1}^{n} Xi | n] = Σ_{i=1}^{n} E[Xi] = nμ;

the unconditional expected total of the sales for this day is then the expected value of this conditional expectation:

E[Y] = E[E[Y | N]] = E[Nμ] = μE[N].

Not surprisingly, this is the product of the expected number of sales to be made times the expected amount for each sale. The conditional variance of Y, given N = n, is

Var[Y | n] = E[(Y − nμ)² | n] = E[(Σ_{i=1}^{n} (Xi − μ))² | n]
           = Σ_{i=1}^{n} Σ_{j=1}^{n} E[(Xi − μ)(Xj − μ) | n].   (7)

Because of our assumption that the covariance of Xi and Xj, given N = n, is 0 for all i ≠ j, only the terms in this sum for which i = j will be nonzero. Thus Eq. (7) reduces to Var[Y | n] = nσ², where σ² is the common variance of the Xi's. The (marginal or unconditional) variance for Y is

Var[Y] = E[Var[Y | N]] + Var[E[Y | N]]
       = E[Nσ²] + Var[Nμ] = σ²E[N] + μ²Var[N].   (8)

Notice that the variance of the total sales is larger because of the randomness of N; that is, if the number of sales were a constant n each day, then the variance of Y would simply be nσ² (because then E[n] = n and Var[n] = 0). The second term in Eq. (8) reflects the inflation in the variance of Y caused by the randomness or variability in the number of sales.
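Equation (8) can be spot-checked by simulation of a random sum. Here N is Poisson (so E[N] = Var[N]) and the Xi are uniform on (0, 1); these distributional choices are made only for illustration and are not part of the example above.

```python
import math
import random

random.seed(3)
trials = 200_000

def poisson(lam):
    # Simple inversion sampler for a Poisson random variable
    u, k = random.random(), 0
    p = math.exp(-lam)
    cum = p
    while u > cum:
        k += 1
        p *= lam / k
        cum += p
    return k

# N ~ Poisson(20), so E[N] = Var[N] = 20; Xi ~ Uniform(0, 1), so
# mu = 1/2, sigma^2 = 1/12. Eq. (8) predicts
# Var[Y] = sigma^2 E[N] + mu^2 Var[N] = 20/12 + 20/4 = 6.67
totals = []
for _ in range(trials):
    n = poisson(20.0)
    totals.append(sum(random.random() for _ in range(n)))

m = sum(totals) / trials
v = sum((y - m) ** 2 for y in totals) / trials
print(round(m, 2), round(v, 2))  # near mu E[N] = 10 and 6.67
```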

EXERCISES 6.3 1. Let X, Y be jointly distributed random variables, and assume that E[Y | x] = a + bx, where a and b are constants. Show that μY = a + bμX.

2. A computer simulation requires knowledge of the times to failure of cer-


302 Chapter 6: Expectation, Moments

tain types of gear; when a failure time is required, the computer selects an observed value x from the probability law whose pdf is fX(x) = 2x for 0 < x < 1. The reciprocal of this value, 1/x, is then used as the parameter for the exponential distribution of Y, the time to failure; thus the conditional pdf for Y, given X = x, is

f_{Y|X}(y | x) = (1/x) e^{−y/x}, for y > 0.

The actual failure time employed is then selected from this (conditional) probability law for Y. Evaluate the expected value and the variance for Y.

3. An enthusiastic lottery player tosses a fair coin until the first head occurs; the (random) flip number N on which this first head occurs then gives the number of tickets he will buy. Assume that the tickets he buys are independent Bernoulli trials with parameter p (success meaning the ticket wins some prize), and let X be the number of winning tickets he buys. Evaluate the mean and variance for X.

4. The marginal pdf for X is

fX(x) = bx, for 0 < x < √(2/b),

and the conditional pdf for Y, given X = x, is Erlang with parameters r and λ = 1/x. Thus the joint pdf for X, Y is

f_{X,Y}(x, y) = (b y^{r−1} e^{−y/x}) / (x^{r−1} Γ(r)), for 0 < x < √(2/b), y > 0,

where r > 0 and b > 0.
a. Evaluate E[Y | x], and use this to find E[XY].
b. Find the correlation between Y and X.

5. Let X, Y be jointly distributed random variables, and assume that E[Y | x] = bx, where b ≠ 0 is a constant and μX ≠ 0.
a. Show that b = μY/μX.
b. Show that E[Y/X] = E[Y]/E[X].

6. Assume that X, Y are jointly distributed random variables, and that E[Y | x] = bx, where b ≠ 0 is a constant and μX ≠ 0. Show that the correlation between X and Y is ρ = μY σX/(μX σY). (Hint: Consider Cov[X, Y] = E[(X − μX)Y].)

7. In a computer video game, you are to annihilate as many invaders as possible. Assume that the number of invaders presented to you per minute is a Poisson random variable X1 with parameter μ; you have one opportunity to hit each one presented. Also assume that the number you hit, given x1 are presented, is a binomial random variable X2 with parameters x1, p.
a. Evaluate E[X2], the expected number of invaders you hit (in a 1-minute game).
b. If you play a 3-minute game, evaluate the expected number you hit.

8. Let X1 have density 2(1 − x1), for 0 < x1 < 1, and let the conditional pdf for X2, given X1 = x1, be uniform for −1 + x1 < x2 < 1 − x1. Evaluate E[X2 | x1], E[X1 | x2], and Cov[X1, X2]. This is a case for which E[X2 | x1] = E[X2] but E[X1 | x2] ≠ E[X1].


6.4 Summary 303

9. Let X1, X2, ..., Xk be multi-hypergeometric random variables with parameters m, r1, r2, ..., rk, n. Evaluate the covariance and correlation between Xi, Xk, the ith and kth components of X1, X2, ..., Xk.

10. Suppose X, Y are jointly distributed random variables, with E[X | y] = c, where c ≠ 0 is constant, and E[Y | x] = g(x), a function of x. Show that μX = c and μY = E[g(X)].

11. Suppose X1, X2 are jointly distributed, each with mean i, variance a- 2, correlation p. Also suppose that the conditional mean for X2, given X1 =

x1, is linear in x1. Let V = X1 + X2, and find the conditional mean for V given X1 = xi Also evaluate the expected value of the conditional variance for V given X1 = x1

12. Let X1, X2 be independent, each with mean μ and variance σ². Define Y = X1X2, and find the conditional mean for Y, given X1 = x1. Also evaluate the expected value of the conditional variance of Y, given X1 = x1.

13. Let X1, X2, X3 be identically distributed random variables, each with mean μ and variance σ²; the correlation between each pair is ρ, and the conditional expectation of Xi, given Xj = xj, is linear in xj for all i ≠ j. Define Y = X1 + X2 + X3. Evaluate the conditional mean for Y, given X1 = x1, and find the expected value of the conditional variance for Y, given X1 = x1. Given Y = y, evaluate the conditional expectation for X1 and find the expected value of the conditional variance for X1.

14. Let X1, X2 be jointly distributed random variables, each with mean μ and variance σ²; the correlation between them is ρ, and the conditional mean of X2, given X1 = x1, is linear in x1. Define Y = X1 − X2. Evaluate the conditional mean for Y, given X1 = x1, and find the expected value of the conditional variance for Y, given X1 = x1.

15. Show that if X, Y are jointly distributed, then

Var[Y] ≥ min(E[Var[Y | X]], Var[E[Y | X]]).

16. The number of sales made by a retail store per business day is a random variable N with mean 200 and standard deviation 20. For any given value of N, the dollar amounts of the individual sales are uncorrelated, each with mean 50 and standard deviation 10. What is the expected total dollar amount of the sales made by this store in 1 day? What is the standard deviation of the daily total dollar amounts of sales?
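For Exercise 16, the two standard random-sum identities E[T] = E[N]E[X] and Var[T] = E[N]Var[X] + Var[N](E[X])² give the answers directly; a minimal sketch of the arithmetic:

```python
import math

# Random-sum moments for Exercise 16: T = X_1 + ... + X_N, with N the
# number of sales per day and X_i the (uncorrelated) sale amounts.
mean_N, sd_N = 200, 20        # sales per day
mean_X, sd_X = 50, 10         # dollars per sale

# E[T] = E[N] * E[X]
mean_T = mean_N * mean_X
# Var[T] = E[N] * Var[X] + Var[N] * (E[X])^2
var_T = mean_N * sd_X**2 + sd_N**2 * mean_X**2
sd_T = math.sqrt(var_T)       # about 1010 dollars
```

With these numbers the expected daily total is $10,000 and the standard deviation is about $1,010; the Var[N] term dominates, since uncertainty in how many sales occur swamps the spread of individual sale amounts.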

6.4 Summary

The expected value of a function of two random variables is given by the following:


7.3 Some Limit Theorems 343

hard to distinguish the χ² and Cornish-Fisher pdfs. The quantiles gotten from Cornish and Fisher's transformation are clearly much closer to the χ² quantiles for any desired k. Their scheme for adjusting standard normal quantiles can give great accuracy in approximating the quantiles for many continuous probability laws.


EXERCISES 7.3 1. Let X1, X2, ..., Xn, ... be independent geometric random variables, each with the same parameter p, and define the sequence of averages whose nth term is

X̄n = (X1 + X2 + ... + Xn)/n.

Will the sequence X̄1, X̄2, ..., X̄n, ... converge in probability? If so, to what value?
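A simulation sketch of Exercise 1: by the weak law of large numbers the averages should settle near E[Xi] = 1/p, assuming the support {1, 2, 3, ...} convention for the geometric law; p = .25 below is an illustrative choice:

```python
import random

# Weak law of large numbers for geometric(p) variables with support
# {1, 2, 3, ...}, so E[Xi] = 1/p.  p = .25 is an illustrative value.
p, n = 0.25, 100_000
rng = random.Random(7)

def geometric(p, rng):
    # Count Bernoulli(p) trials up to and including the first success.
    k = 1
    while rng.random() >= p:
        k += 1
    return k

xbar = sum(geometric(p, rng) for _ in range(n)) / n   # should be near 1/p = 4
```

The same recipe answers Exercises 2-4: each sequence of averages converges in probability to the common mean of the summands.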

2. Let U1, U2, ..., Un, ... be independent uniform random variables on the interval (a, b), and define the sequence of averages whose nth term is

Ūn = (U1 + U2 + ... + Un)/n.

Will the sequence Ū1, Ū2, ..., Ūn, ... converge in probability? If so, to what value?

3. Let V1, V2, ..., Vn, ... be independent Poisson random variables, each with mean λ, and define the sequence of averages whose nth term is

V̄n = (V1 + V2 + ... + Vn)/n.

Will the sequence V̄1, V̄2, ..., V̄n, ... converge in probability? If so, to what value?

4. Let X1, X2, ..., Xn, ... be independent negative binomial random variables, each with parameters r, p, and define the sequence of averages whose nth term is

X̄n = (X1 + X2 + ... + Xn)/n.

Will the sequence X̄1, X̄2, ..., X̄n, ... converge in probability? If so, to what value?

5. A real estate consultant assumes that the arrivals of telephone calls to a suburban office of a large real estate firm during business hours behave like a Poisson process with parameter λ. He counts the number of incoming telephone calls for n = 10 working days (each working day being 9 hours long) and observes the following sequence of numbers of calls received on these days: 25, 31, 29, 24, 18, 26, 23, 26, 31, 27. Based on these observed values, what might he guess to be the expected number of calls per business day?
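The consultant's natural guess is the sample mean of the ten daily counts, which the law of large numbers justifies as an estimate of the expected calls per 9-hour business day:

```python
# Exercise 5: the sample mean of the observed daily counts estimates
# the expected number of calls per business day (i.e., 9 * lambda).
calls = [25, 31, 29, 24, 18, 26, 23, 26, 31, 27]
daily_mean = sum(calls) / len(calls)
```

Here the sample mean is 26 calls per day, so the per-hour rate λ would be estimated as 26/9.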


344 Chapter 7: Transformations and Limit Theorems

6. A mathematical exercise. Let U1, U2, ..., Un, ... be independent uniform random variables on the interval (a, b), and define the sequence of maximum values whose nth term is U(n) = max(U1, U2, ..., Un). Will the sequence

U(1), U(2), ..., U(n), ...

converge in probability? If so, to what value?

7. Let W1, W2, ..., Wn, ... be independent gamma random variables, each with parameters m, λ, and define the sequence of averages whose nth term is

W̄n = (W1 + W2 + ... + Wn)/n.

Will the sequence W̄1, W̄2, ..., W̄n, ... converge in probability? If so, to what value?
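Exercise 6's limit can be seen numerically: for uniforms on (a, b), P(U(n) ≤ b − ε) = ((b − a − ε)/(b − a))ⁿ → 0, so the running maximum converges in probability to b. A sketch with illustrative values a = 0, b = 1:

```python
import random

# Running maximum of uniforms on (a, b): P(U_(n) <= b - eps) -> 0,
# so U_(n) converges in probability to b.  a = 0, b = 1 are
# illustrative choices, not from the exercise.
a, b, n = 0.0, 1.0, 50_000
rng = random.Random(3)
running_max = max(a + (b - a) * rng.random() for _ in range(n))
```

With n = 50,000 draws the maximum sits within a tiny fraction of b with overwhelming probability, since P(U(n) ≤ .999) = .999⁵⁰⁰⁰⁰ ≈ e⁻⁵⁰.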

8. Let Z1, Z2, ..., Zn, ... be independent standard normal random variables, and define the sequence of averages whose nth term is

Z̄n = (Z1 + Z2 + ... + Zn)/n.

Will the sequence Z̄1, Z̄2, ..., Z̄n, ... converge in probability? If so, to what value?

9. Let Z1, Z2, ..., Zn, ... be independent standard normal random variables, and define the sequence of averages of absolute values whose nth term is

(|Z1| + |Z2| + ... + |Zn|)/n.

Will the sequence of these averages converge in probability? If so, to what value?
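For Exercise 9, the weak law of large numbers sends the averages to E|Zi| = √(2/π) ≈ .798; a simulation sketch:

```python
import math
import random

# Averages of |Z_i| for standard normals converge in probability to
# E|Z| = sqrt(2/pi) by the weak law of large numbers.
n = 200_000
rng = random.Random(11)
abs_mean = sum(abs(rng.gauss(0.0, 1.0)) for _ in range(n)) / n
target = math.sqrt(2.0 / math.pi)   # about 0.798
```

The same pattern handles Exercise 10, where the averages of Zi² converge to E[Z²] = 1.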

10. Let Z1, Z2, ..., Zn, ... be independent standard normal random variables, and define the sequence of averages of squares of these values whose nth term is

(Z1² + Z2² + ... + Zn²)/n.

Will the sequence of these averages converge in probability? If so, to what value?

11. Let X1, X2, ..., Xn, ... be a sequence of independent exponential random variables, each with parameter λ. For each fixed n = 1, 2, 3, ... define Vn = min(X1, X2, ..., Xn) to be the smallest of the first n exponential random variables. Does the sequence V1, V2, ..., Vn, ... converge in mean square? If so, to what value?

12. As a model for the spacings between rings in a tree trunk, a biologist assumed that the observed spacings would be well modeled by a sequence



of exponential random variables X1, X2, ..., Xn, ..., where the parameter for Xn is nλ/(n + λ), the reciprocal of the sum of the reciprocals of n and λ, and where the rings are counted from the center outward; thus X1 is the distance from the center of the trunk to the first ring, X2 is the distance from the first ring to the second, and so on. Does the sequence X1, X2, ..., Xn, ... converge in distribution and, if so, to what probability law does it converge?

13. Let the sequence X1, X2, ..., Xn, ... be defined as described in Example 7.11, but now suppose that E[Xn] = rE[Xn−1], where 0 < r < 1; in this case the expected sales decrease in coming years. Show that the sequence

X1, X2, ..., Xn, ...

converges in distribution, and find the probability law to which it converges.

14. If X is Poisson with μ = 900, find the interval (900 − c, 900 + c) for which P(900 − c < X < 900 + c) = .99.
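A sketch of Exercise 14 using the normal approximation: X is approximately normal with mean 900 and standard deviation √900 = 30, so c ≈ z(.995) · 30. The bisection below recovers the .995 standard normal quantile from the error function, avoiding any table:

```python
import math

# Exercise 14: X ~ Poisson(900) is approximately normal with
# mean 900 and standard deviation sqrt(900) = 30.
mu = 900.0
sd = math.sqrt(mu)

def phi(z):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Bisect for the .995 quantile of the standard normal.
lo, hi = 0.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if phi(mid) < 0.995:
        lo = mid
    else:
        hi = mid
z995 = 0.5 * (lo + hi)   # about 2.576
c = z995 * sd            # about 77, so the interval is roughly (823, 977)
```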


15. Assume that during the rainy season in a certain location the probability of rain occurring each day is .6, and that separate days are well modeled as independent Bernoulli trials. Approximate the probability that it takes at least 100 days for the 50th rainy day to occur after the start of the rainy season.

16. As a gift you receive 100 lottery tickets. Assume that the probability each one wins some prize is .1, and approximate the probability that you will win more than 15 prizes with these tickets.

17. A New Zealand tree farm contains 1 square mile of land that was clear cut and then planted with 750,000 Monterey pine trees. Assume that the increase in trunk diameter (in inches) per year of any one of these trees, as measured 6 inches above the ground, is modeled as an exponential random variable with parameter λ = 1.1. Increases in trunk diameter, whether from year to year or from tree to tree, are assumed to be independent random variables.

a. What is a good approximate model for the trunk diameter of one of these trees that is 20 years old?

b. What proportion of 20-year-old trees would you expect to exceed 20 inches in trunk diameter?
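A sketch of Exercise 17: the 20-year diameter is a sum of 20 independent exponential(1.1) increments, i.e., gamma with parameters 20 and 1.1, which the central limit theorem makes approximately normal with mean 20/1.1 and variance 20/1.1²:

```python
import math

# Gamma(20, 1.1) model for the 20-year trunk diameter: mean n/lam,
# variance n/lam^2; the CLT justifies a normal approximation.
n, lam = 20, 1.1
mean_d = n / lam                  # about 18.2 inches
sd_d = math.sqrt(n) / lam         # about 4.07 inches

def phi(z):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Part b: approximate proportion of 20-year-old trees exceeding 20 inches.
p_exceed = 1.0 - phi((20 - mean_d) / sd_d)
```

Under this model roughly a third of the 20-year-old trees would exceed 20 inches in trunk diameter.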


18. For the situation described in Example 7.15, where X was assumed to be negative binomial with r = 20 and p = .1, evaluate the normal approximation to P(X < 190) using the continuity correction.

19. Assume that toll-free calls to the 800 number of a major car rental firm arrive at a rate of 400 per hour, like events in a Poisson process, between the hours of 6 AM and 9 PM (Eastern Standard Time). If X represents the number of such calls in a 1-hour period, approximate the value for P(375 < X < 425), both with and without the continuity correction.
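One reading of Exercise 19, treating "without the continuity correction" as plugging the endpoints 375 and 425 into the normal cdf directly, and "with" as using 375.5 and 424.5 for P(376 ≤ X ≤ 424):

```python
import math

# Exercise 19: X ~ Poisson(400), so mean 400 and sd sqrt(400) = 20.
mu = 400.0
sd = math.sqrt(mu)

def phi(z):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Without the continuity correction: endpoints used directly.
without_cc = phi((425 - mu) / sd) - phi((375 - mu) / sd)

# With it: P(375 < X < 425) = P(376 <= X <= 424) -> use 375.5 and 424.5.
with_cc = phi((424.5 - mu) / sd) - phi((375.5 - mu) / sd)
```

Both approximations land near .78, with the corrected value slightly smaller here because the correction narrows the interval.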

20. A major metropolitan taxi company has 300 taxis on the street during business hours. Assume the probability is .05 that any one of these taxis will have a breakdown of some sort on any given day; also assume that the breakdowns occur independently. Approximate the probability that



the taxi company will suffer at least 10 such breakdowns on any given day.

21. At the height of the season, assume that injuries requiring medical assistance occur on a ski slope between 10 AM and 3 PM at the rate of 2 per hour, like events in a Poisson process. Let X be the number of such injuries to occur during these hours in a 7-day week. Approximate the value for P(X < 60).
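A sketch of Exercise 21: the total intensity is 2 per hour × 5 hours per day × 7 days = 70, and P(X < 60) = P(X ≤ 59) can be approximated normally with the continuity correction:

```python
import math

# Exercise 21: 2 injuries/hour over the 5-hour window 10 AM - 3 PM,
# for 7 days, so X ~ Poisson(70).
lam = 2 * 5 * 7

def phi(z):
    # standard normal cdf via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# P(X < 60) = P(X <= 59); the continuity correction uses 59.5.
p_approx = phi((59.5 - lam) / math.sqrt(lam))
```

The approximation comes out near .10, i.e., a fairly unlikely quiet week under this model.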

22. Use the same heuristic reasoning employed in the text for the binomial to derive the rule that the normal approximation to the Poisson probability law should be "satisfactory" if the Poisson parameter is at least 9.

23. Suppose that X is binomial with parameters n and p = 1/2, and that n is very large and even; the probability that X equals its mean value is then

P(X = n/2) = C(n, n/2) (1/2)^n.

Show that if Stirling's formula is used for the factorials in this expression, the result is the normal approximation for P (X = n/2).
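Exercise 23 can be checked numerically before doing the Stirling algebra: the exact central probability and the value Stirling's formula produces, √(2/(πn)), which is also the normal approximation evaluated at the mean, agree closely for large even n:

```python
import math

# Exercise 23: compare the exact central binomial probability with
# sqrt(2/(pi*n)), the value Stirling's formula yields (and the normal
# approximation at the mean n/2).
n = 1000                                  # illustrative large even n
exact = math.comb(n, n // 2) / 2**n       # C(n, n/2) * (1/2)^n
stirling = math.sqrt(2.0 / (math.pi * n))
rel_err = abs(exact - stirling) / exact
```

For n = 1000 the two values agree to within a fraction of a percent; the relative error shrinks like 1/n.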

7.4 Summary

If Y1, Y2, ..., Yn are independent continuous random variables with the same cdf FY(t) and pdf fY(t), while Y(1), Y(2), ..., Y(n) represent their ordered values, then the cdf for Y(j) is

FY(j)(t) = Σ_{i=j}^{n} C(n, i) [FY(t)]^i [1 − FY(t)]^{n−i}, for j = 1, 2, ..., n,

where C(n, i) = n!/(i!(n − i)!).
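This cdf formula can be spot-checked by simulation; for uniform(0, 1) samples FY(t) = t, and n = 5, j = 2, t = .4 below are illustrative choices:

```python
import math
import random

# Order-statistic cdf check: for uniform(0,1) samples F_Y(t) = t, so
# P(Y_(j) <= t) = sum_{i=j}^{n} C(n, i) t^i (1 - t)^(n - i).
n, j, t = 5, 2, 0.4
formula = sum(math.comb(n, i) * t**i * (1.0 - t)**(n - i)
              for i in range(j, n + 1))

rng = random.Random(5)
trials = 100_000
hits = 0
for _ in range(trials):
    sample = sorted(rng.random() for _ in range(n))
    if sample[j - 1] <= t:     # Y_(j) is the j-th smallest value
        hits += 1
mc = hits / trials
```

The Monte Carlo frequency and the binomial sum agree to within sampling error, reflecting that Y(j) ≤ t exactly when at least j of the n observations fall at or below t.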

The pdf for Y(j) is given by

fY(j)(t) = n fY(t) C(n − 1, j − 1) [FY(t)]^{j−1} [1 − FY(t)]^{n−j}.

The joint pdf for Y(i), Y(j), i < j, is

fY(i),Y(j)(t1, t2) = n fY(t1) (n − 1) fY(t2) C(n − 2; i − 1, j − i − 1, n − j) [FY(t1)]^{i−1} [FY(t2) − FY(t1)]^{j−i−1} [1 − FY(t2)]^{n−j}, for t1 < t2,

where C(n − 2; i − 1, j − i − 1, n − j) = (n − 2)!/((i − 1)!(j − i − 1)!(n − j)!) is the multinomial coefficient.

If Y = g(X1, X2) then the conditional cdf for Y, given X2 = x2, is

FY|X2(t | x2) = ∫ fX1|X2(x1 | x2) dx1, the integral taken over {x1 : g(x1, x2) ≤ t}.

