Bivariate Normal Property

Date post: 14-Apr-2018
Upload: san-li
  • 7/27/2019 Bivariate Normal Property

    Example 3.7.13

    Suppose P ~ Beta(3, 2) and Y | P = p ~ Geo(p). Note that

        E(Y | P = p) = (1 - p)/p = 1/p - 1,    so    E(Y | P) = 1/P - 1

        Var(Y | P = p) = (1 - p)/p^2 = 1/p^2 - 1/p,    so    Var(Y | P) = 1/P^2 - 1/P

    You can show that E(1/P) = 2 and E(1/P^2) = 6. Therefore,

        Var(Y) = Var[E(Y | P)] + E[Var(Y | P)]
               = Var(1/P - 1) + E(1/P^2 - 1/P)
               = Var(1/P) + E(1/P^2) - E(1/P)
               = E(1/P^2) - E^2(1/P) + E(1/P^2) - E(1/P)
               = 6 - 2^2 + 6 - 2
               = 6
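As a sanity check, the two quoted moments E(1/P) = 2 and E(1/P^2) = 6, and the resulting variance, can be verified numerically. A minimal sketch in Python (standard library only), integrating against the Beta(3, 2) density 12 p^2 (1 - p) with the composite midpoint rule; the grid size N is an arbitrary choice:

```python
# Check E(1/P) = 2 and E(1/P^2) = 6 for P ~ Beta(3, 2), whose density is
# f(p) = 12 p^2 (1 - p) on (0, 1), using the composite midpoint rule.
N = 100_000            # number of subintervals (arbitrary, just needs to be large)
h = 1.0 / N
E_inv_P = 0.0
E_inv_P2 = 0.0
for k in range(N):
    p = (k + 0.5) * h               # midpoint of the k-th subinterval
    f = 12.0 * p**2 * (1.0 - p)     # Beta(3, 2) density at p
    E_inv_P += (f / p) * h
    E_inv_P2 += (f / p**2) * h

# Var(Y) = Var(1/P) + E(1/P^2) - E(1/P), as in the derivation above
var_Y = (E_inv_P2 - E_inv_P**2) + (E_inv_P2 - E_inv_P)
print(round(E_inv_P, 4), round(E_inv_P2, 4), round(var_Y, 4))  # 2.0 6.0 6.0
```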


    Double Expectation Theorem and Law of Total Variance

    (Double Expectation Theorem) E[E(Y|X)] = E(Y)

    Proof (when X and Y are continuous):

        E[E(Y|X)] = ∫ E(Y | X = x) f1(x) dx
                  = ∫ [ ∫ y f2(y|x) dy ] f1(x) dx
                  = ∫∫ y f2(y|x) f1(x) dy dx
                  = ∫∫ y f(x, y) dy dx
                  = ∫∫ y f(x, y) dx dy    (switching the order of the integrals)
                  = ∫ y [ ∫ f(x, y) dx ] dy
                  = ∫ y f2(y) dy
                  = E(Y)

    (Law of Total Variance) Var(Y) = E[Var(Y|X)] + Var[E(Y|X)]

    Proof:

        Var(Y) = E(Y^2) - E^2(Y) = E[E(Y^2|X)] - E^2[E(Y|X)]
               = E[Var(Y|X) + E^2(Y|X)] - E^2[E(Y|X)]
               = E[Var(Y|X)] + { E[E^2(Y|X)] - E^2[E(Y|X)] }
               = E[Var(Y|X)] + Var[E(Y|X)]
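Both identities can be checked exactly by enumeration on a small discrete example. A sketch in Python; the joint pmf below is a made-up illustration, not taken from the notes:

```python
# Verify E[E(Y|X)] = E(Y) and Var(Y) = E[Var(Y|X)] + Var[E(Y|X)]
# by exact enumeration over a small made-up joint pmf on {0,1} x {0,1,2}.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
       (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.2}   # probabilities sum to 1

xs = {x for x, _ in pmf}

# Marginal of X and conditional moments of Y given X = x
pX = {x: sum(pr for (a, _), pr in pmf.items() if a == x) for x in xs}
EY_given = {x: sum(y * pr for (a, y), pr in pmf.items() if a == x) / pX[x] for x in xs}
EY2_given = {x: sum(y**2 * pr for (a, y), pr in pmf.items() if a == x) / pX[x] for x in xs}
VarY_given = {x: EY2_given[x] - EY_given[x]**2 for x in xs}

# Double expectation theorem: E[E(Y|X)] = E(Y)
EY = sum(y * pr for (_, y), pr in pmf.items())
E_of_cond_mean = sum(EY_given[x] * pX[x] for x in xs)

# Law of total variance: Var(Y) = E[Var(Y|X)] + Var[E(Y|X)]
VarY = sum(y**2 * pr for (_, y), pr in pmf.items()) - EY**2
E_condvar = sum(VarY_given[x] * pX[x] for x in xs)
Var_condmean = sum(EY_given[x]**2 * pX[x] for x in xs) - E_of_cond_mean**2

print(round(EY, 6), round(E_of_cond_mean, 6))                # the two agree
print(round(VarY, 6), round(E_condvar + Var_condmean, 6))    # the two agree
```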


    Stat 330 - Tutorial 6

    1. Suppose X ~ χ^2(n) and Y ~ χ^2(m), where X and Y are independent. Show that

           U = (X/n) / (Y/m) ~ F(n, m).

       That is, show that U has the pdf

           g1(u) = [Γ((n+m)/2) / (Γ(n/2) Γ(m/2))] (n/m)^(n/2) u^(n/2 - 1) [1 + (n/m)u]^(-(n+m)/2),    u > 0.
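As a quick consistency check, g1(u) should integrate to 1 over (0, ∞). A numerical sketch in Python (standard library only); the degrees of freedom n = 4, m = 6 and the truncation point B are arbitrary choices:

```python
import math

# Check that g1(u) integrates to 1 for one concrete choice of degrees of
# freedom (n = 4, m = 6 is an arbitrary test case).
n, m = 4, 6
c = math.gamma((n + m) / 2) / (math.gamma(n / 2) * math.gamma(m / 2)) * (n / m) ** (n / 2)

def g1(u):
    return c * u ** (n / 2 - 1) * (1 + (n / m) * u) ** (-(n + m) / 2)

# Midpoint rule on (0, B]; for m = 6 the density decays like u^(-4),
# so the tail mass beyond B = 2000 is negligible.
B, N = 2000.0, 400_000
h = B / N
total = sum(g1((k + 0.5) * h) for k in range(N)) * h
print(round(total, 4))  # 1.0
```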

    2. Suppose X and Y are continuous random variables with joint pdf

           f(x, y) = 2(x + y),    0 < x < y < 1.

       (a) Find the joint pdf of U = X and V = XY. Be sure to specify the support of (U, V).

       (b) Find the marginal pdf of V. Be sure to specify the support of V.

    3. (Additivity of the Poisson distribution) If Xi ~ POI(λi), i = 1, ..., n, and the Xi's
       are independent, show that

           X1 + X2 + ... + Xn ~ POI(λ1 + λ2 + ... + λn).
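The Poisson additivity claim can be checked numerically by convolving two Poisson pmfs directly. A sketch in Python; the rates l1 and l2 are arbitrary test values:

```python
import math

# Convolve the pmfs of POI(l1) and POI(l2) and compare with POI(l1 + l2)
# (the rates l1, l2 below are arbitrary test values).
l1, l2 = 1.5, 2.5

def poi_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

# P(X1 + X2 = s) = sum over j of P(X1 = j) P(X2 = s - j)
for s in range(10):
    conv = sum(poi_pmf(j, l1) * poi_pmf(s - j, l2) for j in range(s + 1))
    assert abs(conv - poi_pmf(s, l1 + l2)) < 1e-12
print("convolution of POI(1.5) and POI(2.5) matches POI(4.0)")
```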

    4. (Additivity of the Binomial distribution) If Xi ~ BIN(mi, p), i = 1, ..., n, and the Xi's
       are independent, show that

           X1 + X2 + ... + Xn ~ BIN(m1 + m2 + ... + mn, p).
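The binomial case admits the same kind of numerical check via the convolution formula (this is Vandermonde's identity in disguise). A sketch in Python; m1, m2, and p are arbitrary test values:

```python
import math

# Convolve the pmfs of BIN(m1, p) and BIN(m2, p) and compare with
# BIN(m1 + m2, p) (m1, m2, p below are arbitrary test values).
m1, m2, p = 3, 5, 0.3

def bin_pmf(k, m, p):
    if k < 0 or k > m:
        return 0.0
    return math.comb(m, k) * p**k * (1 - p) ** (m - k)

for s in range(m1 + m2 + 1):
    conv = sum(bin_pmf(j, m1, p) * bin_pmf(s - j, m2, p) for j in range(s + 1))
    assert abs(conv - bin_pmf(s, m1 + m2, p)) < 1e-12
print("convolution of BIN(3, 0.3) and BIN(5, 0.3) matches BIN(8, 0.3)")
```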


    Stat 330 - Tutorial 5

    1. (Minimum Mean Squared Error) Let h(X) be any function of X and let g(X) = E(Y|X).
       Show that g(X) minimizes the mean squared error, that is,

           E[(Y - g(X))^2] ≤ E[(Y - h(X))^2].

       Hint: Write E[(Y - h(X))^2] = E[(A + B)^2], where A = Y - g(X) and B = g(X) - h(X).
       Expand the quadratic and simplify.

    2. Suppose X1, ..., Xn are i.i.d. random variables with m.g.f. M(t), E(Xi) = μ and
       Var(Xi) = σ^2 < ∞. Find the m.g.f. of Z = √n (X̄ - μ)/σ, where X̄ = (1/n)(X1 + ... + Xn).

    3. (Past SOA Exam P Question) Once a fire is reported to a fire insurance company, the
       company makes an initial estimate, X, of the amount it will pay to the claimant for the
       fire loss. When the claim is finally settled, the company pays an amount, Y, to the
       claimant. The company has determined that X and Y have the joint density function

           f(x, y) = [2 / (x^2 (x - 1))] y^(-(2x - 1)/(x - 1)),    x > 1, y > 1.

       Given that the initial claim estimated by the company is 2, determine the probability
       that the final settlement amount is between 1 and 3.

  • 7/27/2019 Bivariate Normal Property

    12/14

    Stat 330 - Tutorial 5 Solution

    1. Let A = Y - g(X) and B = g(X) - h(X), so that Y - h(X) = A + B. Then

           E[(Y - h(X))^2] = E[(Y - g(X)) + (g(X) - h(X))]^2
                           = E[(Y - g(X))^2] + 2 E[(Y - g(X))(g(X) - h(X))] + E[(g(X) - h(X))^2]

       Since E[(g(X) - h(X))^2] ≥ 0 (expectation of a squared term),

           E[(Y - h(X))^2] ≥ E[(Y - g(X))^2] + 2 E[(Y - g(X))(g(X) - h(X))]

       Consider the second term:

           E[(Y - g(X))(g(X) - h(X))] = E{ E[(Y - g(X))(g(X) - h(X)) | X] }

       by the double expectation theorem (aside: E[h(Y)] = E{E[h(Y)|X]}). Now consider the
       term in the curly brackets on the right-hand side, that is,
       E[(Y - g(X))(g(X) - h(X)) | X]. Conditioning on X = x, we have

           E[(Y - g(X))(g(X) - h(X)) | X = x] = E[(Y - g(x))(g(x) - h(x)) | X = x]
                                              = (g(x) - h(x)) E[Y - g(x) | X = x]
                                              = (g(x) - h(x)) [E(Y | X = x) - g(x)]
                                              = (g(x) - h(x)) · 0
                                              = 0,

       where the second equality holds because, given X = x, both g(x) = E(Y|X = x) and h(x)
       are real-valued (not random) functions of x, and the last two equalities use
       E(Y | X = x) = g(x).

       Therefore E[(Y - g(X))(g(X) - h(X)) | X] = 0, and so

           E[(Y - g(X))(g(X) - h(X))] = E{ E[(Y - g(X))(g(X) - h(X)) | X] } = E[0] = 0.

       Overall, we have

           E[(Y - h(X))^2] ≥ E[(Y - g(X))^2] + 2 E[(Y - g(X))(g(X) - h(X))]
                           = E[(Y - g(X))^2] + 2 · 0
                           = E[(Y - g(X))^2],

       as required.
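The minimization claim can be illustrated by exact enumeration on a small discrete example. A sketch in Python; both the joint pmf and the competitor h below are made-up illustrations:

```python
# Compare the MSE of g(X) = E(Y|X) against an arbitrary competitor h(X),
# by exact enumeration over a small made-up joint pmf.
pmf = {(0, 1): 0.2, (0, 3): 0.3, (1, 2): 0.1, (1, 5): 0.4}

# Marginal of X and the conditional mean g(x) = E(Y | X = x)
pX = {}
for (x, _), pr in pmf.items():
    pX[x] = pX.get(x, 0.0) + pr
g = {x: sum(y * pr for (a, y), pr in pmf.items() if a == x) / pX[x] for x in pX}

def mse(predictor):
    return sum((y - predictor[x]) ** 2 * pr for (x, y), pr in pmf.items())

h = {0: 2.0, 1: 3.0}   # an arbitrary alternative predictor
print(round(mse(g), 4), round(mse(h), 4))  # 1.2 2.2 -- g(X) has the smaller MSE
```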

    2. M.g.f. of T = X1 + X2 + ... + Xn:

           MT(t) = E(e^{tT}) = E(e^{t(X1 + X2 + ... + Xn)})
                 = E(e^{tX1}) E(e^{tX2}) ... E(e^{tXn})
                 = M(t) M(t) ... M(t) = [M(t)]^n

       M.g.f. of X̄ = T/n:

           MX̄(t) = E(e^{tX̄}) = E(e^{tT/n}) = E(e^{(t/n)T}) = MT(t/n) = [M(t/n)]^n

       M.g.f. of Z = √n (X̄ - μ)/σ:

           MZ(t) = E(e^{tZ}) = E(e^{√n (X̄ - μ) t / σ})
                 = e^{-√n μ t / σ} E(e^{[(√n / σ) t] X̄})
                 = e^{-√n μ t / σ} MX̄(√n t / σ)
                 = e^{-√n μ t / σ} [M(t / (√n σ))]^n

    Aside:

    When the Xi are i.i.d. N(μ, σ^2), M(t) = e^{μt + σ^2 t^2 / 2}.

    Therefore, using the above expression,

        MZ(t) = e^{-√n μ t / σ} [M(t / (√n σ))]^n
              = e^{-√n μ t / σ} [exp( μ t / (√n σ) + (σ^2 / 2)(t / (√n σ))^2 )]^n
              = e^{-√n μ t / σ} exp( √n μ t / σ + t^2 / 2 )
              = exp(t^2 / 2),

    which is the m.g.f. of a N(0, 1) r.v. By the uniqueness theorem, Z ~ N(0, 1).
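The final identity in the aside can be checked numerically by plugging the normal m.g.f. into the general expression for MZ(t). A sketch in Python; μ = 3, σ = 2, n = 7, and the test points t are arbitrary choices:

```python
import math

# Plug the normal m.g.f. M(t) = exp(mu t + sigma^2 t^2 / 2) into
# MZ(t) = e^{-sqrt(n) mu t / sigma} [M(t / (sqrt(n) sigma))]^n and compare
# with exp(t^2 / 2) (mu, sigma, n, and the test points are arbitrary choices).
mu, sigma, n = 3.0, 2.0, 7

def M(t):
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

def MZ(t):
    return math.exp(-math.sqrt(n) * mu * t / sigma) * M(t / (math.sqrt(n) * sigma)) ** n

for t in (-1.0, -0.5, 0.5, 1.0):
    assert abs(MZ(t) - math.exp(0.5 * t**2)) < 1e-9
print("MZ(t) agrees with exp(t^2 / 2) at all test points")
```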

    3. Joint pdf of X and Y:

           f(x, y) = [2 / (x^2 (x - 1))] y^(-(2x - 1)/(x - 1)),    x > 1, y > 1.

       Objective: find P(1 < Y < 3 | X = 2).

       Step 1: Find the conditional distribution of Y given X = 2:

           f2(y|2) = f(2, y) / f1(2) = 0.5 y^(-3) / f1(2).

       Since

           f1(2) = ∫_1^∞ f(2, y) dy = ∫_1^∞ 0.5 y^(-3) dy = 0.25,

       we have

           f2(y|2) = 0.5 y^(-3) / 0.25 = 2 y^(-3),    y > 1.

       Finally,

           P(1 < Y < 3 | X = 2) = ∫_1^3 2 y^(-3) dy = [-y^(-2)]_1^3 = 1 - 1/9 = 8/9.
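Both integrals in this solution can be confirmed numerically with the midpoint rule. A sketch in Python (standard library only); the truncation point B for the improper integral is an arbitrary choice, and the neglected tail mass is 0.25 / B^2:

```python
# Check f1(2) = 0.25 and P(1 < Y < 3 | X = 2) = 8/9 by midpoint-rule
# integration of f(2, y) = 0.5 y^(-3).
def f2y(y):
    return 0.5 * y ** -3     # f(2, y) from the solution above

# f1(2): integrate f(2, y) over (1, B]; the tail beyond B is 0.25 / B^2.
B, N = 2000.0, 200_000
h = (B - 1.0) / N
f1_2 = sum(f2y(1.0 + (k + 0.5) * h) for k in range(N)) * h

# P(1 < Y < 3 | X = 2): integrate the conditional density 2 y^(-3) over (1, 3).
Np = 100_000
hp = 2.0 / Np
prob = sum(2.0 * (1.0 + (k + 0.5) * hp) ** -3 for k in range(Np)) * hp

print(round(f1_2, 4), round(prob, 6))  # 0.25 0.888889
```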
