
Outline: Summary for the Day ⋄ Section 4.7 ⋄ Sections 3.8 and 4.8 ⋄ Section 4.10 ⋄ Clicker Questions

Math 341: Probability
Thirteenth Lecture (10/27/09)

Steven J. Miller, Williams College

Steven.J.Miller@williams.edu
http://www.williams.edu/go/math/sjmiller/public_html/341/

Bronfman Science Center
Williams College, October 27, 2009


Summary for the Day


Summary for the day

Change of variable formulas:
⋄ Review of Jacobians.
⋄ Joint density of functions of random variables.

Sums of random variables:
⋄ Convolution.
⋄ Properties of convolution.
⋄ Poisson example.

Distributions from Normal:
⋄ Sample mean and variance.
⋄ Central Limit Theorem and Testing.
⋄ Pepys' Problem.


Section 4.7: Functions of Random Variables


One-dimension

Change of variable formula

Let g be a strictly increasing function with inverse h, and set Y = g(X); then fY(y) = fX(h(y)) h′(y).

Proof: ℙ(Y ≤ y) = ℙ(g(X) ≤ y) = ℙ(X ≤ g−1(y)) = FX(g−1(y)) = FX(h(y)).

Differentiating, fY(y) = FX′(h(y)) h′(y) = fX(h(y)) h′(y).

As g(h(y)) = y, g′(h(y)) h′(y) = 1, so h′(y) = 1/g′(h(y)).
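The one-dimensional formula is easy to sanity-check numerically. A minimal sketch (my own example, not from the lecture): take X ∼ Uniform(0, 1) and g(x) = x², which is strictly increasing on (0, 1), so h(y) = √y and the formula predicts fY(y) = fX(√y) · 1/(2√y) = 1/(2√y).

```python
import random, math

random.seed(0)

# Predicted density of Y = X^2 for X ~ Uniform(0,1):
# f_Y(y) = f_X(h(y)) * h'(y) = 1 * 1/(2*sqrt(y))
def fY(y):
    return 1.0 / (2.0 * math.sqrt(y))

N = 200_000
ys = [random.random() ** 2 for _ in range(N)]

# Compare the empirical probability of an interval with the integral of f_Y.
a, b = 0.25, 0.5
emp = sum(a < y <= b for y in ys) / N
exact = math.sqrt(b) - math.sqrt(a)   # integral of 1/(2 sqrt(y)) from a to b
print(emp, exact)
```

The two printed numbers agree to about two decimal places, as expected for a Monte Carlo estimate with 200,000 samples.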


Review of Jacobian

Definition of the Jacobian
Given variables (x1, x2) that are transformed to (y1, y2) by

T(x1, x2) = (y1(x1, x2), y2(x1, x2))

and inverse mapping

T−1(y1, y2) = (x1(y1, y2), x2(y1, y2)),

the Jacobian is defined by the determinant

J(y1, y2) = | ∂x1/∂y1   ∂x1/∂y2 |
            | ∂x2/∂y1   ∂x2/∂y2 |.


Review of Jacobian

Note

| a  b |
| c  d | = ad − bc.

Use: dx1 dx2 → |J| dy1 dy2 (tells us how the volume element is transformed).


Example of Jacobian

Polar Coordinates

x1(r, θ) = r cos θ,   x2(r, θ) = r sin θ.

Calculating gives

J = | cos θ      sin θ   |
    | −r sin θ   r cos θ | = r.

Thus dx1 dx2 → r dr dθ.
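The factor r can be checked by computing the area of the unit disk with the polar volume element; a short sketch (not from the slides):

```python
import math

# Area of the unit disk via the polar change of variables:
# dx1 dx2 -> r dr dθ, so area = ∫_0^{2π} dθ ∫_0^1 r dr = 2π · (1/2) = π.
nr = 1000
dr = 1.0 / nr
# Midpoint rule for ∫_0^1 r dr; exact for a linear integrand, up to rounding.
radial = sum((i + 0.5) * dr * dr for i in range(nr))
area = 2 * math.pi * radial
print(area, math.pi)
```

Without the Jacobian factor r, the same computation would give 2π instead of π, which is one way to remember that the volume element is r dr dθ and not dr dθ.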


Change of Variable Theorem

Theorem
Let fX1,X2 be the joint density of X1 and X2, and let (Y1, Y2) = T(X1, X2) with Jacobian J. For points in the range of T,

fY1,Y2(y1, y2) = fX1,X2(x1(y1, y2), x2(y1, y2)) |J(y1, y2)|.

Example: X1, X2 independent Exponential(λ). Find the joint density of Y1 = X1 + X2, Y2 = X1/X2. Answer:

fY1,Y2(y1, y2) = λ² y1 e^(−λy1) · 1/(1 + y2)².

If instead we had Y1 = X1 + X2 and Y3 = X1 − X2, we would find

fY1,Y3(y1, y3) = (λ²/2) e^(−λy1)  for |y3| ≤ y1.
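Two consequences of the first answer are easy to check by simulation (a sketch of my own, with an assumed rate λ = 1.5): the y1-factor is the Gamma(2, λ) density, so E[Y1] = 2/λ, and integrating the y2-factor gives P(Y2 ≤ 1) = ∫₀¹ dt/(1 + t)² = 1/2.

```python
import random

random.seed(1)
lam = 1.5          # assumed rate λ; any positive value works
N = 200_000

tot = 0.0
count_le_1 = 0
for _ in range(N):
    x1 = random.expovariate(lam)
    x2 = random.expovariate(lam)
    tot += x1 + x2                   # Y1 = X1 + X2 is Gamma(2, λ): E[Y1] = 2/λ
    count_le_1 += (x1 / x2) <= 1.0   # Y2 = X1/X2 has density 1/(1+y2)^2: P(Y2 <= 1) = 1/2

print(tot / N, 2 / lam)     # empirical mean of Y1 vs exact 2/λ
print(count_le_1 / N)       # empirical P(Y2 <= 1), close to 0.5
```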


Strange Example

Let X1, X2 be independent Exponential(λ). Compute the conditional density of X1 + X2 given X1 = X2.

One solution is to use Y1, Y2 from above; another is to use Y1, Y3.

Note {X1 = X2} is a null event; the two parametrizations describe it differently, and conditioning on it can give different answers.


Sections 3.8 and 4.8: Sums of Random Variables


Example

X1, X2 independent Uniform(0, 1). What is the distribution of X1 + X2?

Build intuition: extreme examples.

Consider the discrete analogue: the sum of two dice.

Answer: triangle from 0 to 2 with maximum at 1.


Convolution

Definition

(f ∗ g)(x) := ∫_{−∞}^{∞} f(t) g(x − t) dt.

Interpretation: if X and Y are independent with densities f and g, then the density of X + Y is f ∗ g.

Revisit sum of uniforms.
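Evaluating the convolution integral numerically for f = g = the Uniform(0, 1) density recovers the triangle from the previous slide; a sketch (my own, not the lecture's code):

```python
# Density of Uniform(0, 1)
def f(t):
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def conv(x, n=10_000):
    # Midpoint Riemann sum for (f * f)(x) = ∫ f(t) f(x - t) dt;
    # the integrand is supported on t in [0, 1].
    h = 1.0 / n
    return sum(f((i + 0.5) * h) * f(x - (i + 0.5) * h) for i in range(n)) * h

# Triangular density: rises linearly from 0 at x = 0 to 1 at x = 1,
# then falls back to 0 at x = 2.
print(conv(0.5), conv(1.0), conv(1.5))
```

The printed values are 0.5, 1.0, 0.5 (up to floating-point rounding), matching the triangle with peak 1 at x = 1.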


Properties of the convolution

Lemma
⋄ f ∗ g = g ∗ f.

⋄ The Fourier transform turns convolution into multiplication: (f ∗ g)^(ξ) = f̂(ξ) · ĝ(ξ), where

f̂(ξ) = ∫_{−∞}^{∞} f(x) e^(−2πixξ) dx

is the Fourier transform of f.

⋄ f ∗ δ = f, where δ is the Dirac delta functional.

⋄ f ∗ (g ∗ h) = (f ∗ g) ∗ h.
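The commutativity and Fourier-multiplication properties have exact discrete analogues (circular convolution and the discrete Fourier transform), which can be verified directly; a sketch with a naive DFT, assuming nothing beyond the standard library:

```python
import cmath

def dft(a):
    # Naive discrete Fourier transform of a length-n sequence.
    n = len(a)
    return [sum(a[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def circ_conv(a, b):
    # Circular convolution: (a * b)[j] = sum_k a[k] b[(j - k) mod n].
    n = len(a)
    return [sum(a[k] * b[(j - k) % n] for k in range(n)) for j in range(n)]

a = [1.0, 2.0, 0.0, -1.0]
b = [0.5, 0.0, 1.0, 0.25]

ab, ba = circ_conv(a, b), circ_conv(b, a)     # commutativity: a * b == b * a

lhs = dft(ab)                                  # DFT of the convolution ...
rhs = [x * y for x, y in zip(dft(a), dft(b))]  # ... equals the pointwise product
print(max(abs(x - y) for x, y in zip(lhs, rhs)))  # ~0 up to rounding
```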


Example

If X1 ∼ Poisson(λ1) and X2 ∼ Poisson(λ2) are independent, then X1 + X2 is Poisson(λ1 + λ2).

Proof: evaluate the (discrete) convolution, using the binomial theorem.
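The discrete convolution in the proof can be evaluated numerically and compared term by term with the Poisson(λ1 + λ2) mass function; a sketch with assumed rates λ1 = 2.0 and λ2 = 3.5:

```python
import math

def pois_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam1, lam2 = 2.0, 3.5
for k in range(10):
    # Discrete convolution: P(X1 + X2 = k) = Σ_j P(X1 = j) P(X2 = k − j).
    conv = sum(pois_pmf(j, lam1) * pois_pmf(k - j, lam2) for j in range(k + 1))
    exact = pois_pmf(k, lam1 + lam2)
    assert abs(conv - exact) < 1e-12
print("convolution of Poisson(2.0) and Poisson(3.5) matches Poisson(5.5)")
```

The inner sum is exactly the binomial-theorem step of the proof: pulling out e^(−λ1−λ2)/k! leaves Σ C(k, j) λ1^j λ2^(k−j) = (λ1 + λ2)^k.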


Section 4.10: Distributions from the Normal


Standard results and definitions

If X ∼ N(0, 1) then X² is chi-square with 1 degree of freedom.

Sample mean: X̄ := (1/n) ∑_{i=1}^n Xi.

Sample variance: S² := (1/(n − 1)) ∑_{i=1}^n (Xi − X̄)².
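The two definitions translate directly into code; note the n − 1 denominator in the sample variance:

```python
def sample_mean(xs):
    return sum(xs) / len(xs)

def sample_variance(xs):
    # n − 1 denominator, which makes S² an unbiased estimator of σ²
    xbar = sample_mean(xs)
    return sum((x - xbar) ** 2 for x in xs) / (len(xs) - 1)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(sample_mean(data), sample_variance(data))  # 5.0 and 32/7
```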


Main theorem

Sums of normal random variables

Let X1, . . . , Xn be i.i.d. N(μ, σ²). Then

⋄ X̄ ∼ N(μ, σ²/n).

⋄ (n − 1)S²/σ² is chi-square with n − 1 degrees of freedom. (Easier proof with convolutions?)

⋄ X̄ and S² are independent.

⋄ Central Limit Theorem: X̄ ∼ N(μ, σ²/n) (exact here, since the Xi are normal).
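The first bullet, and the unbiasedness of S², can be checked by simulation; a sketch with assumed parameters μ = 3, σ = 2, n = 10:

```python
import random, statistics

random.seed(2)
mu, sigma, n, trials = 3.0, 2.0, 10, 20_000

xbars, s2s = [], []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbars.append(statistics.fmean(xs))
    s2s.append(statistics.variance(xs))   # sample variance, n − 1 denominator

print(statistics.fmean(xbars), mu)               # mean of X̄, close to μ
print(statistics.variance(xbars), sigma**2 / n)  # Var(X̄), close to σ²/n = 0.4
print(statistics.fmean(s2s), sigma**2)           # E[S²], close to σ² = 4
```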


Clicker Questions


Pepys’ Problem

Problem Statement
Alice and Bob decide to wager on rolls of dice. Alice rolls 6n fair dice and wins if she gets at least n sixes, while Bob wins if she fails. What n should Alice choose to maximize her chance of winning?

(a) 1
(b) 2
(c) 6
(d) 10
(e) 20
(f) 341
(g) The larger n is, the greater chance she has of winning.

Pepys' Problem (continued): simulations

[Figures: for each n = 10, 20, 30, . . . , 200, a histogram of (#6s − expected)/StDev over 1000 simulations, bin size = .25.]

Pepys' Problem (continued): probability versus n

[Figure: Alice's winning probability plotted against n, for n up to 1000; the plotted values lie between roughly 0.505 and 0.52.]
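Alice's winning probability can also be computed exactly rather than simulated: the number of sixes in 6n fair dice is Binomial(6n, 1/6), so she wins with probability P(Bin(6n, 1/6) ≥ n). A short exact computation (not in the slides):

```python
from math import comb

def p_at_least_n_sixes(n):
    # P(Bin(6n, 1/6) >= n): Alice's probability of at least n sixes in 6n dice.
    total = 6 * n
    return sum(comb(total, k) * (1 / 6) ** k * (5 / 6) ** (total - k)
               for k in range(n, total + 1))

for n in (1, 2, 3, 6, 10):
    print(n, round(p_at_least_n_sixes(n), 4))
# The probability is largest at n = 1 (1 − (5/6)^6 ≈ 0.6651)
# and decreases toward 1/2 as n grows, matching the plot above.
```

So answer (a), n = 1, maximizes Alice's chance of winning, and option (g) is exactly backwards.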