Page 1

What’s Happening in Selective Inference III?

Emmanuel Candès, Stanford University

The 2017 Wald Lectures, Joint Statistical Meetings, Baltimore, August 2017

Page 2

Lecture 3: Special dedication

Maryam Mirzakhani 1977–2017

“Life is not supposed to be easy”

Page 3

Knockoffs: Power Analysis

Joint with A. Weinstein and R. Barber

Page 4

Knockoffs: wrapper around a black box

Can we analyze power?
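For concreteness, a minimal sketch of the wrapper in action, using the knockoff R package; the Gaussian design and signal below are placeholder choices, not the talk's example.

library(knockoff)
set.seed(1)
n <- 500; p <- 100
X <- matrix(rnorm(n * p), n, p)                  # model-X Gaussian design
y <- as.numeric(X[, 1:10] %*% rep(2, 10) + rnorm(n))  # 10 true signals
result <- knockoff.filter(X, y, fdr = 0.1)       # black-box importance statistics inside
result$selected                                  # selected features, FDR controlled at 10%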

Page 5

Case study

y = Xβ + ε,   Xᵢⱼ iid∼ N(0, 1/n),   εᵢ iid∼ N(0, 1),   βⱼ iid∼ Π = (1 − ε)δ₀ + ε Π⋆

Feature importance: Zⱼ = sup{λ : β̂ⱼ(λ) ≠ 0}

Can carry out theoretical calculations when

n, p → ∞,   n/p → δ,

thanks to the powerful Approximate Message Passing (AMP) theory of Bayati–Montanari ('12) (see also Su, Bogdan & C., '15)
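A short simulation sketch of this case study (illustrative assumptions: δ = 1 and Π⋆ = N(2, 1), which the slide leaves unspecified); Zⱼ is approximated on glmnet's λ grid.

library(glmnet)
set.seed(1)
n <- 1000; p <- 1000; eps <- 0.2                        # delta = n/p = 1
X <- matrix(rnorm(n * p, sd = 1 / sqrt(n)), n, p)
beta <- ifelse(runif(p) < eps, rnorm(p, mean = 2), 0)   # Pi = (1 - eps) delta_0 + eps N(2, 1)
y <- as.numeric(X %*% beta + rnorm(n))
fit <- glmnet(X, y, standardize = FALSE, intercept = FALSE)
Z <- apply(as.matrix(fit$beta) != 0, 1, function(nz)    # entry time of each feature:
  if (any(nz)) max(fit$lambda[nz]) else 0)              # largest lambda with betahat_j != 0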


Pages 8–10

[Figure: TDP vs. FDP tradeoff for Π* = 0.7N(0,1)+0.3N(2,1), δ = 1, ε = 0.2, σ = 0.5: the oracle curve, the knockoff curve, and knockoff operating points at q = 0.05, 0.1, 0.3]

Page 11

[Figure: TDP vs. FDP for oracle and knockoff, with knockoff operating points at several q, across four signal priors: Π* = δ50; Π* = 0.7N(0,1)+0.3N(2,1); Π* = 0.5δ0.1+0.5δ50; Π* = exp(λ)=0.2]

Page 12

[Figure: TDP of knockoff vs. TDP of the oracle at several q]

Figure: Π⋆ = δ50 (left) and Π⋆ = exp(1) (right)

Page 13

Consequence of new scientific paradigm

Collect data first =⇒ Ask questions later

Textbook practice

(1) Select hypotheses/model/question

(2) Collect data

(3) Perform inference

Modern practice

(1) Collect data

(2) Select hypotheses/model/questions

(3) Perform inference

2017 Wald Lectures: Explain how I and others are responding

Explain various facets of the selective inference problem

Contribute to enhanced statistical reasoning

Page 15

Model selection in practice

> model = lm(y ~ . , data = X)

> model.AIC = stepAIC(model,direction="both")

> summary(model.AIC)

Call:

lm(formula = y ~ V1 + V2 + V5 + V7 + V8 + V9 + V10, data = X)

Coefficients:

Estimate Std. Error t value Pr(>|t|)

(Intercept) 0.1034 0.1575 0.656 0.5239

V1 0.4716 0.1665 2.832 0.0151 *

V2 0.3437 0.1351 2.544 0.0258 *

V5 0.7157 0.3147 2.274 0.0421 *

V7 0.3336 0.2027 1.646 0.1257

V8 -0.4358 0.1789 -2.436 0.0314 *

V9 0.4989 0.1503 3.321 0.0061 **

V10 0.4120 0.2425 1.699 0.1151

---

Signif. codes: 0 ’***’ 0.001 ’**’ 0.01 ’*’ 0.05 ’.’ 0.1 ’ ’ 1

Residual standard error: 0.6636 on 12 degrees of freedom

Multiple R-squared: 0.8073, Adjusted R-squared: 0.6949

F-statistic: 7.181 on 7 and 12 DF, p-value: 0.001629

Inference likely distorted!

Page 17

Example from A. Buja

y = β₀x₀ + ∑_{j=1}^{10} βⱼxⱼ + z,   n = 250,   zᵢ iid∼ N(0, 1)

Interested in a CI for β₀; select a model, always including x₀, via BIC

[Figure, from A. Buja's slides ("PoSI": Valid Post-Selection Inference, 2014): marginal distribution of post-selection t-statistics, nominal vs. actual density]

The overall coverage probability of the conventional post-selection CI is 83.5% < 95%

For p = 30, the coverage probability can be as low as 39%
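A minimal, illustrative R sketch of this kind of simulation. The correlated Gaussian design and coefficient values are assumptions (the slide does not give the design details), so only the qualitative conclusion matters here.

set.seed(1)
n <- 250; reps <- 200; beta0 <- 1
vars <- c("x0", paste0("x", 1:10))
covered <- logical(reps)
for (r in 1:reps) {
  f <- rnorm(n)                                        # common factor: correlated predictors
  X <- as.data.frame(sapply(vars, function(v) 0.7 * f + rnorm(n)))
  X$y <- beta0 * X$x0 + rnorm(n)                       # all other true slopes are zero
  sel <- step(lm(y ~ x0, data = X),                    # selection always includes x0
              scope = list(lower = ~x0, upper = reformulate(vars, response = "y")),
              k = log(n), trace = 0)                   # k = log(n): BIC
  ci <- confint(sel, "x0", level = 0.95)               # conventional, selection-blind CI
  covered[r] <- ci[1] <= beta0 && beta0 <= ci[2]
}
mean(covered)                                          # tends to fall below the nominal 0.95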

Page 19

Recall Soric’s warning from Lecture 1

“In a large number of 95% confidence intervals, 95% of them contain the population parameter [...] but it would be wrong to imagine that the same rule also applies to a large number of 95% interesting confidence intervals”

θᵢ iid∼ N(0, 0.04),  i = 1, 2, …, 20

Sample zᵢ iid∼ N(θᵢ, 1)

Construct level 90% marginal CIs

Select intervals that do not cover 0

Through simulations:

P_θ(θᵢ ∈ CIᵢ(α) | i ∈ S) ≈ 0.043
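This simulation is easy to reproduce; a short sketch:

set.seed(1)
reps <- 5000; m <- 20; alpha <- 0.10
half <- qnorm(1 - alpha / 2)                 # half-width of a 90% marginal CI
covered <- selected <- 0
for (r in 1:reps) {
  theta <- rnorm(m, 0, sqrt(0.04))
  z <- rnorm(m, theta, 1)
  lo <- z - half; hi <- z + half
  sel <- (lo > 0) | (hi < 0)                 # "interesting": CI does not cover 0
  selected <- selected + sum(sel)
  covered <- covered + sum(theta[sel] >= lo[sel] & theta[sel] <= hi[sel])
}
covered / selected                           # conditional coverage, roughly 0.04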

Page 21

Geography of error rates

A Simultaneous over all possible selection rules (Bonferroni)

B Simultaneous over the selected

C On the average over the selected (FDR/FCR)

D Conditional over the selected

Wald Lecture III: present vignettes for each territory

Not exhaustive (would have also liked to discuss work by Goeman and Solari ('11) on multiple testing for exploratory research)

Works I learned about early and that inspired my thinking

Page 23

A Simultaneous over all possible selection rules

B Simultaneous over the selected

C On the average over the selected (FDR/FCR)

D Conditional over the selected

False Coverage Rate: Benjamini & Yekutieli ('05)

Page 24

Conditional coverage I

yᵢ iid∼ N(µ, 1),  i = 1, …, 200

Select when 95% CI does not cover 0

Conditional coverage can be low and depends on unknown parameter

Page 25

Conditional coverage II

yᵢ iid∼ N(µ, 1),  i = 1, …, 200

Bonferroni selected and Bonferroni adjusted CIs

Better but still no conditional coverage!

Page 26

Conditional coverage

Worthy goal: select set S of parameters and

P_θ(θᵢ ∈ CIᵢ(α) | i ∈ S) ≥ 1 − α

Cannot in general be achieved: similar to why pFDR = E(FDP | R > 0) cannot be controlled; e.g. under the global null, conditional on making a rejection, pFDR = 1

Have to settle for a bit less!

Page 27

False coverage rate

Definition

The false coverage rate (FCR) is defined as

FCR = E[ V_CI / (R_CI ∨ 1) ],   R_CI: # selected parameters,  V_CI: # CIs not covering

Similar to the FDR: controls a type I error rate over the selected

Without selection, i.e. |S| = n, the marginal CIs control the FCR since

FCR = E[ (1/n) ∑_{i=1}^n 1(θᵢ ∉ CIᵢ(α)) ] ≤ α

With selection, marginal CIs will not generally control the FCR

Bonferroni CIs do control the FCR, in the same way that Bonferroni's procedure controls the FDR
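Read as a recipe, the realized false coverage proportion (whose expectation is the FCR) is simply V_CI/(R_CI ∨ 1); a small helper function, as a sketch:

# theta: true parameters; lo, hi: CI endpoints; selected: logical vector
fcp <- function(theta, lo, hi, selected) {
  R <- sum(selected)                                   # R_CI: # selected parameters
  V <- sum(selected & (theta < lo | theta > hi))       # V_CI: # selected CIs not covering
  V / max(R, 1)                                        # FCR = E[FCP]
}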

Page 32

Selection expressed by FCR

Marginal CIs for selected

FCR can be high and depends on unknown parameter

Page 33

Selection expressed by FCR

Bonferroni selection & Bonferroni adjusted intervals

Can achieve FCR control with any projection of a confidence region achieving simultaneous coverage:

P((θ₁, θ₂, …, θₙ) ∈ CI(α)) ≥ 1 − α

Problem: FCR levels are too low; Bonferroni adjusted intervals are very wide

Page 35

FCR adjusted CIs

(i) Apply selection rule S(T)

(ii) For each i ∈ S, compute

R(i) = min_t {|S(T^{(i)}, t)| : i ∈ S(T^{(i)}, t)},   T^{(i)} = T \ {Tᵢ}

(iii) The FCR-adjusted CI for i ∈ S is CIᵢ(R(i) α/n)

Usually R(i) = |S(T)| := R, ∴ construct adjusted CIs at level 1 − Rα/n

Some special cases:

R_CI = n: no adjustment

R_CI = 1: Bonferroni adjustment

Theorem (Benjamini & Yekutieli, '05)

If the Tᵢ are independent, then for any selection procedure, the adjusted CIs obey FCR ≤ α (extends to PRDS statistics)
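In the usual case R(i) = R = |S(T)|, the procedure reduces to marginal CIs at level 1 − Rα/n; a sketch with hypothetical inputs (z: unit-variance estimates, sel: indices of the selected):

fcr_adjusted_cis <- function(z, sel, alpha = 0.05, n = length(z)) {
  R <- length(sel)
  half <- qnorm(1 - (R * alpha / n) / 2)     # two-sided CI at level 1 - R*alpha/n
  cbind(lo = z[sel] - half, hi = z[sel] + half)
}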

Page 39

How well do we do?

yᵢ ind∼ N(µᵢ, 1)

BH(q) selection procedure, FCR-adjusted intervals

µᵢ = µ

Intuitively clear that if µᵢ → 0 or µᵢ → ∞, FCR → q

Page 40

Some issues (after B. Efron)

n = 10,000

µᵢ = 0 for 1 ≤ i ≤ 9,000;   µᵢ iid∼ N(3, 1) for 9,001 ≤ i ≤ 10,000

zᵢ ind∼ N(µᵢ, 1)

[Figure: observations zᵢ (x-axis) vs. true means µᵢ (y-axis), with the selected points and their FCR-adjusted 95% CIs]

Select via BHq (one-sided)

FCR-adjusted 95% CIs

Realized FCR: 18/610 ≈ 0.03

Intervals too wide (upward)

Slope does not seem right
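A sketch reproducing this experiment (the BH level q = 0.05, matching the 95% adjusted CIs, is an assumption; the slide does not state it):

set.seed(1)
n <- 10000; alpha <- 0.05
mu <- c(rep(0, 9000), rnorm(1000, mean = 3, sd = 1))
z <- rnorm(n, mu, 1)
pval <- pnorm(z, lower.tail = FALSE)                   # one-sided p-values
sel <- which(p.adjust(pval, "BH") <= alpha)            # BHq selection
R <- length(sel)
half <- qnorm(1 - R * alpha / (2 * n))                 # FCR-adjusted half-width
miss <- mu[sel] < z[sel] - half | mu[sel] > z[sel] + half
c(selected = R, FCP = mean(miss))                      # realized FCP well below alpha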

Page 42

eBayes: Yekutieli ('12)

[Figure: effect size vs. observed Y]

Other follow-ups: Weinstein, Fithian & Benjamini ('13), Efron ('16), ...

Page 43

A Simultaneous over all possible selection rules

B Simultaneous over the selected

C On the average over the selected (FDR/FCR)

D Conditional over the selected

Post-Selection Inference (POSI): Berk, Brown, Buja, Zhang and Zhao, 2013

Page 44

Inference after selection in the linear model

y ∼ N(µ, σ²I),   µ = Xβ

X: n × p design matrix

σ known (for convenience); in reality σ is unknown, and POSI requires an 'independent' estimate of σ (think p < n and σ̂² = MSE of the full model)

Extension: µ ∉ span(X)

Data analyst selects model after viewing data

Data analyst wishes to provide inference about parameters in selected model

Page 46

Classical inference

Fixed model M ⊂ {1, …, p}. Object of inference: slopes after adjusting for the variables in M only:

β_M = X†_M µ = E[X†_M y],   X†_M = (X′_M X_M)⁻¹ X′_M

β̂_M = X†_M y is the least-squares estimate

Sampling distribution (M fixed):

β̂_M ∼ N(β_M, σ²(X′_M X_M)⁻¹)

z-scores: with Xj•M = lm(X[,j] ~ X[,setdiff(M,j)])$resid,

z_{j•M} = (β̂_{j•M} − β_{j•M}) / (σ √[(X′_M X_M)⁻¹]_{jj}) = (y − µ)′X_{j•M} / (σ‖X_{j•M}‖) ∼ N(0, 1)

Valid CIs: β̂_{j•M} ± z_{1−α/2} σ/‖X_{j•M}‖

If σ̂² = MSE_Full, then β̂_{j•M} ± t_{n−p, 1−α/2} σ̂/‖X_{j•M}‖
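A small numerical check of these identities for a fixed M (all values illustrative): the residualized predictor X_{j•M} yields both the least-squares coefficient and its standard error σ/‖X_{j•M}‖.

set.seed(1)
n <- 100; p <- 5; sigma <- 1
X <- matrix(rnorm(n * p), n, p)
y <- X %*% c(1, -1, 0, 0, 0) + sigma * rnorm(n)
M <- c(1, 2, 3); j <- 2
Xj.M <- lm(X[, j] ~ X[, setdiff(M, j)])$resid      # residualize X_j on the rest of M
bhat <- sum(Xj.M * y) / sum(Xj.M^2)                # matches the coef of X_j in lm(y ~ X[, M])
se <- sigma / sqrt(sum(Xj.M^2))                    # = sigma * sqrt[(X_M'X_M)^{-1}]_jj
bhat + c(-1, 1) * qnorm(0.975) * se                # valid 95% CI for a FIXED model M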

Page 50

What sort of selective inference?

Variable selection procedure: M̂(y)

P(β_{j•M̂} ∈ C_{j•M̂} | j ∈ M̂) ≥ 1 − α   (D) conditional inference

P(∀ j ∈ M̂ : β_{j•M̂} ∈ C_{j•M̂}) ≥ 1 − α   (B) simultaneous over the selected

Object of inference is random: P(j ∈ M̂)?

Not at all obvious how to construct such CIs

Different variable selection procedures yield different CIs

Page 51

POSI: universal validity for all selection procedures

∀ M̂:  P(∀ j ∈ M̂ : β_{j•M̂} ∈ C_{j•M̂}) ≥ 1 − α

Pros: simultaneous inference, the strongest form of protection (no matter what the data scientist did)

Cons: CIs can be very wide (later)

Merit: got lots of people thinking...

“The most valuable statistical analyses often arise only after an iterative process involving the data”

Gelman and Loken (2013)

Page 52

Is POSI doable?

Xj•M = lm(X[,j] ~ X[,setdiff(M,j)])$resid

z_{j•M} = (y − µ)′X_{j•M} / (σ‖X_{j•M}‖) ∼ N(0, 1)

Fact: for any variable selection procedure M̂,

max_{j∈M̂} |z_{j•M̂}| ≤ max_M max_{j∈M} |z_{j•M}|

Theorem (Universal guarantee)

P( max_M max_{j∈M} |z_{j•M}| ≤ K_{1−α/2} ) ≥ 1 − α,   where K_{1−α/2} is the POSI constant

Then with C_{j•M} = β̂_{j•M} ± K_{1−α/2} σ/‖X_{j•M}‖:

∀ M̂:  P(∀ j ∈ M̂ : β_{j•M̂} ∈ C_{j•M̂}) ≥ 1 − α

Page 54

Computing the POSI constant

The POSI constant is a quantile of

max_M max_{j∈M} |z_{j•M}|

Difficulty: look at 2^p models!

Can try developing bounds (asymptotics)

Range of the POSI constant:

√(2 log p) ≲ K_{1−α}(X) ≲ √p

Lower bound achieved for orthogonal designs

Upper bound achieved for SPAR1 designs

The POSI constant can get very large (but necessarily so)
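For small p, the constant can be estimated by brute force; a Monte Carlo sketch (illustrative only; for a singleton M we residualize on the intercept alone):

set.seed(1)
n <- 50; p <- 5; nsim <- 2000
X <- matrix(rnorm(n * p), n, p)
models <- unlist(lapply(1:p, function(k) combn(p, k, simplify = FALSE)),
                 recursive = FALSE)                    # all 2^p - 1 nonempty models
dirs <- do.call(cbind, lapply(models, function(M) sapply(M, function(j) {
  r <- if (length(M) > 1) lm(X[, j] ~ X[, setdiff(M, j)])$resid
       else X[, j] - mean(X[, j])                      # residualized predictor X_{j.M}
  r / sqrt(sum(r^2))                                   # unit norm
})))
maxz <- replicate(nsim, max(abs(crossprod(dirs, rnorm(n)))))  # max_M max_j |z_{j.M}|
quantile(maxz, 0.95)                                   # estimated POSI constant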

Page 55

POSI: conclusion

Spirit of Scheffé's simultaneous CIs for contrasts:

c′β,   c ∈ C = { X_{j•M}/‖X_{j•M}‖ : j ∈ M ⊂ {1, …, p} }

Protection against all kinds of selection

Can be conservative

Perhaps difficult to implement

Alternative: split the sample (not always possible)

Significant impact: asked important questions and stimulated lots of thinking/questioning/research

Page 56

A Simultaneous over all possible selection rules

B Simultaneous over the selected

C On the average over the selected (FDR/FCR)

D Conditional over the selected

Selective Inference for the Lasso: Lee, Sun, Sun and Taylor, 2014

Page 57

Lasso selection

y ∼ N(µ, σ²I),   µ = Xβ

Restrict the analyst's choices

Lasso selection event:

β̂ = argmin_b ½‖y − Xb‖²₂ + λ‖b‖₁   ⟹   M̂ = {j : β̂ⱼ ≠ 0}

Inference for the selected model

Object of inference: β_M := X†_M µ (regression coefficients in the reduced model)

Goal: CIs covering the parameters β_{M̂} (M̂ random)

Page 59

Selection event

Each region: a selected set + sign pattern, i.e., a polytope $\{y : Ay \le b\}$ (easily described via the KKT conditions)

Main idea: condition on the selection event and signs

$y \mid \{\hat M = M,\ \hat s = s\} \ \sim\ \underbrace{\mathcal{N}(\mu, \sigma^2 I)\cdot \mathbf{1}(Ay \le b)}_{\text{truncated multivariate normal}}$
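A hedged R sketch of the KKT algebra giving A and b for a fixed pair (M, s), in the spirit of Lee et al.; the strict inequalities in the selection event hold almost surely, so plain inequalities are used (all names are illustrative):

    lasso_polytope <- function(X, M, s, lambda) {
      XM  <- X[, M, drop = FALSE]
      Xm  <- X[, -M, drop = FALSE]                  # inactive columns
      XMi <- solve(crossprod(XM))                   # (X_M' X_M)^{-1}
      R   <- diag(nrow(X)) - XM %*% XMi %*% t(XM)   # I - P_M
      # inactive: |X_{-M}'( R y / lambda + X_M (X_M'X_M)^{-1} s )| <= 1
      c0  <- as.vector(t(Xm) %*% XM %*% XMi %*% s)
      A0  <- rbind(t(Xm) %*% R, -t(Xm) %*% R) / lambda
      b0  <- c(1 - c0, 1 + c0)
      # active signs: diag(s) beta_hat_M > 0
      S   <- diag(s, nrow = length(s))
      A1  <- -S %*% XMi %*% t(XM)
      b1  <- as.vector(-lambda * S %*% XMi %*% s)
      list(A = rbind(A0, A1), b = c(b0, b1))
    }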

Page 61

Conditional sampling distributions

Want inference about $\beta_{j\cdot M} = X_{j\cdot M}'\mu := \eta'\mu$

Would need $\eta'y \mid \{Ay \le b\}$: a complicated mixture of truncated normals, computationally expensive to sample

Computationally tractable approach: condition on more

$\eta'y \,\big|\, \{Ay \le b,\ P_{\eta^\perp} y\} \ \stackrel{d}{=}\ \mathrm{TN}\big(\underbrace{\eta'\mu}_{\text{mean}},\ \underbrace{\sigma^2\|\eta\|^2}_{\text{variance}},\ \underbrace{\mathcal{I}}_{\text{truncation interval}}\big)$

Page 62

Conditional sampling distributions

[Figure 2 from Lee et al.: the set $\{Ay \le b\}$ can be characterized by $\{\mathcal{V}^- \le \eta'y \le \mathcal{V}^+\}$; assuming $\Sigma = I$ and $\|\eta\|_2 = 1$, $\mathcal{V}^-$ and $\mathcal{V}^+$ are functions of $P_{\eta^\perp}y$ only, which is independent of $\eta'y$.]

Computationally tractable approach: condition on more

$\eta'y \,\big|\, \{Ay \le b,\ P_{\eta^\perp} y\} \ \stackrel{d}{=}\ \mathrm{TN}\big(\eta'\mu,\ \sigma^2\|\eta\|^2,\ [\mathcal{V}^-(y), \mathcal{V}^+(y)]\big)$

∴ With $F^{[a,b]}_{\mu,\sigma^2}$ the CDF of $\mathrm{TN}(\mu, \sigma^2; [a,b])$,

$F^{[\mathcal{V}^-(y),\mathcal{V}^+(y)]}_{\eta'\mu,\ \sigma^2\|\eta\|^2}(\eta'y) \,\big|\, \{Ay \le b,\ P_{\eta^\perp} y\} \ \stackrel{d}{=}\ \mathrm{Unif}(0,1)$
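A hedged R sketch of $\mathcal{V}^-(y)$ and $\mathcal{V}^+(y)$: decompose $y = z + \eta\,(\eta'y)/\|\eta\|^2$ with $z = P_{\eta^\perp}y$, then intersect the constraints $Ay \le b$ along the $\eta$ direction:

    truncation_limits <- function(A, b, eta, y) {
      t1 <- sum(eta * y)                          # eta'y
      z  <- y - eta * t1 / sum(eta^2)             # P_{eta-perp} y
      cf <- as.vector(A %*% eta) / sum(eta^2)     # per-row coefficient of eta'y
      r  <- b - as.vector(A %*% z)
      vlo <- if (any(cf < 0)) max((r / cf)[cf < 0]) else -Inf
      vup <- if (any(cf > 0)) min((r / cf)[cf > 0]) else  Inf
      c(vlo = vlo, vup = vup)
    }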

Page 64

Pivotal quantity from Lee, Sun, Sun & Taylor, ’14

Theorem

Because $\eta'y \perp\!\!\!\perp P_{\eta^\perp}y$, we can integrate over $P_{\eta^\perp}y$ and obtain

$F^{[\mathcal{V}^-(y),\mathcal{V}^+(y)]}_{\eta'\mu,\ \sigma^2\|\eta\|^2}(\eta'y) \,\big|\, \{Ay \le b\} \ \sim\ \mathrm{Unif}(0,1)$,

which is a pivotal quantity

[Figure: histogram of the pivot against the Unif(0,1) density, and its empirical CDF against the Unif(0,1) CDF. The pivotal quantity is uniform.]
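The pivot itself is elementary to compute; a naive R sketch (numerically fragile far out in the tails):

    # F^{[a,b]}_{mu, sd^2}(x): CDF of a normal truncated to [a, b]
    tn_cdf <- function(x, mu, sd, a, b) {
      (pnorm(x, mu, sd) - pnorm(a, mu, sd)) /
        (pnorm(b, mu, sd) - pnorm(a, mu, sd))
    }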

Page 65

Selective inference and FCR

$T := F^{[\mathcal{V}^-(y),\mathcal{V}^+(y)]}_{\eta'\mu,\ \sigma^2\|\eta\|^2}(\eta'y) \,\big|\, \{Ay \le b\} \ \sim\ \mathrm{Unif}(0,1)$

'Invert' the pivotal quantity to obtain intervals with conditional type-I error control:

$0.025 \le T \le 0.975 \ \Longrightarrow\ a^-(\eta,y) \le \eta'\mu \le a^+(\eta,y)$

$\Longrightarrow\ \mathbb{P}\big(a^-(\eta,y) \le \eta'\mu \le a^+(\eta,y) \mid Ay \le b\big) = 0.95$

Conditional coverage:

$\mathbb{P}\big(\beta_{j\cdot \hat M} \in C_j \mid \hat M = M,\ \hat s = s\big) = 1-\alpha$

Implies false coverage rate (FCR) control:

$\mathbb{E}\left[\frac{\#\{j \in \hat M : C_j \text{ does not cover } \beta_{j\cdot \hat M}\}}{|\hat M|}\right] \le \alpha$
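A hedged R sketch of the inversion, using the fact that tn_cdf(x, mu, ...) above is decreasing in mu; the search interval is a crude illustrative choice:

    selective_ci <- function(x, sd, a, b, alpha = 0.05) {
      g   <- function(mu, level) tn_cdf(x, mu, sd, a, b) - level
      rng <- x + c(-20, 20) * sd
      lo  <- uniroot(g, rng, level = 1 - alpha / 2)$root
      hi  <- uniroot(g, rng, level = alpha / 2)$root
      c(lower = lo, upper = hi)                 # selective CI for eta'mu
    }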

Page 68

Comparison on diabetes dataset

[Figure: confidence intervals for the selected variables BMI, BP, S3, S5 on the diabetes data, comparing Adjusted, Unadjusted (OLS), Data Splitting, and POSI intervals.]

Selective intervals ≈ z-intervals for significant variables

Data splitting widens intervals by $\sqrt{2}$ (each half of the data has only $n/2$ observations)

POSI widens by 1.36

Page 69

Coarsest selection event

Caveat: conditioned on signs in addition to the selected variables

[Figure: geometry of the selection event for Y and X1, X2, X3; the region of data space on which variables {1, 3} are selected.]

[Figure: two panels (λ = 15 and λ = 22) plotting coefficients against variable index, comparing the True signal with Minimal Intervals and Simple Intervals.]

Page 70

Partial summary

Much shorter CIs than with POSI

Price to pay: commit to the lasso (with a fixed value of λ)

Does not work well when the selection event involves several dozen variables or more

Many recent developments by J. Taylor and his group:
http://statweb.stanford.edu/~jtaylo/papers/index.html
selectiveInference R package

Many other works: Fithian et al. ('14), Lee et al. ('15), Lockhart et al. ('14), van de Geer et al. ('14), Javanmard et al. ('14), Leeb et al. ('14), ...
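The package wraps the computations sketched above; a hedged usage sketch (check the package documentation for the exact λ-scaling conventions expected by fixedLassoInf):

    library(glmnet); library(selectiveInference)
    set.seed(1)
    n <- 100; p <- 20
    X <- matrix(rnorm(n * p), n, p)
    y <- as.vector(X[, 1:3] %*% c(3, 2, -2) + rnorm(n))
    lam  <- 10
    # lasso solution at the slide's lambda (glmnet scale: s = lam / n)
    beta <- coef(glmnet(X, y, intercept = FALSE, standardize = FALSE),
                 s = lam / n, exact = TRUE, x = X, y = y)[-1]
    fixedLassoInf(X, y, beta, lam, sigma = 1)   # selective CIs and p-values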

Page 71

A Simultaneous over all possible selection rules

B Simultaneous over the selected

C On the average over the selected (FDR/FCR)

D Conditional over the selected

Who's the Winner? Another View of Selective Inference
Hung and Fithian ('16)

Slides after Will Fithian's Ph.D. dissertation defense, Stanford U., May 2015

Extends location family result of Gutmann & Maymin (’87)

Page 72

The Iowa Republican poll (May 2015)

Quinnipiac poll of n = 667 Iowa Republicans

Rank  Candidate        Result  Votes
1.    Scott Walker     21%     140
2.    Rand Paul        13%     87
3.    Marco Rubio      13%     87
4.    Ted Cruz         12%     80
...
14.   Bobby Jindal     1%      7
15.   Lindsey Graham   0%      0

Question: Is Scott Walker really winning?

Problem: Selection bias (winner's curse)

"Question selection", not really "model selection"

Page 73

Selective hypothesis testing

$X = (X_1, \dots, X_{15}) \sim \mathrm{Multinom}(n, \pi)$

After seeing the data, ask whether candidate $i$ really is in the lead (select $H_i$); the question we ask is data-dependent. Test

$H_i:\ \pi_i \le \max_{j \ne i} \pi_j \ =\ \bigcup_{j \ne i}\, H_{i \le j}:\ \pi_i \le \pi_j$

on the event

$A_i = \big\{ X_i > \max_{j \ne i} X_j \big\}$

A test $\phi_i(X)$ is a selective level-$\alpha$ test if

$\mathbb{E}[\phi_i(X) \mid A_i] \le \alpha$ for any distribution in $H_i$

Page 75

Construction of a selective test

(1) Construct a selective p-value $p_{i,j}$ for $H_{i \le j}$ on $A_i$

For $i = 1$, $j = 2$, $p_{1,2}$ is based on the truncated binomial count

$\mathcal{L}(X_1 \mid X_1 + X_2,\ X_{3:15},\ A_1), \qquad (X_1 \mid \cdots) \sim \mathrm{Bin}\big(X_1 + X_2,\ \tfrac{\pi_1}{\pi_1 + \pi_2}\big)$

(2) Combined p-value: $p_i = \max_{j \ne i} p_{i,j}$

Valid since

$\mathbb{P}(p_i \le \alpha \mid A_i) \ \le\ \min_{j \ne i} \mathbb{P}(p_{i,j} \le \alpha \mid A_i) \ \le\ \alpha$ if any $\pi_j \ge \pi_i$
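A minimal R sketch of $p_{i,j}$ at the least-favorable null $p = 1/2$, truncated to the winning event (the reduction to $\{X_i > m/2\}$ is spelled out on the mechanics slide below):

    # selective p-value p_{i,j}: binomial tail, renormalized by truncation
    p_sel <- function(xi, xj) {
      m <- xi + xj
      pbinom(xi - 1, m, 0.5, lower.tail = FALSE) /        # P(B >= xi)
        pbinom(floor(m / 2), m, 0.5, lower.tail = FALSE)  # P(B > m/2)
    }

    # combined p-value p_i = max over j != i
    p_comb <- function(x, i)
      max(sapply(setdiff(seq_along(x), i), function(j) p_sel(x[i], x[j])))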

Page 79

Mechanics of the selective test

$(X_1 \mid \cdots) \sim \mathrm{Bin}\big(X_1 + X_2,\ \tfrac{\pi_1}{\pi_1 + \pi_2}\big)$, a truncated binomial count

$H_0:\ \pi_1 \le \pi_2 \iff \pi_1/(\pi_1 + \pi_2) \le 1/2$

∴ test whether $X_1 \sim \mathrm{Bin}(m, p)$ with $p \le 1/2$ and $m = X_1 + X_2$, conditioned on $X_1 > m/2$
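A hedged Monte Carlo check of the selective level, reusing p_sel() from the sketch above, under a null configuration where candidate 1 is tied for the lead (so $H_1$ holds):

    set.seed(1)
    pi0 <- c(0.20, 0.20, rep(0.60 / 13, 13))     # pi_1 = pi_2: H_1 is true
    pv  <- replicate(2e4, {
      x <- drop(rmultinom(1, 667, pi0))
      if (x[1] > max(x[-1]))                     # condition on A_1
        max(sapply(2:15, function(j) p_sel(x[1], x[j])))
      else NA_real_
    })
    mean(pv <= 0.05, na.rm = TRUE)               # should be about 0.05 or less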

Page 82

Selective Test

Rank  Candidate     Result  Votes
1.    Scott Walker  21%     140
2.    Rand Paul     13%     87
...

Walker vs. Paul: $p_{\mathrm{SW},\mathrm{RP}}$ based on

$\mathcal{L}(X_{\mathrm{SW}} \mid X_{\mathrm{SW}} + X_{\mathrm{RP}} = 227,\ X_{\text{others}},\ \text{SW wins}) \ =\ \mathcal{L}(X_{\mathrm{SW}} \mid X_{\mathrm{SW}} + X_{\mathrm{RP}} = 227,\ X_{\mathrm{SW}} \ge 114)$

Selective inference recovers the 'classical' answer (see also Gutmann & Maymin ('87)):

$p_{\mathrm{SW}} = \max_{j \ne \mathrm{SW}} p_{\mathrm{SW},j} = 2\,\mathbb{P}\big(\mathrm{Bin}(227, 1/2) \ge 140\big) = 0.00053$

88% power under $X^* \sim \mathrm{Multinom}(667, \pi)$ (α = 0.05)

Scott Walker leads the next best by at least 22%
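The number is easy to check in R; the factor of 2 is exactly the truncation, since $\mathbb{P}(\mathrm{Bin}(227, 1/2) \ge 114) = 1/2$ by symmetry (227 is odd):

    p_tail  <- pbinom(139, 227, 0.5, lower.tail = FALSE)  # P(B >= 140)
    p_trunc <- pbinom(113, 227, 0.5, lower.tail = FALSE)  # P(B >= 114) = 1/2
    p_tail / p_trunc                          # = 2 * p_tail, approx 0.00053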

Page 86

Summary: What’s Happening in Selective Inference?

Statisticians are extraordinarily engaged in rewriting the theory and practice of statistics

Addresses the reproducibility issue (at least partially)

Already have some solutions

Need to continue to develop solutions as new problems come about

Need to communicate these solutions effectively

Education (undergraduate and graduate) will play a crucial role in communicating ideas and methods

Page 87

Thank You!

