
Robustness of Conditional GANs to Noisy Labels
Spotlight presentation, NeurIPS 2018

Kiran K. Thekumparampil¹, Ashish Khetan¹, Zinan Lin², Sewoong Oh¹

¹University of Illinois at Urbana-Champaign

²Carnegie Mellon University

Poster #5, Tue, Dec 4 2018

Thekumparampil (UIUC) Robust Conditional GAN NeurIPS 2018 1 / 14

Conditional GAN (cGAN) is vital for achieving high quality

Input: Labeled real samples (X ,Y )

Output: Fake samples for label Y

[Diagram: a latent code and the label “Cat” are fed to the cGAN, which generates a cat image.]

[Brock et al. 2018]

Visual quality: cGAN >> GAN

[https://github.com/tensorflow/models/tree/master/research/gan]


Conditional GAN is sensitive to noise in labels

A cGAN trained with noisy labels produces samples that are

biased, i.e., generated from the wrong classes, and

of lower quality (red boxes).

[Figure: for each label 0–9, samples from the noisy real data, a standard cGAN, and our RCGAN.]


Conditional GAN (cGAN)

[Diagram: the generator G maps a latent code z and the real label y_real to a sample x; the discriminator D compares the fake pair (x, y_real) against the real pair (x_real, y_real) under the adversarial loss, which amounts to min_Q JS(P ‖ Q) between the real distribution P and the generated distribution Q.]

[Bora et al. 2018, Miyato et al. 2018, Sukhbaatar et al. 2015]
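For discrete distributions, the objective min_Q JS(P ‖ Q) can be computed in closed form. A minimal numpy sketch (the function name is illustrative, not from the paper's code):

```python
import numpy as np

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions (in nats)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = 0.5 * (p + q)  # the mixture midpoint
    def kl(a, b):
        mask = a > 0  # 0 * log(0) = 0 by convention
        return np.sum(a[mask] * np.log(a[mask] / b[mask]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = np.array([0.5, 0.5])
print(js_divergence(p, p))                          # 0.0 for identical distributions
print(js_divergence(p, [1.0, 0.0]) <= np.log(2))    # True: JS is bounded by log 2
```

JS (unlike KL) is symmetric and finite even for distributions with disjoint support, which is why it is the natural divergence behind the original GAN objective.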


Conditional GAN under noisy labeled data

[Diagram: as above, but the real labels are corrupted by a confusion-matrix channel C before they reach the training set, so D compares the fake pair (x, y) against the noisy real pair (x_real, C(y_real)); the standard cGAN therefore minimizes min_Q JS(P̃ ‖ Q), matching the noisy real distribution P̃ rather than P.]

[Bora et al. 2018, Miyato et al. 2018, Sukhbaatar et al. 2015]
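The label-noise channel C above can be simulated directly. A minimal sketch, assuming a uniform-flip confusion matrix (the noise level 0.2 is an arbitrary illustration, not a value from the talk); row C[y] gives the distribution of the observed label given true label y:

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes = 3
flip_prob = 0.2  # illustrative noise level

# Uniform-flip confusion matrix: C[i, j] = P(observed label j | true label i)
C = np.full((num_classes, num_classes), flip_prob / (num_classes - 1))
np.fill_diagonal(C, 1.0 - flip_prob)

def corrupt(labels):
    """Sample a noisy label for each true label by drawing from row C[y]."""
    return np.array([rng.choice(num_classes, p=C[y]) for y in labels])

true_labels = rng.integers(0, num_classes, size=1000)
noisy_labels = corrupt(true_labels)
agreement = np.mean(noisy_labels == true_labels)
print(agreement)  # close to 1 - flip_prob = 0.8
```

Each row of C sums to 1, so C is a Markov (channel) matrix; the noisy label distribution is Cᵀ applied to the clean one.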


Robust Conditional GAN (RCGAN) Architecture

[Diagram: RCGAN passes the generator's label through the same channel C, so the projection discriminator D compares the fake pair (x, C(y)) against the noisy real pair (x_real, C(y_real)); training minimizes min_Q̃ JS(P̃ ‖ Q̃) between the two noisy distributions.]

[Bora et al. 2018, Miyato et al. 2018, Sukhbaatar et al. 2015]
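The RCGAN idea, corrupting the generator's labels with the same C so the discriminator compares noisy against noisy, can be illustrated on label marginals alone. This toy sketch (an assumed uniform-flip C; the theory concerns joint distributions over samples and labels, not just marginals) shows that when C is invertible, matching the noisy distribution pins down the clean one:

```python
import numpy as np

# Uniform-flip channel on 3 classes (illustrative assumption).
num_classes, flip_prob = 3, 0.2
C = np.full((num_classes, num_classes), flip_prob / (num_classes - 1))
np.fill_diagonal(C, 1.0 - flip_prob)

# Label marginals: real data p, and some generator's label distribution q.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

# RCGAN feeds BOTH real and generated labels through C, so the
# discriminator compares the noisy marginals C^T p and C^T q.
p_noisy, q_noisy = C.T @ p, C.T @ q

# If the generator matches the noisy marginal exactly and C is invertible,
# the clean marginal is uniquely determined: solve C^T q* = C^T p.
q_star = np.linalg.solve(C.T, p_noisy)
print(np.allclose(q_star, p))  # True: matching noisy dists forces matching clean dists
```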


Minimizing noisy divergence minimizes true divergence

Let P̃ and Q̃ be the noisy labeled versions of P and Q.

Theorem 1 (Population-level Analysis)

TV(P̃, Q̃) ≤ TV(P, Q) ≤ M_C · TV(P̃, Q̃)

JS(P̃ ‖ Q̃) ≤ JS(P ‖ Q) ≤ M_C · √(8 · JS(P̃ ‖ Q̃))

where TV is the Total Variation distance, JS is the Jensen–Shannon divergence, and M_C ≜ max_i ∑_j |(C⁻¹)_{ij}|. In particular, Q̃ = P̃ implies Q = P.

The Neural Network Distance d_F w.r.t. a class of parametric discriminator functions F is known to generalize [Arora et al. 2017].
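Theorem 1's sandwich can be checked numerically on a toy discrete example (labels only, with an assumed uniform-flip C; the theorem itself concerns joint distributions over samples and labels):

```python
import numpy as np

# Uniform-flip channel on 3 classes (illustrative assumption).
k, eps = 3, 0.2
C = np.full((k, k), eps / (k - 1))
np.fill_diagonal(C, 1.0 - eps)

# M_C = max_i sum_j |(C^{-1})_{ij}|, the constant in Theorem 1.
M_C = np.max(np.sum(np.abs(np.linalg.inv(C)), axis=1))

def tv(p, q):
    """Total variation distance between two discrete distributions."""
    return 0.5 * np.sum(np.abs(np.asarray(p) - np.asarray(q)))

p = np.array([0.5, 0.3, 0.2])        # toy stand-in for the true distribution P
q = np.array([0.2, 0.5, 0.3])        # toy stand-in for the generated distribution Q
p_noisy, q_noisy = C.T @ p, C.T @ q  # noisy versions through the channel

# The sandwich: TV(P~, Q~) <= TV(P, Q) <= M_C * TV(P~, Q~)
print(tv(p_noisy, q_noisy) <= tv(p, q) <= M_C * tv(p_noisy, q_noisy))  # True
```

Intuitively, the left inequality is data processing (noising can only shrink TV), and the right one says an invertible C loses no information, at the price of the blow-up factor M_C.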


Minimizing noisy divergence minimizes true divergence

Let P̃_n and Q̃_n be the empirical noisy real and generated distributions.

Theorem 2 (Finite Sample Analysis)

If F satisfies the inclusion condition, then there exists c > 0 such that

d_F(P̃_n, Q̃_n) − ε ≤ d_F(P, Q) ≤ M_C · (d_F(P̃_n, Q̃_n) + ε)

with probability at least 1 − e⁻ᵖ, for any ε > 0 and n ≥ c · p · log(pL/ε) / ε², when F is L-Lipschitz in its p parameters.

The Projection Discriminator satisfies the inclusion condition.
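The neural network distance is d_F(P, Q) = sup_{f ∈ F} |E_P f − E_Q f|; for a finite function class it is computable exactly. A toy sketch (this F is an arbitrary stand-in for a parametric discriminator class, not the paper's projection discriminator):

```python
import numpy as np

rng = np.random.default_rng(1)

# A tiny "discriminator class" F of bounded test functions R -> R
# (illustrative stand-in for a parametric, L-Lipschitz class).
F = [np.tanh,
     lambda x: 1.0 / (1.0 + np.exp(-x)),   # sigmoid
     lambda x: np.clip(x, 0.0, 1.0)]       # clipped identity

def nn_distance(xs, ys):
    """Empirical neural-network distance: max_f |mean f(x) - mean f(y)|."""
    return max(abs(np.mean(f(xs)) - np.mean(f(ys))) for f in F)

same = nn_distance(rng.normal(0, 1, 5000), rng.normal(0, 1, 5000))
diff = nn_distance(rng.normal(0, 1, 5000), rng.normal(2, 1, 5000))
print(same < diff)  # True: matched distributions score lower than mismatched ones
```

Because F is finite and each f is bounded, the empirical distance concentrates around its population value at rate O(1/√n), which is the qualitative content of the sample-complexity bound above.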


RCGAN generates correct class (MNIST)

[Figure: generator label accuracy vs. label-noise level (0.0–0.9) on MNIST, for cGAN, RCGAN, and RCGAN-U.]

RCGAN improves quality of samples (CIFAR-10)

[Figure: Inception score vs. label-noise level (0.0–0.8) on CIFAR-10, for cGAN, RCGAN, and RCGAN-U.]

RCGAN can correct noisy training labels (MNIST)

[Figure: label recovery accuracy vs. label-noise level (0.0–0.9) on MNIST, for cGAN, RCGAN, and RCGAN-U.]

Thank you

Poster #5, Tue, Dec 04

https://github.com/POLane16/Robust-Conditional-GAN

[Arora et al. 2017] S. Arora, R. Ge, Y. Liang, T. Ma, and Y. Zhang. Generalization and equilibrium in generative adversarial nets (GANs). ICML, 2017.

[Bora et al. 2018] A. Bora, E. Price, and A. G. Dimakis. AmbientGAN: Generative models from lossy measurements. ICLR, 2018.

[Brock et al. 2018] A. Brock, J. Donahue, and K. Simonyan. Large scale GAN training for high fidelity natural image synthesis. arXiv preprint arXiv:1809.11096.

[Miyato et al. 2018] T. Miyato and M. Koyama. cGANs with projection discriminator. ICLR, 2018.

[Sukhbaatar et al. 2015] S. Sukhbaatar, J. Bruna, M. Paluri, L. Bourdev, and R. Fergus. Training convolutional networks with noisy labels. ICLR Workshop, 2015.
