Page 1: Announcements

Announcements
• Guest lecture today: Aseem Agarwala
• Final project out today
  – you and your partner must submit a proposal by this Friday

Today’s Reading
• Alexei A. Efros and Thomas K. Leung, “Texture Synthesis by Non-parametric Sampling,” Proc. International Conference on Computer Vision (ICCV), 1999.
  – http://www.cs.berkeley.edu/~efros/research/NPS/efros-iccv99.pdf

Page 2: Announcements

Modeling Texture

What is texture?

How can we model it?

Page 3: Announcements

Markov Chains

Markov Chain
• a sequence of random variables x_1, x_2, x_3, ...
• x_t is the state of the model at time t
• Markov assumption: each state depends only on the previous one
  – dependency given by a conditional probability: P(x_t | x_{t-1})
• The above is actually a first-order Markov chain
• An N’th-order Markov chain: P(x_t | x_{t-1}, ..., x_{t-N})

Page 4: Announcements

Markov Chain Example: Text

“A dog is a man’s best friend. It’s a dog eat dog world out there.”

[Figure: word-to-word transition diagram for this sentence. From “a” the chain moves to “dog” with probability 2/3 and to “man’s” with probability 1/3; from “dog” it moves to “is”, “eat”, or “world” with probability 1/3 each; every other word (including the period) has a single successor with probability 1.]

Page 5: Announcements

Text synthesis
Create plausible looking poetry, love letters, term papers, etc.

Most basic algorithm
1. Build probability histogram
  – find all blocks of N consecutive words/letters in training documents
  – compute probability of occurrence
2. Given the previous N-1 words, compute the next word by sampling from P(next word | previous N-1 words)

Example on board...
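To make the two steps concrete, here is a minimal word-level sketch in Python, assuming a plain training string and uniform sampling among observed continuations (the function names and the n = 2 setting are illustrative, not from the slides):

```python
import random
from collections import defaultdict

def build_model(text, n=2):
    """Step 1 (histogram): for every block of n consecutive words, record
    which word follows each (n-1)-word context."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - n + 1):
        context = tuple(words[i:i + n - 1])
        model[context].append(words[i + n - 1])
    return model

def synthesize(model, seed, length=15):
    """Step 2 (sampling): repeatedly draw the next word from
    P(word | last n-1 words) by choosing uniformly among recorded continuations."""
    out = list(seed)
    k = len(seed)                      # context size = n - 1
    for _ in range(length):
        context = tuple(out[-k:])
        if context not in model:       # no continuation seen in training
            break
        out.append(random.choice(model[context]))
    return " ".join(out)

training = "a dog is a man's best friend . it's a dog eat dog world out there ."
model = build_model(training, n=2)
print(synthesize(model, seed=("a",)))
```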

Page 6: Announcements

[Scientific American, June 1989, Dewdney]

“I Spent an Interesting Evening Recently with a Grain of Salt”

– Mark V. Shaney (a computer-generated contributor to the Usenet newsgroup net.singles)

Output of 2nd order word-level Markov Chain after training on 90,000 word philosophical essay:

“Perhaps only the allegory of simulation is unendurable--more cruel than Artaud's Theatre of Cruelty, which was the first to practice deterrence, abstraction, disconnection, deterritorialisation, etc.; and if it were our own past. We are witnessing the end of the negative form. But nothing separates one pole from the very swing of voting ''rights'' to electoral...”

Page 7: Announcements

Modeling Texture

What is texture?
• An image obeying some statistical properties
• Similar structures repeated over and over again
• Often has some degree of randomness

Page 8: Announcements

Markov Random Field

A Markov random field (MRF)
• generalization of Markov chains to two or more dimensions

First-order MRF:
• probability that pixel X takes a certain value given the values of its neighbors A, B, C, and D: P(X | A, B, C, D)

[Figure: pixel X with its four neighbors A, B, C, D]

• Higher order MRF’s have larger neighborhoods

Page 9: Announcements

Texture Synthesis [Efros & Leung, ICCV 99]

Can apply 2D version of text synthesis

Page 10: Announcements

Synthesizing One Pixel

[Figure: sample image and generated image, with pixel x being synthesized]

• What is P(x | N(x)), the distribution of pixel x given its neighborhood N(x)?
• Find all the windows in the sample image that match the neighborhood
  – consider only pixels in the neighborhood that are already filled in
• To synthesize x
  – pick one matching window at random
  – assign x to be the center pixel of that window

Slides courtesy of Alyosha Efros

Page 11: Announcements

Really Synthesizing One Pixel

[Figure: sample image, generated image, and the pixel x being synthesized]

• An exact neighborhood match might not be present
• So we find the best matches using SSD error and randomly choose between them, preferring better matches with higher probability
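A compact sketch of this matching step, assuming grayscale NumPy images, a boolean `filled` mask marking already-synthesized pixels, and a relative tolerance `eps` for what counts as a near-best match (the window size and all names are illustrative choices, not taken from the paper):

```python
import numpy as np

def synthesize_pixel(sample, generated, filled, x, y, w=11, eps=0.1):
    """Pick a value for generated[y, x]: compare the partially filled
    neighborhood around (x, y) with every w x w window of the sample image
    using SSD over already-filled pixels only, then choose at random among
    windows whose error is within (1 + eps) of the best match."""
    sample = sample.astype(float)
    generated = generated.astype(float)
    h = w // 2

    # Neighborhood of the target pixel, padded so border pixels also work.
    pad_g = np.pad(generated, h)
    pad_f = np.pad(filled.astype(float), h)
    nbhd = pad_g[y:y + w, x:x + w]
    mask = pad_f[y:y + w, x:x + w]          # 1 where the neighborhood is known

    best_err, candidates = np.inf, []
    H, W = sample.shape
    for i in range(H - w + 1):
        for j in range(W - w + 1):
            win = sample[i:i + w, j:j + w]
            err = np.sum(mask * (win - nbhd) ** 2)   # SSD over filled pixels
            candidates.append((err, win[h, h]))
            best_err = min(best_err, err)

    # Keep every window close to the best and sample one of their centers.
    good = [c for e, c in candidates if e <= best_err * (1 + eps) + 1e-12]
    return np.random.choice(good)
```

Growing the texture (next page) then amounts to calling this repeatedly on unfilled pixels at the boundary of the already-synthesized region.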

Page 12: Announcements

Growing Texture

• Starting from the initial image, “grow” the texture one pixel at a time

Page 13: Announcements

Window Size Controls Regularity

Page 14: Announcements

More Synthesis Results

Increasing window size

Page 15: Announcements

More Results

aluminum wire, reptile skin

Page 16: Announcements

Failure Cases

Growing garbage; verbatim copying

Page 17: Announcements

Image-Based Text Synthesis

Page 18: Announcements

Speed

• How fast is this?
• To synthesize a patch of n pixels, given a source image of k pixels, how many pixel window lookups does this algorithm require?
• O(nk)
• Speedup?
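For a rough sense of scale (numbers purely illustrative): synthesizing a 100×100 patch means n = 10,000 pixels, and a 200×200 source gives k = 40,000 candidate windows, so the basic algorithm already needs on the order of nk = 4×10^8 window lookups, which is what motivates the block-based extension on the next pages.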

Page 19: Announcements

Efros & Leung ’99 extended

Observation: neighbor pixels are highly correlated

[Figure: input image, non-parametric sampling of a block B; synthesizing a block]

Idea: unit of synthesis = block
• Exactly the same but now we want P(B | N(B))
• Much faster: synthesize all pixels in a block at once

Page 20: Announcements

Input texture

[Figure: three ways of placing blocks B1, B2 from the input texture: random placement of blocks; neighboring blocks constrained by overlap; minimal error boundary cut]

Page 21: Announcements

Minimal error boundary

[Figure: overlapping blocks with a vertical boundary; the min-error boundary cut runs through the overlap region]

overlap error = (block 1 overlap − block 2 overlap)²
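A dynamic-programming sketch of one way to compute such a cut, assuming the two overlapping strips are equal-sized grayscale NumPy arrays and the boundary runs vertically (function and variable names are illustrative):

```python
import numpy as np

def min_error_boundary(overlap_a, overlap_b):
    """Return, for each row, the column of a vertical cut through the overlap
    region that minimizes the cumulative squared difference between the two
    overlapping strips (the minimal error boundary cut)."""
    err = (overlap_a.astype(float) - overlap_b.astype(float)) ** 2
    H, W = err.shape

    # cost[i, j] = minimal cumulative error of a path from row 0 down to (i, j)
    cost = err.copy()
    for i in range(1, H):
        for j in range(W):
            lo, hi = max(j - 1, 0), min(j + 2, W)
            cost[i, j] += cost[i - 1, lo:hi].min()

    # Backtrack from the cheapest cell in the last row up to the first row.
    path = [int(np.argmin(cost[-1]))]
    for i in range(H - 2, -1, -1):
        j = path[-1]
        lo, hi = max(j - 1, 0), min(j + 2, W)
        path.append(lo + int(np.argmin(cost[i, lo:hi])))
    path.reverse()
    return path   # path[i] = cut column in row i
```

In the quilted result, pixels left of the returned column in each row come from one block and pixels to the right come from the other.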

Page 22: Announcements

Their Philosophy

The “Corrupt Professor’s Algorithm”:
• Plagiarize as much of the source image as you can
• Then try to cover up the evidence

Rationale:
• Texture blocks are by definition correct samples of texture, so the only problem is connecting them together

Page 23: Announcements

Texture Transfer

[Figure: constraint image and texture sample]

Page 24: Announcements

Texture Transfer

Take the texture from one object and “paint” it onto another object

• This requires separating texture and shape

• That’s HARD, but we can cheat

• Assume we can capture shape by boundary and rough shading

Then, just add another constraint when sampling: similarity to luminance of underlying image at that spot
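One way to fold that extra constraint into the matching error, sketched here with a single weight alpha between the usual texture term and the luminance term (the weight, names, and grayscale assumption are illustrative):

```python
import numpy as np

def transfer_error(candidate, neighborhood, mask, target_luminance, alpha=0.8):
    """Matching error for texture transfer: the usual neighborhood SSD over
    already-filled pixels, blended with how well the candidate matches the
    luminance of the underlying target image at this spot."""
    texture_err = np.sum(mask * (candidate.astype(float) - neighborhood) ** 2)
    luminance_err = np.sum((candidate.astype(float) - target_luminance) ** 2)
    return alpha * texture_err + (1.0 - alpha) * luminance_err
```

Candidates are then ranked and sampled exactly as in plain synthesis; a larger alpha favors faithful texture, a smaller one favors the target image.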

Page 25: Announcements

[Figure: texture + target image = texture-transfer result, shown with parmesan and rice textures]

Page 26: Announcements

[Figure: another texture-transfer example (texture + target = result)]

Page 27: Announcements

Issues

• Imposes artificial grid of overlapping blocks on synthesized image, and greedily chooses blocks in left-right, top-bottom order

• Dynamic programming limits applicability to related problems.

• Solution: use graph cuts instead
• Let’s explore two examples first.

Page 28: Announcements

Combining two images

Page 29: Announcements

Graph cut setup

[Figure: graph cut setup with a source node and a sink node]

Page 30: Announcements

Spatio-temporal texture synthesis

[Figure: source and destination video volumes along the time axis t]

Page 31: Announcements

Graphcut textures (Kwatra ’03)

Link cost between adjacent pixels (x, y) and (x+1, y):
||A(x,y) – B(x,y)||² + ||A(x+1,y) – B(x+1,y)||²
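A small helper that evaluates this cost, as written on the slide, for one pair of horizontally adjacent pixels, assuming the two overlapping patches A and B are NumPy arrays (grayscale or RGB); building the full seam graph and running min-cut is left to a max-flow library and is not shown:

```python
import numpy as np

def link_cost(A, B, x, y):
    """Cost of the graph edge between adjacent pixels (x, y) and (x+1, y):
    ||A(x,y) - B(x,y)||^2 + ||A(x+1,y) - B(x+1,y)||^2.  A cheap edge means the
    two patches already agree there, so that is a good place to cut the seam."""
    a0 = np.asarray(A[y, x], dtype=float)
    b0 = np.asarray(B[y, x], dtype=float)
    a1 = np.asarray(A[y, x + 1], dtype=float)
    b1 = np.asarray(B[y, x + 1], dtype=float)
    return np.sum((a0 - b0) ** 2) + np.sum((a1 - b1) ** 2)
```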

Page 32: Announcements

Progressive Refinement

[Figure: progressive refinement of the seams between patches A, B, C]

Page 33: Announcements

Comparison

[Figure: comparison of the sample image, the Image Quilting result, and the Graphcut result]

More results & details

Page 34: Announcements

Image Analogies (Hertzmann ’01)

[Figure: images A, A’ and B, B’ illustrating the analogy A : A’ :: B : B’]
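Conceptually, the analogy is completed by copying: for each pixel of B, find the best-matching pixel of A and transfer the corresponding A’ value into B’. Below is a deliberately stripped-down, purely illustrative sketch that matches single grayscale pixel values only; the actual paper matches multiscale feature neighborhoods and adds a coherence term, none of which is shown here:

```python
import numpy as np

def image_analogy(A, Ap, B):
    """Complete A : A' :: B : B' in the crudest possible way: for every pixel
    of B, find the most similar pixel value in A and copy the corresponding
    A' value into B'."""
    flatA = A.reshape(-1).astype(float)
    flatAp = Ap.reshape(-1)
    Bp = np.zeros(B.shape, dtype=Ap.dtype)
    for y in range(B.shape[0]):
        for x in range(B.shape[1]):
            idx = int(np.argmin(np.abs(flatA - float(B[y, x]))))  # best match in A
            Bp[y, x] = flatAp[idx]                                # copy from A'
    return Bp
```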

Page 35: Announcements
Page 36: Announcements

Artistic Filters

[Figure: analogy pairs A, A’ and B, B’ demonstrating an artistic filter]

Page 37: Announcements

Texture-by-numbers

[Figure: analogy pairs A, A’ and B, B’ for texture-by-numbers]

Page 38: Announcements

Other applications of Image Analogies

• Texture synthesis
• Super-resolution
• Texture transfer
• Image colorization
• Simple filters (blur, emboss)
• More details:

• http://mrl.nyu.edu/projects/image-analogies/

Page 39: Announcements

Applications of Texture Modeling

Super-resolution
• Freeman & Pasztor, 1999
• Baker & Kanade, 2000

Image/video compression

Texture recognition, segmentation
• DeBonet

Restoration
• removing scratches, holes, filtering
• Zhu et al.

Art/entertainment

