Markov Random Fields

Umamahesh Srinivas

iPAL Group Meeting

February 25, 2011


Outline

1. Basic graph-theoretic concepts

2. Markov chain

3. Markov random field (MRF)

4. Gauss-Markov random field (GMRF), and applications

5. Other popular MRFs


References

1. Charles Bouman, Markov random fields and stochastic image models. Tutorial presented at ICIP 1995.

2. Mario Figueiredo, Bayesian methods and Markov random fields. Tutorial presented at CVPR 1998.


Basic graph-theoretic concepts

A graph G = (V, E) is a finite collection of nodes (or vertices) V = {n1, n2, . . . , nN} and a set of edges E, where each edge is an unordered pair of distinct nodes. We consider only undirected graphs.

Neighbor: two nodes ni, nj ∈ V are neighbors if (ni, nj) ∈ E

Neighborhood of a node: N(ni) = {nj : (ni, nj) ∈ E}

Neighborhood is a symmetric relation: ni ∈ N(nj) ⇔ nj ∈ N(ni)

Complete graph: every pair of nodes is connected, i.e. ∀ni ∈ V, N(ni) = {nj : j ∈ {1, 2, . . . , N}\{i}}

Clique: a complete subgraph of G

Maximal clique: a clique that cannot be enlarged; adding any other node destroys complete connectedness


Illustration

V = {1, 2, 3, 4, 5, 6}

E = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}

N(4) = {2, 3, 6}

Examples of cliques: {1}, {3, 4, 6}, {2, 5}

Set of all cliques: the single nodes V, the edges E, and the triangle {3, 4, 6}
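
As a quick sanity check, a minimal Python sketch (plain sets and tuples, standard library only) that verifies the neighborhood and clique claims above:

```python
# A sanity check of the claims above, using plain Python sets.
V = {1, 2, 3, 4, 5, 6}
E = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}

def neighbors(n):
    # N(n): all nodes sharing an edge with n (edges are unordered).
    return {b for (a, b) in E if a == n} | {a for (a, b) in E if b == n}

def is_clique(nodes):
    # Every pair of nodes in a clique must be connected.
    nodes = sorted(nodes)
    return all((a, b) in E or (b, a) in E
               for i, a in enumerate(nodes) for b in nodes[i + 1:])

print(neighbors(4))          # {2, 3, 6}
print(is_clique({3, 4, 6}))  # True: the triangle, a maximal clique
print(is_clique({2, 5}))     # True: every edge is a 2-node clique
```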


Separation

Let A,B,C be three disjoint subsets of V

C separates A from B if every path from a node in A to a node in B contains some node in C

Example: C = {1, 4, 6} separates A = {3} from B = {2, 5}
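
The same test in code: a sketch that checks separation by forbidding breadth-first search from entering C, on the example graph above.

```python
# Separation check: C separates A from B iff BFS from A, forbidden to
# enter C, never reaches B. Same example graph as above.
from collections import deque

E = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}
adj = {}
for a, b in E:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

def separates(C, A, B):
    seen = set(A)
    queue = deque(A)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in C and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen.isdisjoint(B)

print(separates({1, 4, 6}, {3}, {2, 5}))  # True
```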


Markov chains

Graphical model: associate each node of a graph with a random variable (or a collection thereof)

Homogeneous 1-D Markov chain:

p(xn|xi, i < n) = p(xn|xn−1)

Probability of a sequence given by:

p(x) = p(x0) ∏_{n=1}^{N} p(xn | xn−1)
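
A minimal sketch of this factorization; the initial distribution pi0 and transition matrix P below are illustrative, not from the slides:

```python
import numpy as np

pi0 = np.array([0.5, 0.5])        # p(x0) over states {0, 1}
P = np.array([[0.9, 0.1],         # P[i, j] = p(x_n = j | x_{n-1} = i)
              [0.2, 0.8]])

def sequence_probability(x):
    # p(x) = p(x0) * prod_{n>=1} p(x_n | x_{n-1})
    p = pi0[x[0]]
    for prev, cur in zip(x, x[1:]):
        p *= P[prev, cur]
    return p

print(sequence_probability([0, 0, 1, 1]))  # 0.5 * 0.9 * 0.1 * 0.8 = 0.036
```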


2-D Markov chains

Advantages:

Simple expressions for probability

Simple parameter estimation

Disadvantages:

No natural ordering of image pixels

Anisotropic model behavior


Random fields on graphs

Consider a collection of random variables x = (x1, x2, . . . , xN) with associated joint probability distribution p(x)

Let A, B, C be three disjoint subsets of V. Let xA denote the collection of random variables in A.

Conditional independence: A ⊥⊥ B | C

A ⊥⊥ B | C ⇔ p(xA, xB | xC) = p(xA | xC) p(xB | xC)

Markov random field: an undirected graphical model in which each node corresponds to a random variable (or a collection of random variables), and the edges identify conditional dependencies.


Markov properties

Pairwise Markovianity:

(ni, nj) ∉ E ⇒ xi and xj are independent when conditioned on all other variables:

p(xi, xj | x\{i,j}) = p(xi | x\{i,j}) p(xj | x\{i,j})

Local Markovianity:

Given its neighborhood, a variable is independent of the rest of the variables:

p(xi | xV\{i}) = p(xi | xN(i))

Global Markovianity:

Let A, B, C be three disjoint subsets of V. If

C separates A from B ⇒ p(xA, xB | xC) = p(xA | xC) p(xB | xC)

holds for all such A, B, C, then p(·) is global Markov w.r.t. G.


Hammersley-Clifford Theorem

Consider a random field x on a graph G, such that p(x) > 0. Let C denote the set of all maximal cliques of the graph.

If the field has the local Markov property, then p(x) can be written as a Gibbs distribution:

p(x) = (1/Z) exp{ −∑_{C∈C} V_C(x_C) },

where Z, the normalizing constant, is called the partition function, and the V_C(x_C) are the clique potentials.

If p(x) can be written in Gibbs form for the cliques of some graph, then it has the global Markov property.

Fundamental consequence: every Markov random field can be specified via clique potentials.
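
To make the Gibbs form concrete, a brute-force sketch on a toy three-node cycle with pairwise disagreement potentials; the potential and the exhaustive computation of Z are illustrative, and enumeration is feasible only for tiny models:

```python
import itertools
import numpy as np

cliques = [(0, 1), (1, 2), (0, 2)]           # pairwise cliques of a 3-cycle

def V(xi, xj):
    # Disagreement potential: neighboring variables prefer equal values.
    return 1.0 if xi != xj else 0.0

def energy(x):
    return sum(V(x[i], x[j]) for i, j in cliques)

states = list(itertools.product([0, 1], repeat=3))
Z = sum(np.exp(-energy(x)) for x in states)  # partition function

def p(x):
    return np.exp(-energy(x)) / Z

print(p((0, 0, 0)) > p((0, 1, 0)))  # True: agreement is more probable
print(sum(p(x) for x in states))    # 1.0
```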


Regular rectangular lattices

V = {(i, j), i = 1, . . . ,M, j = 1, . . . , N}

Order-K neighborhood system:

N_K(i, j) = {(m, n) : (i − m)² + (j − n)² ≤ K}
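
A short sketch enumerating the order-K neighborhood offsets defined above:

```python
def order_k_offsets(K):
    # {(dm, dn) != (0, 0) : dm^2 + dn^2 <= K}
    r = int(K ** 0.5)
    return [(dm, dn)
            for dm in range(-r, r + 1) for dn in range(-r, r + 1)
            if (dm, dn) != (0, 0) and dm * dm + dn * dn <= K]

print(order_k_offsets(1))       # 4-neighborhood: [(-1, 0), (0, -1), (0, 1), (1, 0)]
print(len(order_k_offsets(2)))  # 8: order-2 adds the diagonals
```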


Auto-models

Only pair-wise interactions

In terms of clique potentials: |C| > 2 ⇒ V_C(·) = 0

Simplest possible neighborhood models


Gauss-Markov Random Fields (GMRF)

Joint probability density (assuming zero mean):

p(x) = 1/((2π)^{n/2} |Σ|^{1/2}) exp{ −(1/2) xᵀΣ⁻¹x }

Quadratic form in the exponent:

xᵀΣ⁻¹x = ∑_i ∑_j x_i x_j (Σ⁻¹)_{i,j} ⇒ auto-model

The neighborhood system is determined by the potential matrix Σ⁻¹

Local conditionals are univariate Gaussian
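
A sketch of the last point: for a zero-mean GMRF with precision matrix Q = Σ⁻¹, the conditional of x_i given the rest is univariate Gaussian with mean −(1/Q_ii) ∑_{j≠i} Q_ij x_j and variance 1/Q_ii, so only neighbors (Q_ij ≠ 0) enter. The chain-structured Q below is illustrative:

```python
import numpy as np

# Precision matrix of a zero-mean GMRF on the chain 0 - 1 - 2 (illustrative).
Q = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

x = np.array([0.3, 0.0, -0.5])
i = 1
# Conditional of x_i given all the others: univariate Gaussian whose mean
# involves only the neighbors of i (the nonzero off-diagonal entries of Q).
mu_i = -sum(Q[i, j] * x[j] for j in range(len(x)) if j != i) / Q[i, i]
var_i = 1.0 / Q[i, i]
print(mu_i, var_i)  # -0.1, 0.5
```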


Gauss-Markov Random Fields

Specification via clique potentials:

V_C(x_C) = (1/2) (∑_{i∈C} α_i^C x_i)² = (1/2) (∑_{i∈V} α_i^C x_i)²,

as long as i ∉ C ⇒ α_i^C = 0

The exponent of the GMRF density becomes:

−∑_{C∈C} V_C(x_C) = −(1/2) ∑_{C∈C} (∑_{i∈V} α_i^C x_i)² = −(1/2) ∑_{i∈V} ∑_{j∈V} (∑_{C∈C} α_i^C α_j^C) x_i x_j = −(1/2) xᵀΣ⁻¹x,

so that (Σ⁻¹)_{ij} = ∑_{C∈C} α_i^C α_j^C.


GMRF: Application to image processing

Classical image “smoothing” prior

Consider an image to be a rectangular lattice with first-order pixel neighborhoods

Cliques: pairs of vertically or horizontally adjacent pixels

Clique potentials: squares of first-order differences (a discrete approximation of the continuous derivative)

V_{{(i,j),(i,j−1)}}(x_{i,j}, x_{i,j−1}) = (1/2)(x_{i,j} − x_{i,j−1})²

Resulting Σ⁻¹: block-tridiagonal with tridiagonal blocks
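
A sketch that builds this Σ⁻¹ on a small lattice: summing the potentials (1/2)(x_a − x_b)² over adjacent pairs gives (1/2) xᵀ(DᵀD)x, so Σ⁻¹ = DᵀD for a first-order difference operator D, and the block-tridiagonal structure can be inspected. The lattice size is illustrative; note this Σ⁻¹ is singular (constants are unpenalized), the usual improper smoothing prior.

```python
import numpy as np

M, N = 3, 4
idx = np.arange(M * N).reshape(M, N)        # raster (row-major) pixel order

rows = []
for i in range(M):
    for j in range(N):
        if j + 1 < N:                       # horizontal clique {(i,j), (i,j+1)}
            d = np.zeros(M * N); d[idx[i, j]] = 1.0; d[idx[i, j + 1]] = -1.0
            rows.append(d)
        if i + 1 < M:                       # vertical clique {(i,j), (i+1,j)}
            d = np.zeros(M * N); d[idx[i, j]] = 1.0; d[idx[i + 1, j]] = -1.0
            rows.append(d)

D = np.array(rows)
Q = D.T @ D                                 # Sigma^{-1}: block-tridiagonal
print(Q[:N, :N])                            # a tridiagonal diagonal block
```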


Bayesian image restoration with GMRF prior

Observation model:

y = Hx + n,  n ∼ N(0, σ²I)

Smoothing GMRF prior: p(x) ∝ exp{−(1/2) xᵀΣ⁻¹x}

MAP estimate:

x̂ = [σ²Σ⁻¹ + HᵀH]⁻¹ Hᵀy
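
A minimal sketch of this closed-form MAP estimate on a tiny 1-D example; H, σ², and the chain precision Q are illustrative, and a linear solve replaces the explicit inverse:

```python
import numpy as np

n = 5
# 1-D chain smoothing precision (illustrative stand-in for Sigma^{-1}).
Q = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
H = np.eye(n)                     # identity blur: pure denoising
sigma2 = 0.5

rng = np.random.default_rng(0)
x_true = np.cumsum(rng.normal(size=n))                # a smooth-ish signal
y = H @ x_true + np.sqrt(sigma2) * rng.normal(size=n)

# x_hat = (sigma^2 Sigma^{-1} + H^T H)^{-1} H^T y, via a linear solve.
x_hat = np.linalg.solve(sigma2 * Q + H.T @ H, H.T @ y)
print(np.round(x_hat, 3))
```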


Bayesian image restoration with GMRF prior

Figure: (a) Original image, (b) Blurred and slightly noisy image, (c) Restored version of (b), (d) No blur, severe noise, (e) Restored version of (d).

Deblurring: good

Denoising: oversmoothing; “edge discontinuities” smoothed out

How to preserve discontinuities?

Other prior models

Hidden/latent binary random variables

Robust potential functions (e.g. L2 vs. L1-norm)


Compound GMRF

Insert binary variables v to “turn off” clique potentials

Modified clique potentials:

V(x_{i,j}, x_{i,j−1}, v_{i,j}) = (1/2)(1 − v_{i,j})(x_{i,j} − x_{i,j−1})²

Intuitive explanation:

v = 0 ⇒ clique potential is quadratic (“on”)

v = 1 ⇒ V_C(·) = 0 → no smoothing; the image has an edge at this location

Can choose separate latent variables v and h for vertical and horizontal edges respectively:

p(x | h, v) ∝ exp{−(1/2) xᵀΣ⁻¹(h, v) x}

MAP estimate:

x̂ = [σ²Σ⁻¹(h, v) + HᵀH]⁻¹ Hᵀy
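
A sketch of Σ⁻¹(h, v): each pairwise potential (1/2)(1 − v)(x_a − x_b)² contributes its difference row only when its gate variable is 0. The gating convention below (h for horizontal cliques, v for vertical ones) is an illustrative choice:

```python
import numpy as np

def compound_precision(M, N, h, v):
    # Difference rows scaled by sqrt(1 - gate); gate = 1 removes the clique.
    idx = np.arange(M * N).reshape(M, N)
    rows = []
    for i in range(M):
        for j in range(N):
            if j + 1 < N:   # horizontal clique, gated by h[i, j]
                d = np.zeros(M * N); d[idx[i, j]] = 1.0; d[idx[i, j + 1]] = -1.0
                rows.append(np.sqrt(1.0 - h[i, j]) * d)
            if i + 1 < M:   # vertical clique, gated by v[i, j]
                d = np.zeros(M * N); d[idx[i, j]] = 1.0; d[idx[i + 1, j]] = -1.0
                rows.append(np.sqrt(1.0 - v[i, j]) * d)
    D = np.array(rows)
    return D.T @ D          # Sigma^{-1}(h, v)

M, N = 3, 3
h = np.zeros((M, N)); v = np.zeros((M, N))
v[0, 1] = 1.0               # declare an edge between pixels (0,1) and (1,1)
Q = compound_precision(M, N, h, v)
print(Q[1, 4])              # 0.0: smoothing across that edge is switched off
```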


Discontinuity-preserving restoration

Convex potentials:

Generalized Gaussians: V(x) = |x|^p, p ∈ [1, 2]

Stevenson: V(x) = x² for |x| < a;  V(x) = 2a|x| − a² for |x| ≥ a

Green: V(x) = 2a² log cosh(x/a)

Non-convex potentials:

Blake, Zisserman: V(x) = (min{|x|, a})²

Geman, McClure: V(x) = x² / (x² + a²)
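
Sketches of these potentials as vectorized functions; a is the scale/threshold parameter in each case:

```python
import numpy as np

def generalized_gaussian(x, p=1.5):       # convex for p in [1, 2]
    return np.abs(x) ** p

def stevenson(x, a=1.0):                  # quadratic center, linear tails
    return np.where(np.abs(x) < a, x ** 2, 2 * a * np.abs(x) - a ** 2)

def green(x, a=1.0):                      # smooth convex approximation
    return 2 * a ** 2 * np.log(np.cosh(x / a))

def blake_zisserman(x, a=1.0):            # non-convex truncated quadratic
    return np.minimum(np.abs(x), a) ** 2

def geman_mcclure(x, a=1.0):              # non-convex, bounded by 1
    return x ** 2 / (x ** 2 + a ** 2)

x = np.linspace(-3.0, 3.0, 7)
print(stevenson(x))   # grows linearly (not quadratically) for |x| >= a
```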


Ising model (2-D MRF)

V(xi, xj) = β δ(xi ≠ xj), where β is a model parameter

Energy function:

∑_{C∈C} V_C(x_C) = β · (boundary length)

Longer boundaries are less probable
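
A sketch of this energy: β times the number of horizontally or vertically adjacent label pairs that disagree, i.e. the boundary length of the label image:

```python
import numpy as np

def ising_energy(x, beta):
    # beta * (number of disagreeing horizontal + vertical neighbor pairs)
    disagreements = (np.sum(x[:, 1:] != x[:, :-1]) +
                     np.sum(x[1:, :] != x[:-1, :]))
    return beta * disagreements

x = np.array([[0, 0, 1],
              [0, 0, 1],
              [0, 0, 1]])
print(ising_energy(x, beta=1.0))  # 3.0: a straight boundary of length 3
```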


Application: Image segmentation

Discrete MRF used to model the segmentation field

Each class represented by a value Xs ∈ {0, . . . ,M − 1}

Joint distribution function:

P{Y ∈ dy,X = x} = p(y|x)p(x)

(Bayesian) MAP estimation:

X̂ = arg max_x p_{x|Y}(x|Y) = arg max_x (log p(Y|x) + log p(x))


MAP optimization for segmentation

Data model:

p_{y|x}(y|x) = ∏_{s∈S} p(ys|xs)

Prior (Ising) model:

p_x(x) = (1/Z) exp{−β t₁(x)},

where t₁(x) is the number of horizontally and vertically adjacent pixel pairs of x with different values

MAP estimate:

x̂ = arg min_x {−log p_{y|x}(y|x) + β t₁(x)}

Hard optimization problem
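
A sketch of this objective under an illustrative Gaussian observation model p(ys|xs) = N(mu[xs], tau2); the class means mu and variance tau2 are assumptions, not from the slides:

```python
import numpy as np

mu = np.array([0.0, 1.0])    # illustrative per-class means, classes {0, 1}
tau2 = 0.25                  # illustrative observation noise variance

def t1(x):
    # number of horizontal + vertical neighbor pairs with different labels
    return np.sum(x[:, 1:] != x[:, :-1]) + np.sum(x[1:, :] != x[:-1, :])

def map_objective(y, x, beta):
    # -log p(y|x) (up to an x-independent constant) + beta * t1(x)
    return np.sum((y - mu[x]) ** 2) / (2 * tau2) + beta * t1(x)

x = np.zeros((4, 4), dtype=int); x[:, 2:] = 1
y = mu[x] + 0.3 * np.random.default_rng(0).normal(size=x.shape)
print(map_objective(y, x, beta=0.5))
```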


Some proposed approaches

Iterated conditional modes (ICM): iterative minimization w.r.t. each pixel

Simulated annealing: generate samples from the posterior distribution while lowering a temperature parameter

Multi-scale resolution segmentation

Figure: (a) Synthetic image with three textures, (b) ICM, (c) Simulated annealing, (d) Multi-resolution approach.
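
A sketch of ICM for the objective above: sweep the pixels, replacing each label by the locally cheapest choice; no sweep can increase the objective, so it converges to a local minimum. The Gaussian class model is the same illustrative assumption as before:

```python
import numpy as np

def icm(y, mu, tau2, beta, n_sweeps=10):
    x = np.argmin((y[..., None] - mu) ** 2, axis=-1)   # ML initialization
    M, N = y.shape
    for _ in range(n_sweeps):
        for i in range(M):
            for j in range(N):
                nbrs = [x[a, b]
                        for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= a < M and 0 <= b < N]
                # local cost of each label: data term + beta * disagreements
                costs = [(y[i, j] - mu[k]) ** 2 / (2 * tau2)
                         + beta * sum(k != n for n in nbrs)
                         for k in range(len(mu))]
                x[i, j] = int(np.argmin(costs))
    return x

rng = np.random.default_rng(1)
truth = np.zeros((8, 8), dtype=int); truth[:, 4:] = 1
y = np.array([0.0, 1.0])[truth] + 0.4 * rng.normal(size=truth.shape)
print(icm(y, mu=np.array([0.0, 1.0]), tau2=0.16, beta=0.7))
```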


Summary

Graphical models study probability distributions whose conditional dependencies arise out of specific graph structures

A Markov random field is an undirected graphical model with special factorization properties

2-D MRFs have been widely used as priors in image processing problems

The choice of potential functions leads to different optimization problems
