
Probabilistic Programming for Evolutionary Biology

Benjamin Redelings

June 24, 2014

Probabilistic Programming

Easy to think ⇒ easy to write, easy to run

1. Write model description, generate inference method.

2. Modular models.

3. Don't resort to C++/Java to write simple things.

4. Allow graphical models with changing graphs & data structures.

5. Lazy computation for MCMC.
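
The first item is the central promise: the model is written once, and the inference machinery is derived from that one description. Below is a minimal sketch of the idea in plain Haskell (illustrative only, not BAli-Phy's actual API): the model is described by a log-density, and a generic Metropolis kernel is derived from it. Requires the random package.

import System.Random (StdGen, mkStdGen, randomR)

-- Model description, written once: log-density of Normal(0,1).
logDensity :: Double -> Double
logDensity x = -0.5 * x * x - 0.5 * log (2 * pi)

-- Generic inference method: one Metropolis step for any log-density.
metropolisStep :: (Double -> Double) -> Double -> StdGen -> (Double, StdGen)
metropolisStep logD x g =
  let (dx, g1) = randomR (-0.5, 0.5) g      -- symmetric random-walk proposal
      x'       = x + dx
      (u, g2)  = randomR (0.0, 1.0) g1
  in (if log u < logD x' - logD x then x' else x, g2)

-- A short chain built from the same two ingredients.
chain :: Int -> Double -> StdGen -> [Double]
chain 0 _ _ = []
chain n x g = let (x', g') = metropolisStep logDensity x g
              in x' : chain (n - 1) x' g'

main :: IO ()
main = print (take 5 (chain 100 0.0 (mkStdGen 42)))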


Graphical Models

x ∼ normal (if i then y else z, σ²)

x ∼ normal (if i then y else z.1, σ²)
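
With control flow inside the model, the dependency structure is itself computed: when i is true, x depends on y; otherwise on z. A small Haskell sketch of this point (the Expr type and activeParents are hypothetical illustrations, not BAli-Phy's machinery): the set of parents actually in use is read off the if-expression under the current values of the discrete variables.

-- A tiny expression language for node arguments.
data Expr = Lit Bool | Var String | If Expr Expr Expr

-- Parents actually used, given current values of the Boolean variables.
-- For "if i then y else z", only the taken branch contributes an edge.
activeParents :: (String -> Bool) -> Expr -> [String]
activeParents env e = case e of
    Lit _    -> []
    Var v    -> [v]
    If c t f -> activeParents env c
             ++ activeParents env (if evalB c then t else f)
  where
    evalB (Lit b)       = b
    evalB (Var v)       = env v
    evalB (If c' t' f') = if evalB c' then evalB t' else evalB f'

main :: IO ()
main = do
  let meanExpr = If (Var "i") (Var "y") (Var "z")
      env "i"  = True
      env _    = False
  print (activeParents env meanExpr)   -- with i = True: ["i","y"]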

Extensions of Graphical Models

1. Control flow
   - x ∼ normal (if i then y else z, σ²)
   - x[i] = z[category[i]]
   - x[i] ∼ normal (x[parent[i]], σ²)

2. Data structures (native)
   - (x, y)
   - [x, y, z]
   - ReversibleMarkov α Q π Λ

3. Random numbers of random variables (see the sketch after this list)
   - n ∼ geometric 0.5
   - x ∼ iid n (normal 0 1)
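
Point 3 is worth making concrete: the number of variables in the model is itself random. A hedged Haskell sketch that mirrors the slide's notation, drawing n from a geometric and then n iid normals (the geometric, normal, and iid helpers are spelled out for illustration and are not claimed to be BAli-Phy's primitives; requires the random package):

import System.Random (StdGen, mkStdGen, randomR)

-- Number of failures before the first success, so n ∈ {0, 1, 2, ...}.
geometric :: Double -> StdGen -> (Int, StdGen)
geometric p g =
  let (u, g') = randomR (0.0, 1.0) g
  in if u < p then (0, g')
              else let (k, g'') = geometric p g' in (k + 1, g'')

-- Normal sampler via the Box-Muller transform.
normal :: Double -> Double -> StdGen -> (Double, StdGen)
normal mu sigma g =
  let (u1, g1) = randomR (1e-9, 1.0) g
      (u2, g2) = randomR (0.0, 1.0) g1
  in (mu + sigma * sqrt (-2 * log u1) * cos (2 * pi * u2), g2)

-- iid n d: n independent draws from the sampler d.
iid :: Int -> (StdGen -> (Double, StdGen)) -> StdGen -> ([Double], StdGen)
iid 0 _ g = ([], g)
iid n d g = let (x, g')   = d g
                (xs, g'') = iid (n - 1) d g'
            in (x : xs, g'')

main :: IO ()
main = do
  let g0      = mkStdGen 7
      (n, g1) = geometric 0.5 g0        -- n ∼ geometric 0.5
      (xs, _) = iid n (normal 0 1) g1   -- x ∼ iid n (normal 0 1)
  print (n, xs)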


Models (and Distributions) are functions

Consider the M0(κ, ω) codon model for positive selection.

- Really, HKY(κ) is a nucleotide submodel → M0(HKY(κ), ω)

We really want models to be parameterized by other models

- ... and distributions parameterized by distributions!
- logNormal µ σ = expTransform (normal µ σ)
- dirichlet_process n α (normal 0 1)

Sometimes we want models parameterized by functions on models (sketched below).

- M8 = mixture (beta a b) (\w -> m0(k,w))
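
A hedged sketch of this slide in plain Haskell: if a distribution is simply a sampling function, then expTransform and mixture fall out as ordinary combinators taking distributions (or functions that return distributions) as arguments. The names mirror the slide, but the beta mixing distribution is replaced by a normal to keep the sketch short, and none of this is BAli-Phy's real implementation. Requires the random package.

import System.Random (StdGen, mkStdGen, randomR)

-- A distribution is a first-class value: a sampling function.
type Dist a = StdGen -> (a, StdGen)

normal :: Double -> Double -> Dist Double
normal mu sigma g =
  let (u1, g1) = randomR (1e-9, 1.0) g
      (u2, g2) = randomR (0.0, 1.0) g1
  in (mu + sigma * sqrt (-2 * log u1) * cos (2 * pi * u2), g2)

-- A distribution parameterized by another distribution.
expTransform :: Dist Double -> Dist Double
expTransform d g = let (x, g') = d g in (exp x, g')

logNormal :: Double -> Double -> Dist Double
logNormal mu sigma = expTransform (normal mu sigma)

-- A model parameterized by a function on models (cf. M8): draw w from
-- the mixing distribution, then sample from the model built from w.
mixture :: Dist Double -> (Double -> Dist a) -> Dist a
mixture mixing component g =
  let (w, g') = mixing g in component w g'

main :: IO ()
main = do
  let (x, _) = logNormal 0 1 (mkStdGen 3)
      (y, _) = mixture (normal 0 1) (\w -> normal w 0.1) (mkStdGen 4)
  print (x, y)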


Future Work

1. Dynamic instantiation of random variables (see the sketch below):
   - xs = repeat (normal 0 1)
   - n = geometric 0.5
   - y = f (take n xs)
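
A hedged Haskell sketch of why laziness makes this work: an infinite supply of normal draws is defined up front, and only the n draws demanded by take are ever computed. The helpers are illustrative (f is stood in by sum), not BAli-Phy's API; requires the random package.

import System.Random (StdGen, mkStdGen, randomR, split)

-- One Box-Muller normal draw (the final generator state is discarded,
-- since each draw below gets its own split generator).
normal :: Double -> Double -> StdGen -> Double
normal mu sigma g =
  let (u1, g1) = randomR (1e-9, 1.0) g
      (u2, _)  = randomR (0.0, 1.0) g1
  in mu + sigma * sqrt (-2 * log u1) * cos (2 * pi * u2)

-- An infinite, lazily evaluated stream of iid Normal(0,1) draws.
normals :: StdGen -> [Double]
normals g = let (g1, g2) = split g in normal 0 1 g1 : normals g2

geometric :: Double -> StdGen -> Int
geometric p g = let (u, g') = randomR (0.0, 1.0) g
                in if u < p then 0 else 1 + geometric p g'

main :: IO ()
main = do
  let (ga, gb) = split (mkStdGen 9)
      xs = normals ga        -- xs = repeat (normal 0 1), an infinite list
      n  = geometric 0.5 gb  -- n = geometric 0.5
      y  = sum (take n xs)   -- y = f (take n xs), with f = sum here
  print (n, y)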


Source

https://github.com/bredelings/BAli-Phy

Other software for Bayesian inference:

- RevBayes
- BEAST 1
- BEAST 2
- Church
- Venture


