de Valpine, NIMBLE (transcript)
NIMBLE
(First release by summer)
model language (BUGS)
Algorithm language
Why algorithm programmability?
• Simple advanced MCMC: block updaters, multiple updaters, multiple scales, modularity
• Advanced advanced MCMC: bridge sampling, reversible jump, adaptive MCMC, tempering, approximating models
• Particle filtering
• Importance sampling
• Laplace approximation, adaptive Gaussian quadrature
• Kalman filtering, unscented KF, extended KF
• Prior sensitivity analysis by reweighting
• Monte Carlo Expectation Maximization (MCEM)
• Data cloning
• Approximate Bayesian Computation
• Posterior predictive checks (without re-running MCMC)
• Normalizing constants: bridge sampling; see above
• Combination algorithms: PF + MCMC, MCMC + Laplace
Algorithm developers need a system for model flexibility and distribution.
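As one concrete example of the kind of algorithm such a system should make easy to write, here is a minimal importance-sampling sketch in plain R. This is illustrative only, not NIMBLE code: it estimates an expectation under a target density by drawing from a proposal and reweighting.

```r
# Importance sampling sketch: estimate E[X^2] = 1 for X ~ N(0, 1)
# using draws from a wider proposal N(0, 2^2).
set.seed(1)
n <- 1e5
x <- rnorm(n, mean = 0, sd = 2)           # proposal draws
logw <- dnorm(x, 0, 1, log = TRUE) -      # log target density
        dnorm(x, 0, 2, log = TRUE)        # minus log proposal density
w <- exp(logw - max(logw))                # stabilize before normalizing
w <- w / sum(w)                           # self-normalized weights
estimate <- sum(w * x^2)                  # should be close to 1
```

The same reweighting idea underlies the prior-sensitivity-by-reweighting and bridge-sampling entries in the list above.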
What we do with BUGS models

model {
  mu ~ dnorm(0, 10)
  logit(p) ~ dnorm(mu, sigma)
  y ~ dbin(N, p)
}

• Extract all semantic relationships.
• Build a graphical model object in R (igraph).
• Generate C++ code.
• Compile, load, and provide interface objects.

mymodel$mu <- 5
toCalc <- getDependencies(mymodel, "mu")
calculate(mymodel, toCalc)

Some "automatic" extensions of BUGS:
1. Expressions as arguments
2. Multiple parameterizations for distributions (e.g. precision or std. deviation)
3. Compile-time if-then-else
4. Single-line motifs: y[1:10] ~ glmm(X * A + (1 | Group), family = "binomial")
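To make the `getDependencies`/`calculate` calls concrete, here is a self-contained plain-R sketch of what such a graphical model object supports. The real objects are built from igraph and compiled C++; everything below (the `children` list, `getDeps`, `calc`, the sd parameterization of `dnorm`) is a hypothetical stand-in.

```r
# Toy "model": mu -> logit_p -> y, stored as an environment of values
# plus a hand-written child list standing in for the graph structure.
vals <- new.env()
vals$mu <- 0; vals$sigma <- 1; vals$logit_p <- 0; vals$N <- 10; vals$y <- 3

children <- list(mu = "logit_p", logit_p = "y", y = character(0))

# Collect a node and everything downstream of it (a crude getDependencies)
getDeps <- function(node) {
  out <- node
  frontier <- children[[node]]
  while (length(frontier) > 0) {
    out <- c(out, frontier[1])
    frontier <- c(frontier[-1], children[[frontier[1]]])
  }
  out
}

# Sum the log densities of the requested nodes (a crude calculate())
logDens <- list(
  mu      = function() dnorm(vals$mu, 0, sd = 10, log = TRUE),
  logit_p = function() dnorm(vals$logit_p, vals$mu, vals$sigma, log = TRUE),
  y       = function() dbinom(vals$y, vals$N, plogis(vals$logit_p), log = TRUE)
)
calc <- function(nodes) sum(vapply(nodes, function(n) logDens[[n]](), 0))

vals$mu <- 5               # mirrors: mymodel$mu <- 5
toCalc <- getDeps("mu")    # mirrors: getDependencies(mymodel, "mu")
ll <- calc(toCalc)         # mirrors: calculate(mymodel, toCalc)
```

The point of the interface is exactly this: change one node's value, ask the graph what depends on it, and recompute only those log densities.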
How we write algorithms: random-walk MCMC updater

1. setup arguments: model, targetNode
2. setup code: use model structure to query targetNode dependencies
   (Processed ONCE, in R)
3. run-time code: (i) put a proposal value in the targetNode; (ii) calculate needed log likelihoods; (iii) accept or reject
   (Becomes C++ code, compiled & interfaced)

Building a program with many functions:
• model structure (setup arguments) processed in R
• algorithm executed in C++

We provide data structures for sets of model values (e.g. MCMC output).
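The three run-time steps can be sketched in plain R. This is not the generated C++ or the actual updater code; `logDensity` below is a hypothetical stand-in for "calculate needed log likelihoods" over the target node and its dependencies.

```r
# Target: standard normal log density, standing in for the sum of
# log densities of targetNode and its dependencies.
logDensity <- function(x) dnorm(x, 0, 1, log = TRUE)

# One random-walk Metropolis update for a scalar target node.
rwUpdate <- function(current, scale = 1) {
  proposal <- current + rnorm(1, 0, scale)   # (i) propose a new value
  logRatio <- logDensity(proposal) -         # (ii) needed log likelihoods
              logDensity(current)
  if (log(runif(1)) < logRatio) proposal     # (iii) accept ...
  else current                               #       ... or reject
}

set.seed(2)
samples <- numeric(2000)
x <- 0
for (i in seq_along(samples)) { x <- rwUpdate(x); samples[i] <- x }
```

In the two-stage design described above, only the body of `rwUpdate` would become compiled code; the query for which log densities to include happens once, in R, at setup time.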
Connections to other engines

When another system has a great algorithm, we want to wrap access to it.

Example: Monte Carlo Expectation Maximization (MCEM)
• Uses MCMC repeatedly on latent states
• Parameters updated between MCMCs
• We don't have to use our own MCMC if someone else has a great one

How? We have a high-level representation of model structure. We can generate the specifications needed for other packages.
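A minimal MCEM sketch in plain R for a toy model (latent z_i ~ N(mu, 1), observed y_i ~ N(z_i, 1)). For simplicity the Monte Carlo E-step draws from the exact conditional of z rather than running a full MCMC, but any external MCMC engine could produce those draws instead, which is the point of the slide. The model and all names here are illustrative.

```r
# MCEM for mu in: z_i ~ N(mu, 1) latent, y_i ~ N(z_i, 1) observed.
# The marginal MLE is mean(y), so the loop should converge there.
set.seed(3)
y <- rnorm(20, mean = 2, sd = sqrt(2))  # data from the marginal N(2, 2)
mu <- 0                                 # starting value
for (iter in 1:50) {
  # Monte Carlo E-step: draw latent states given y and current mu.
  # Here z_i | y_i, mu ~ N((mu + y_i)/2, 1/2) exactly; a wrapped MCMC
  # engine could supply these samples instead.
  z <- replicate(200, rnorm(length(y), (mu + y) / 2, sqrt(1/2)))
  # M-step: maximize the expected complete-data log likelihood in mu,
  # which for this model is just the average of the sampled z's.
  mu <- mean(z)
}
```

Only the E-step touches the latent states, so swapping in a different sampler changes one line, provided the wrapper can hand the other engine a specification of the conditional model.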
Who we think will use NIMBLE
1. Algorithm developers
2. Data analysts
3. Workflow pipelines
NIMBLE: Numerical Integration of Mixture Models for Bayesian and Likelihood Estimation

Team: Perry de Valpine, Chris Paciorek, Daniel Turek, Cliff Anderson-Bergmann, Ras Bodik, Duncan Temple Lang

Funding: NSF Advances in Biological Informatics
Common situation
model language
(Black box algorithm)
How we build an MCMC

1. R functions use model/graph objects in R to inspect a particular model:
   - Build list of updaters
   - Can be inspected, modified
   - You can write your own
2. Generate C++ code for all updaters.
3. Compile, load, build and interface to objects.
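Step 1's "list of updaters" idea can be sketched in plain R (illustrative only: `makeRWUpdater`, `updaters`, and the toy model are hypothetical, and in the real system the list entries become generated C++). Each entry is a function for one node, so the list can be printed, modified, or extended with your own updater before the MCMC is assembled.

```r
# State of a toy two-parameter model, and a shared log density.
state <- c(a = 0, b = 0)
logDens <- function(s) sum(dnorm(s, 0, 1, log = TRUE))

# A factory that makes one random-walk updater for one named node.
makeRWUpdater <- function(node, scale = 1) {
  function(s) {
    prop <- s; prop[node] <- s[node] + rnorm(1, 0, scale)
    if (log(runif(1)) < logDens(prop) - logDens(s)) prop else s
  }
}

# Build the list of updaters by inspecting the model's nodes.
updaters <- lapply(names(state), makeRWUpdater)
names(updaters) <- names(state)

# The list can be inspected and modified: swap in your own updater
# (here, a smaller proposal scale) for node "b".
updaters[["b"]] <- makeRWUpdater("b", scale = 0.1)

# One MCMC sweep applies every updater in turn.
set.seed(4)
for (iter in 1:100) for (u in updaters) state <- u(state)
```

Because configuration happens on an ordinary R list before anything is compiled, users can audit and customize the sampler assignment without touching the generated code.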