mici 0.2.1


Mici is a Python package providing implementations of Markov chain Monte
Carlo (MCMC) methods for approximate inference in probabilistic models, with a
particular focus on MCMC methods based on simulating Hamiltonian dynamics on a
manifold.
Features
Key features include:

- a modular design allowing use of a wide range of inference algorithms by
  mixing and matching different components, and making it easy to
  extend the package,
- a pure Python code base with minimal dependencies,
  allowing easy integration within other code,
- implementations of MCMC methods for sampling from distributions on embedded
  manifolds implicitly defined by a constraint equation and distributions on
  Riemannian manifolds with a user-specified metric,
- computationally efficient inference via transparent caching of the results
  of expensive operations and of intermediate results calculated in derivative
  computations, allowing later reuse without recalculation,
- memory efficient inference for large models by memory-mapping chains to
  disk, allowing long runs on large models without hitting memory issues (see
  the sketch below).
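
As a minimal sketch of the memory-mapped sampling feature, assuming the
memmap_enabled and memmap_path keyword arguments to sample_chains described in
the API documentation, and a sampler and initial states constructed as in the
torus example at the end of this page:

# Minimal sketch (assumed keyword arguments): write chain traces and statistics
# to memory-mapped files under the given directory instead of holding them in
# memory, allowing long runs on large models
final_states, traces, stats = sampler.sample_chains(
    n_warm_up_iter=500,
    n_main_iter=2000,
    init_states=q_init,
    memmap_enabled=True,   # assumption: enables memory-mapping of chain outputs
    memmap_path="chains",  # assumption: directory for the memory-mapped files
)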

Installation
To install and use Mici, the minimal requirements are a Python 3.9+ environment
with NumPy and SciPy
installed. The latest Mici release on PyPI (and its dependencies) can be
installed in the current Python environment by running
pip install mici

To instead install the latest development version from the main branch on GitHub, run
pip install git+https://github.com/matt-graham/mici

If available in the installed Python environment, the following additional
packages provide extra functionality and features:

- Autograd: if available Autograd will
  be used to automatically compute the required derivatives of the model
  functions (providing they are specified using functions from the
  autograd.numpy and autograd.scipy interfaces). To sample chains in
  parallel using Autograd functions you also need to install
  multiprocess. This will
  cause multiprocess.Pool to be used in preference to the in-built
  multiprocessing.Pool for parallelisation as multiprocess supports
  serialisation (via dill) of a much
  wider range of types, including Autograd-generated functions. Both
  Autograd and multiprocess can be installed alongside Mici by running
  pip install mici[autodiff].
- ArviZ: if ArviZ is
  available the traces (dictionary) output of a sampling run can be directly
  converted to an arviz.InferenceData container object using
  arviz.convert_to_inference_data, or implicitly converted by passing the
  traces dictionary as the data argument to ArviZ API functions,
  allowing straightforward use of ArviZ's extensive visualisation and
  diagnostic functions, as in the short sketch below.
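
For example, a minimal sketch of the ArviZ integration, assuming traces is the
dictionary of traced variables returned by a completed sampling run (as in the
example at the end of this page):

import arviz

# Convert the traces dictionary returned by sampler.sample_chains into an
# arviz.InferenceData container and print summary statistics and diagnostics
inference_data = arviz.convert_to_inference_data(traces)
print(arviz.summary(inference_data))

# Equivalently, the traces dictionary can be passed directly to ArviZ API
# functions, which will convert it implicitly
arviz.plot_trace(traces)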

Why Mici?
Mici is named for Augusta 'Mici' Teller, who, along with
Arianna Rosenbluth, developed the code for the MANIAC I
computer used in the seminal paper Equations of State Calculations by Fast
Computing Machines, which introduced the
first example of a Markov chain Monte Carlo method.
Related projects
Other Python packages for performing MCMC inference include
PyMC,
PyStan (the Python interface to
Stan), Pyro /
NumPyro, TensorFlow
Probability,
emcee,
Sampyl and
BlackJAX.
Unlike PyMC, PyStan, (Num)Pyro and TensorFlow Probability, which are complete
probabilistic programming frameworks including functionality for defining a
probabilistic model / program, and like emcee, Sampyl and BlackJAX, Mici is solely
focussed on providing implementations of inference algorithms, with the user
expected to be able to define at a minimum a function specifying the negative
log (unnormalized) density of the distribution of interest.
Further, while PyStan, (Num)Pyro and TensorFlow Probability all push the
sampling loop into external compiled non-Python code, in Mici the sampling loop
is run directly within Python. This has the consequence that for small models
in which the negative log density of the target distribution and other model
functions are cheap to evaluate, the interpreter overhead in iterating over the
chains in Python can dominate the computational cost, making sampling much
slower than packages which outsource the sampling loop to an efficient compiled
implementation.
Overview of package
API documentation for the package is available online. The three main
user-facing modules within the mici package are the systems, integrators and
samplers modules, and you will generally need to create an instance of one
class from each module; a minimal sketch combining the three is given after
the module summaries below.
mici.systems -
Hamiltonian systems encapsulating model functions and their derivatives

- EuclideanMetricSystem - systems with a metric on the position space with
  a constant matrix representation,
- GaussianEuclideanMetricSystem - systems in which the target distribution
  is defined by a density with respect to the standard Gaussian measure on
  the position space, allowing the flow corresponding to the quadratic
  components of the Hamiltonian to be solved for analytically
  (Shahbaba et al., 2014),
- RiemannianMetricSystem - systems with a metric on the position space
  with a position-dependent matrix representation
  (Girolami and Calderhead, 2011),
- SoftAbsRiemannianMetricSystem - systems with the SoftAbs
  eigenvalue-regularized Hessian of the negative log target density as the
  metric matrix representation (Betancourt, 2013),
- DenseConstrainedEuclideanMetricSystem - Euclidean-metric systems subject
  to holonomic constraints
  (Hartmann and Schütte, 2005;
  Brubaker, Salzmann and Urtasun, 2012;
  Lelièvre, Rousset and Stoltz, 2019)
  with a dense constraint function Jacobian matrix.

mici.integrators -
symplectic integrators for Hamiltonian dynamics

- LeapfrogIntegrator - explicit leapfrog (Störmer-Verlet) integrator for
  separable Hamiltonian systems
  (Leimkuhler and Reich, 2004),
- ImplicitLeapfrogIntegrator - implicit (or generalized) leapfrog
  integrator for Hamiltonian systems with a non-separable component
  (Leimkuhler and Reich, 2004),
- ImplicitMidpointIntegrator - implicit midpoint
  integrator for general Hamiltonian systems
  (Leimkuhler and Reich, 2004),
- SymmetricCompositionIntegrator - family of symplectic integrators for
  Hamiltonians that can be split into components with tractable flow maps,
  with specific two-, three- and four-stage instantiations due to
  Blanes, Casas and Sanz-Serna (2014),
- ConstrainedLeapfrogIntegrator - constrained leapfrog integrator for
  Hamiltonian systems subject to holonomic constraints
  (Andersen, 1983;
  Leimkuhler and Reich, 1994).

mici.samplers - MCMC
samplers for performing inference

- StaticMetropolisHMC - static integration time Hamiltonian Monte Carlo
  with Metropolis accept step (Duane et al., 1987),
- RandomMetropolisHMC - random integration time Hamiltonian Monte Carlo
  with Metropolis accept step (Mackenzie, 1989),
- DynamicSliceHMC - dynamic integration time Hamiltonian Monte Carlo
  with slice sampling from the trajectory, equivalent to the original 'NUTS'
  algorithm (Hoffman and Gelman, 2014),
- DynamicMultinomialHMC - dynamic integration time Hamiltonian Monte Carlo
  with multinomial sampling from the trajectory, equivalent to the current
  default MCMC algorithm in Stan
  (Hoffman and Gelman, 2014;
  Betancourt, 2017).
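
As a minimal sketch of how the three modules fit together for an unconstrained
target (constructor signatures assumed from the API documentation; the torus
example at the end of this page shows the constrained case in full):

from mici import systems, integrators, samplers
import autograd.numpy as np

# Negative log density of an isotropic Gaussian target; as it is written with
# autograd.numpy, Autograd can supply the required derivatives automatically
def neg_log_dens(q):
    return 0.5 * np.sum(q**2)

# One class from each of the three main user-facing modules
system = systems.EuclideanMetricSystem(neg_log_dens)
integrator = integrators.LeapfrogIntegrator(system)
rng = np.random.default_rng(seed=1234)
sampler = samplers.DynamicMultinomialHMC(system, integrator, rng)

Sampling then proceeds via the sampler's sample_chains method, as in the full
example below.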

Notebooks
The manifold MCMC methods implemented in Mici have been used in several research projects. Below links are provided to a selection of Jupyter notebooks associated with these projects as demonstrations of how to use Mici and to illustrate some of the settings in which manifold MCMC methods can be computationally advantageous.


- Manifold lifting: MCMC in the vanishing noise regime - non-interactive
  version viewable with nbviewer, interactive versions runnable with Binder
  or Google Colab.
- Manifold MCMC methods for inference in diffusion models - non-interactive
  version viewable with nbviewer, interactive versions runnable with Binder
  or Google Colab.

Example: sampling on a torus

A simple complete example of using the package to compute approximate samples
from a distribution on a two-dimensional torus embedded in a three-dimensional
space is given below. The computed samples are visualized in the animation
generated at the end of the example. Here we use autograd to automatically
construct functions to calculate the required derivatives (gradient of the
negative log density of the target distribution and Jacobian of the constraint
function), sample four chains in parallel using multiprocess, use arviz to
calculate diagnostics and use matplotlib to plot and animate the samples.

⚠️ If you do not have multiprocess installed the example code below will hang or raise an error when sampling the chains as the inbuilt multiprocessing module does not support pickling Autograd functions.

from mici import systems, integrators, samplers
import autograd.numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
import matplotlib.animation as animation
import arviz

# Define fixed model parameters
R = 1.0 # toroidal radius ∈ (0, ∞)
r = 0.5 # poloidal radius ∈ (0, R)
α = 0.9 # density fluctuation amplitude ∈ [0, 1)

# Define constraint function such that the set {q : constr(q) == 0} is a torus
def constr(q):
    x, y, z = q.T
    return np.stack([((x**2 + y**2)**0.5 - R)**2 + z**2 - r**2], -1)

# Define negative log density for the target distribution on torus
# (with respect to 2D 'area' measure for torus)
def neg_log_dens(q):
    x, y, z = q.T
    θ = np.arctan2(y, x)
    ϕ = np.arctan2(z, x / np.cos(θ) - R)
    return np.log1p(r * np.cos(ϕ) / R) - np.log1p(np.sin(4*θ) * np.cos(ϕ) * α)

# Specify constrained Hamiltonian system with default identity metric
system = systems.DenseConstrainedEuclideanMetricSystem(neg_log_dens, constr)

# System is constrained therefore use constrained leapfrog integrator
integrator = integrators.ConstrainedLeapfrogIntegrator(system)

# Seed a random number generator
rng = np.random.default_rng(seed=1234)

# Use dynamic integration-time HMC implementation as MCMC sampler
sampler = samplers.DynamicMultinomialHMC(system, integrator, rng)

# Sample initial positions on torus using parameterisation (θ, ϕ) ∈ [0, 2π)²
# x, y, z = (R + r * cos(ϕ)) * cos(θ), (R + r * cos(ϕ)) * sin(θ), r * sin(ϕ)
n_chain = 4
θ_init, ϕ_init = rng.uniform(0, 2 * np.pi, size=(2, n_chain))
q_init = np.stack([
    (R + r * np.cos(ϕ_init)) * np.cos(θ_init),
    (R + r * np.cos(ϕ_init)) * np.sin(θ_init),
    r * np.sin(ϕ_init)], -1)

# Define function to extract variables to trace during sampling
def trace_func(state):
    x, y, z = state.pos
    return {'x': x, 'y': y, 'z': z}

# Sample 4 chains in parallel with 500 adaptive warm up iterations in which the
# integrator step size is tuned, followed by 2000 non-adaptive iterations
final_states, traces, stats = sampler.sample_chains(
    n_warm_up_iter=500,
    n_main_iter=2000,
    init_states=q_init,
    n_process=4,
    trace_funcs=[trace_func]
)

# Print average accept probability and number of integrator steps per chain
for c in range(n_chain):
    print(f"Chain {c}:")
    print(f"  Average accept prob. = {stats['accept_stat'][c].mean():.2f}")
    print(f"  Average number steps = {stats['n_step'][c].mean():.1f}")

# Print summary statistics and diagnostics computed using ArviZ
print(arviz.summary(traces))

# Visualize concatenated chain samples as animated 3D scatter plot
fig = plt.figure(figsize=(4, 4))
ax = Axes3D(fig, [0., 0., 1., 1.], proj_type='ortho')
points_3d, = ax.plot(*(np.concatenate(traces[k]) for k in 'xyz'), '.', ms=0.5)
ax.axis('off')
for set_lim in [ax.set_xlim, ax.set_ylim, ax.set_zlim]:
    set_lim((-1, 1))

def update(i):
    angle = 45 * (np.sin(2 * np.pi * i / 60) + 1)
    ax.view_init(elev=angle, azim=angle)
    return (points_3d,)

anim = animation.FuncAnimation(fig, update, frames=60, interval=100, blit=True)

References

Andersen, H.C., 1983. RATTLE: A “velocity”
version of the SHAKE algorithm for molecular dynamics calculations.
Journal of Computational Physics, 52(1), pp.24-34.

Duane, S., Kennedy, A.D., Pendleton, B.J. and
Roweth, D., 1987. Hybrid Monte Carlo. Physics Letters B, 195(2),
pp.216-222.

Mackenzie, P.B., 1989. An improved
hybrid Monte Carlo method. Physics Letters B, 226(3-4), pp.369-371.

Horowitz, A.M., 1991. A generalized
guided Monte Carlo algorithm. Physics Letters B, 268(CERN-TH-6172-91),
pp.247-252.

Leimkuhler, B. and Reich, S., 1994.
Symplectic integration of constrained Hamiltonian systems. Mathematics of
Computation, 63(208), pp.589-605.

Leimkuhler, B. and Reich, S., 2004.
Simulating Hamiltonian dynamics (Vol. 14). Cambridge University Press.

Hartmann, C. and Schütte, C., 2005. A
constrained hybrid Monte‐Carlo algorithm and the problem of calculating the
free energy in several variables. ZAMM ‐ Journal of Applied Mathematics and
Mechanics, 85(10), pp.700-710.

Girolami, M. and Calderhead, B., 2011.
Riemann manifold Langevin and Hamiltonian Monte Carlo methods. Journal of
the Royal Statistical Society: Series B (Statistical Methodology), 73(2), pp.123-214.

Brubaker, M., Salzmann, M. and
Urtasun, R., 2012. A family of MCMC methods on implicitly defined
manifolds. In Artificial intelligence and statistics (pp. 161-172).

Betancourt, M., 2013. A general metric
for Riemannian manifold Hamiltonian Monte Carlo. In Geometric science of
information (pp. 327-334).


Hoffman, M.D. and Gelman, A., 2014. The
No-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte
Carlo. Journal of Machine Learning Research, 15(1), pp.1593-1623.


Shahbaba, B., Lan, S., Johnson, W.O. and
Neal, R.M., 2014. Split Hamiltonian Monte Carlo. Statistics and
Computing, 24(3), pp.339-349.


Blanes, S., Casas, F. and Sanz-Serna, J.M., 2014.
Numerical integrators for the Hybrid Monte Carlo method.
SIAM Journal on Scientific Computing, 36(4), A1556-A1580.


Betancourt, M., 2017. A conceptual
introduction to Hamiltonian Monte Carlo.

Lelièvre, T., Rousset, M. and Stoltz, G.,
2019. Hybrid Monte Carlo methods for sampling probability measures on
submanifolds. Numerische Mathematik, 143(2), pp.379-421.
