Liesel is a probabilistic programming framework with a focus on semi-parametric regression. It includes a graph-based model building library and the MCMC library Goose.
The name “Liesel” is an homage to the Gänseliesel fountain, landmark of Liesel’s birth city Göttingen.
The following example shows how to build a simple i.i.d. normal model with Liesel. We set up two parameters and one observed variable, and combine them in a model.
import jax.numpy as jnp
import tensorflow_probability.substrates.jax.distributions as tfd

import liesel.model as lsl

# location and scale parameters of the normal distribution
loc = lsl.param(0.0, name="loc")
scale = lsl.param(1.0, name="scale")

# observed variable, normally distributed given loc and scale
y = lsl.obs(
    value=jnp.array([1.314, 0.861, -1.813, 0.587, -1.408]),
    distribution=lsl.Dist(tfd.Normal, loc=loc, scale=scale),
    name="y",
)

# combine the nodes into a model
model = lsl.Model([loc, scale, y])
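The model object keeps track of its variables in a dictionary keyed by their names; this is also how the loc variable is accessed again below. A minimal illustration (the printed representation is only schematic):

model.vars             # mapping of variable names to variables: "loc", "scale", "y"
model.vars["y"].value  # the observed data passed in above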
The model allows us to evaluate the log-probability through a property, which is updated automatically if the value of a node is modified.
model.log_prob
Array(-8.635652, dtype=float32)
model.vars["loc"].value = -0.5
model.log_prob
Array(-9.031153, dtype=float32)
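Besides the joint log-probability, the model also breaks the result down into its parts. A minimal sketch, assuming the property names log_lik and log_prior (they do not appear in the example above):

model.log_lik    # log-likelihood of the observed variable y
model.log_prior  # log-prior of the parameters loc and scale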
We can estimate the mean parameter with Goose and a NUTS sampler.
Goose’s workhorse to run an MCMC algorithm is the Engine, which can be constructed with the EngineBuilder. The builder allows us to assign different MCMC kernels to one or more parameters. We also need to specify the model, the initial values, and the sampling duration before we can run the sampler.
import liesel.goose as gs

builder = gs.EngineBuilder(seed=42, num_chains=4)
# sample the location parameter with a NUTS kernel
builder.add_kernel(gs.NUTSKernel(["loc"]))
# connect the builder to the Liesel model and its current state
builder.set_model(gs.LieselInterface(model))
builder.set_initial_values(model.state)
# 1000 warmup and 1000 posterior iterations per chain
builder.set_duration(warmup_duration=1000, posterior_duration=1000)
engine = builder.build()
liesel.goose.builder - WARNING - No jitter functions provided. The initial values won't be jittered
liesel.goose.engine - INFO - Initializing kernels...
liesel.goose.engine - INFO - Done
engine.sample_all_epochs()
liesel.goose.engine - INFO - Starting epoch: FAST_ADAPTATION, 75 transitions, 25 jitted together
liesel.goose.engine - WARNING - Errors per chain for kernel_00: 2, 1, 2, 0 / 75 transitions
liesel.goose.engine - INFO - Finished epoch
liesel.goose.engine - INFO - Starting epoch: SLOW_ADAPTATION, 25 transitions, 25 jitted together
liesel.goose.engine - WARNING - Errors per chain for kernel_00: 1, 1, 1, 1 / 25 transitions
liesel.goose.engine - INFO - Finished epoch
liesel.goose.engine - INFO - Starting epoch: SLOW_ADAPTATION, 50 transitions, 25 jitted together
liesel.goose.engine - WARNING - Errors per chain for kernel_00: 1, 1, 1, 1 / 50 transitions
liesel.goose.engine - INFO - Finished epoch
liesel.goose.engine - INFO - Starting epoch: SLOW_ADAPTATION, 100 transitions, 25 jitted together
liesel.goose.engine - WARNING - Errors per chain for kernel_00: 1, 2, 2, 1 / 100 transitions
liesel.goose.engine - INFO - Finished epoch
liesel.goose.engine - INFO - Starting epoch: SLOW_ADAPTATION, 200 transitions, 25 jitted together
liesel.goose.engine - WARNING - Errors per chain for kernel_00: 1, 4, 1, 1 / 200 transitions
liesel.goose.engine - INFO - Finished epoch
liesel.goose.engine - INFO - Starting epoch: SLOW_ADAPTATION, 500 transitions, 25 jitted together
liesel.goose.engine - WARNING - Errors per chain for kernel_00: 2, 1, 1, 2 / 500 transitions
liesel.goose.engine - INFO - Finished epoch
liesel.goose.engine - INFO - Starting epoch: FAST_ADAPTATION, 50 transitions, 25 jitted together
liesel.goose.engine - WARNING - Errors per chain for kernel_00: 1, 1, 2, 2 / 50 transitions
liesel.goose.engine - INFO - Finished epoch
liesel.goose.engine - INFO - Finished warmup
liesel.goose.engine - INFO - Starting epoch: POSTERIOR, 1000 transitions, 25 jitted together
liesel.goose.engine - INFO - Finished epoch
Finally, we can print a summary table and view some diagnostic plots.
results = engine.get_results()
gs.Summary(results)
Parameter summary:

| parameter | index | kernel | mean | sd | q_0.05 | q_0.5 | q_0.95 | sample_size | ess_bulk | ess_tail | rhat |
|---|---|---|---|---|---|---|---|---|---|---|---|
| loc | () | kernel_00 | -0.083 | 0.445 | -0.810 | -0.091 | 0.652 | 4000 | 1459.234 | 1874.643 | 1.002 |

Error summary:

| kernel | error_code | error_msg | phase | count | relative |
|---|---|---|---|---|---|
| kernel_00 | 1 | divergent transition | warmup | 38 | 0.009 |
| kernel_00 | 1 | divergent transition | posterior | 0 | 0.000 |
gs.plot_param(results, param="loc")
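If you prefer to post-process the draws yourself, the results object can also return the raw posterior samples. A minimal sketch, assuming a get_posterior_samples() method that returns one array per parameter:

samples = results.get_posterior_samples()
samples["loc"].shape   # (num_chains, num_draws), e.g. (4, 1000) here
samples["loc"].mean()  # posterior mean of loc across all chains and draws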
Liesel requires Python ≥ 3.10. Create and activate a virtual environment, and install the latest release from PyPI:
pip install liesel
You can also install the development version from GitHub:
git clone https://github.com/liesel-devs/liesel.git
cd liesel
pip install .
# or `pip install -e .[dev]` for an editable install including the dev utils
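To check that the installation worked, you can import the package and print its version. This assumes the package exposes a __version__ attribute, which is a common convention:

import liesel
print(liesel.__version__)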
Liesel depends on JAX and jaxlib. As of now, there are no official jaxlib wheels for Windows. If you are on Windows, the JAX developers recommend using the Windows Subsystem for Linux. Alternatively, you can build jaxlib from source or try the unofficial jaxlib wheels from https://github.com/cloudhan/jax-windows-builder.
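Once JAX and jaxlib are installed, a quick sanity check is to list the devices JAX can see; jax.devices() is a standard JAX call and works on any backend:

import jax
print(jax.devices())  # e.g. a single CPU device on a CPU-only installation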
If you are using the lsl.plot_model() function, installing pygraphviz will greatly improve the layout of the model graphs. Make sure you have the Graphviz development headers on your system and run:
pip install pygraphviz
Again, the installation is a bit more challenging on Windows, but there are instructions on the pygraphviz website.
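With pygraphviz in place, plotting the graph of the model from the example above is a one-liner with the lsl.plot_model() function mentioned before (assuming it takes the model object as its argument):

import liesel.model as lsl
lsl.plot_model(model)  # draws the directed graph of loc, scale and y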
Before committing your work, please run pre-commit run -a, pytest --run-mcmc, and pytest --doctest-modules liesel.

Liesel is being developed by Paul Wiemann, Hannes Riebl, Johannes Brachem and Gianmarco Callegher with support from Thomas Kneib. Important contributions were made by Joel Beck and Alex Afanasev. We are grateful to the German Research Foundation (DFG) for funding the development through grant KN 922/11-1.