# sbi: Simulation-Based Inference
Getting Started | Documentation | Discord Server
sbi is a Python package for simulation-based inference, designed to meet the needs of
both researchers and practitioners. Whether you need fine-grained control or an
easy-to-use interface, sbi has you covered.
With sbi, you can perform Bayesian parameter inference: given a simulator that models
a real-world process, sbi estimates the full posterior distribution over the
simulator's parameters based on observed data. This distribution indicates the most
likely parameter values while quantifying uncertainty and revealing potential
interactions between parameters.
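To build intuition for what this means, here is a minimal sketch of the idea using classical rejection ABC rather than sbi's neural methods: draw parameters from the prior, run the simulator, and keep the parameters whose simulations resemble the observed data. The simulator, prior bounds, and acceptance tolerance below are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy simulator: ten noisy measurements of an unknown mean theta.
def simulator(theta, rng):
    return theta + rng.normal(0.0, 0.5, size=10)

x_obs = simulator(1.5, rng)  # pretend these are the observed data

# Rejection ABC: sample from the prior, simulate (vectorized version of the
# simulator above), and accept parameters whose simulations land close to x_obs.
prior_samples = rng.uniform(-5.0, 5.0, size=100_000)
sims = prior_samples[:, None] + rng.normal(0.0, 0.5, size=(100_000, 10))
distances = np.abs(sims.mean(axis=1) - x_obs.mean())
posterior_samples = prior_samples[distances < 0.05]

# The accepted samples approximate the posterior over theta.
print(posterior_samples.mean(), posterior_samples.std())
```

Neural methods in sbi replace this wasteful accept/reject step with trained conditional density, likelihood, or ratio estimators, which is what makes inference tractable for expensive simulators.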
## Key Features of sbi
sbi offers a blend of flexibility and ease of use:
- Low-Level Interfaces: For those who require maximum control over the inference
  process, sbi provides low-level interfaces that allow you to fine-tune many aspects
  of your workflow.
- High-Level Interfaces: If you prefer simplicity and efficiency, sbi also offers
  high-level interfaces that enable quick and easy implementation of complex inference
  tasks.
In addition, sbi supports a wide range of state-of-the-art inference algorithms (see
below for a list of implemented methods):
- Amortized Methods: These methods enable the reuse of posterior estimators across
  multiple observations without the need to retrain.
- Sequential Methods: These methods focus on individual observations, optimizing the
  number of simulations required.
Beyond inference, sbi also provides:
- Validation Tools: Built-in methods to validate and verify the accuracy of your
  inferred posteriors.
- Plotting and Analysis Tools: Comprehensive functions for visualizing and analyzing
  results, helping you interpret posterior distributions with ease.
Getting started with sbi is straightforward, requiring only a few lines of code:
```python
from sbi.inference import NPE

# Given: parameters theta and corresponding simulations x
inference = NPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()
```

## Installation
sbi requires Python 3.10 or higher. While a GPU isn't necessary, it can improve
performance in some cases. We recommend using a virtual environment with
conda for an easy setup.
If conda is installed on the system, an environment for installing sbi can be created as follows:
```
conda create -n sbi_env python=3.10 && conda activate sbi_env
```

### From PyPI
To install sbi from PyPI run
```
python -m pip install sbi
```

### From conda-forge
To install and add sbi to a project with pixi, from the project directory run
```
pixi add sbi
```

and to install into a particular conda environment with conda, in the activated
environment run
```
conda install --channel conda-forge sbi
```

If uv is installed on the system, an environment for installing sbi can be created as
follows:
```
uv venv -p 3.10
```

Then activate the virtual environment by running:
- For macOS or Linux users:

  ```
  source .venv/bin/activate
  ```

- For Windows users:

  ```
  .venv\Scripts\activate
  ```
To install sbi run
```
uv add sbi
```

### Testing the installation
Open a Python prompt and run
```python
from sbi.examples.minimal import simple

posterior = simple()
print(posterior)
```

## Tutorials
If you're new to sbi, we recommend starting with our Getting
Started tutorial.
You can also access and run these tutorials directly in your browser by opening a
GitHub Codespace. To do so, click the green “Code” button on the GitHub repository and
select “Open with Codespaces.” This provides a fully functional environment where you
can explore sbi through Jupyter notebooks.
You might also find this tutorial paper useful: Deistler, M., Boelts, J., Steinbach, P., Moss, G., Moreau, T., Gloeckler, M., ... & Macke, J. H. (2025). Simulation-based inference: A practical guide. arXiv preprint arXiv:2508.12939. It describes the SBI workflow and offers practical guidelines and diagnostic tools for every stage of the process: from setting up the simulator and prior, through choosing and training inference networks, to performing inference and validating the results. It also includes several worked examples.
## Inference Algorithms
The following inference algorithms are currently available. You can find instructions on
how to run each of these methods
here.
### Neural Posterior Estimation: amortized (NPE) and sequential (SNPE)
- (S)NPE_A (including amortized single-round NPE) from Papamakarios G and Murray I,
  Fast ε-free Inference of Simulation Models with Bayesian Conditional Density
  Estimation (NeurIPS 2016).
- (S)NPE_B from Lueckmann JM, Goncalves P, Bassetto G, Öcal K, Nonnenmacher M, and
  Macke J, Flexible statistical inference for mechanistic models of neural dynamics
  (NeurIPS 2017).
- (S)NPE_C or APT from Greenberg D, Nonnenmacher M, and Macke J, Automatic Posterior
  Transformation for likelihood-free inference (ICML 2019).
- TSNPE from Deistler M, Goncalves P, and Macke J, Truncated proposals for scalable
  and hassle-free simulation-based inference (NeurIPS 2022).
- FMPE from Wildberger J, Dax M, Buchholz S, Green S, Macke JH, and Schölkopf B,
  Flow matching for scalable simulation-based inference (NeurIPS 2023).
- NPSE from Geffner T, Papamakarios G, and Mnih A, Compositional score modeling for
  simulation-based inference (ICML 2023).
### Neural Likelihood Estimation: amortized (NLE) and sequential (SNLE)
- (S)NLE or just SNL from Papamakarios G, Sterratt DC, and Murray I, Sequential Neural
  Likelihood (AISTATS 2019).
### Neural Ratio Estimation: amortized (NRE) and sequential (SNRE)
- (S)NRE_A or AALR from Hermans J, Begy V, and Louppe G, Likelihood-free Inference
  with Amortized Approximate Likelihood Ratios (ICML 2020).
- (S)NRE_B or SRE from Durkan C, Murray I, and Papamakarios G, On Contrastive
  Learning for Likelihood-free Inference (ICML 2020).
- (S)NRE_C or NRE-C from Miller BK, Weniger C, and Forré P, Contrastive Neural Ratio
  Estimation (NeurIPS 2022).
- BNRE from Delaunoy A, Hermans J, Rozet F, Wehenkel A, and Louppe G, Towards
  Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation
  (NeurIPS 2022).
### Neural Variational Inference, amortized (NVI) and sequential (SNVI)
- SNVI from Glöckler M, Deistler M, and Macke J, Variational methods for
  simulation-based inference (ICLR 2022).
### Mixed Neural Likelihood Estimation (MNLE)
- MNLE from Boelts J, Lueckmann JM, Gao R, and Macke J, Flexible and efficient
  simulation-based inference for models of decision-making (eLife 2022).
## Feedback and Contributions
We welcome any feedback on how sbi is working for your inference problems (see
Discussions) and are happy to receive bug
reports, pull requests, and other feedback (see
contribute). We wish to maintain a
positive and respectful community; please read our Code of
Conduct.
## Acknowledgments
sbi is the successor (using PyTorch) of the
delfi package. It started as a fork of Conor M.
Durkan's lfi. sbi runs as a community project. See also
credits.
## Support
sbi has been supported by the German Federal Ministry of Education and Research (BMBF)
through project ADIMEM (FKZ 01IS18052 A-D), project SiMaLeSAM (FKZ 01IS21055A) and the
Tübingen AI Center (FKZ 01IS18039A). Since 2024, sbi is supported by the appliedAI
Institute for Europe, and by NumFOCUS.
## License
Apache License Version 2.0 (Apache-2.0)
## Citation
The sbi package has grown and improved significantly since its initial release, with
contributions from a large and diverse community. To reflect these developments and the
expanded functionality, we published an updated JOSS
paper. We encourage you to cite this
newer version as the primary reference:
```bibtex
@article{BoeltsDeistler_sbi_2025,
  doi = {10.21105/joss.07754},
  url = {https://doi.org/10.21105/joss.07754},
  year = {2025},
  publisher = {The Open Journal},
  volume = {10},
  number = {108},
  pages = {7754},
  author = {Jan Boelts and Michael Deistler and Manuel Gloeckler and Álvaro Tejero-Cantero and Jan-Matthis Lueckmann and Guy Moss and Peter Steinbach and Thomas Moreau and Fabio Muratore and Julia Linhart and Conor Durkan and Julius Vetter and Benjamin Kurt Miller and Maternus Herold and Abolfazl Ziaeemehr and Matthijs Pals and Theo Gruner and Sebastian Bischoff and Nastya Krouglova and Richard Gao and Janne K. Lappalainen and Bálint Mucsányi and Felix Pei and Auguste Schulz and Zinovia Stefanidi and Pedro Rodrigues and Cornelius Schröder and Faried Abu Zaid and Jonas Beck and Jaivardhan Kapoor and David S. Greenberg and Pedro J. Gonçalves and Jakob H. Macke},
  title = {sbi reloaded: a toolkit for simulation-based inference workflows},
  journal = {Journal of Open Source Software}
}
```

This updated paper, with its expanded author list, reflects the broader community
contributions and the package's enhanced capabilities in releases 0.23.0 and later.
If you are using a version of sbi prior to 0.23.0, please cite the original sbi
software paper:
```bibtex
@article{tejero-cantero2020sbi,
  doi = {10.21105/joss.02505},
  url = {https://doi.org/10.21105/joss.02505},
  year = {2020},
  publisher = {The Open Journal},
  volume = {5},
  number = {52},
  pages = {2505},
  author = {Alvaro Tejero-Cantero and Jan Boelts and Michael Deistler and Jan-Matthis Lueckmann and Conor Durkan and Pedro J. Gonçalves and David S. Greenberg and Jakob H. Macke},
  title = {sbi: A toolkit for simulation-based inference},
  journal = {Journal of Open Source Software}
}
```

Regardless of which software paper you cite, please also remember to cite the original
research articles describing the specific sbi-algorithm(s) you are using.
Specific releases of sbi are also citable via
Zenodo, where we generate a new software DOI for
each release.