Merged

25 commits
- `8a77ffa` DOC improve organisation (tomMoral, Apr 22, 2024)
- `14dbf9c` FIX pre-commit hooks (tomMoral, Apr 22, 2024)
- `89b45a0` add index for tutorials (tomMoral, Apr 22, 2024)
- `8818d70` refactor new landing page. (janfb, May 7, 2024)
- `867355a` update docs dependencies (janfb, May 7, 2024)
- `b20136f` wip: replace nav-bar dropdown with index. (janfb, Jun 3, 2024)
- `5891f82` shorter snippet, add publications to main page. (janfb, Jun 11, 2024)
- `4552d15` show faq as ordered list, format install.md. (janfb, Jun 11, 2024)
- `b19e4d2` join tutorial instructions from README and tutorials/index. (janfb, Jun 11, 2024)
- `d388422` shorten snippet, remove navigation bar dropdown details (janfb, Jun 11, 2024)
- `ca32ce1` fix snippet (janfb, Jun 11, 2024)
- `8d93d5e` fix: links and md headings. (janfb, Jun 20, 2024)
- `c7480c1` FIX small changes in index.md (tomMoral, Jul 30, 2024)
- `e5d7209` DOC expose doc on PR (tomMoral, Jul 30, 2024)
- `2f63976` FIX linter (tomMoral, Jul 30, 2024)
- `d2a7b28` FIX workflow target (tomMoral, Jul 30, 2024)
- `d1c555d` FIX check doc workflow (tomMoral, Jul 30, 2024)
- `2f30d43` Merge branch 'main' into DOC_improve_doc (janfb, Aug 6, 2024)
- `b732f6c` refactor: improve landing page and credits; update methods (janfb, Aug 6, 2024)
- `7f22346` docs: change gh action to convert nbs and deploy docs upon release (janfb, Aug 7, 2024)
- `1a17846` CLN remove mkdocs-jupyter pluggin (tomMoral, Aug 7, 2024)
- `1393d8a` DOC remove mkdocs-jupyter+add doc version control (tomMoral, Aug 7, 2024)
- `d4675c6` MTN update .gitignore (tomMoral, Aug 7, 2024)
- `9af6295` fix: griffe warnings about .md links; refactoring text. (janfb, Aug 8, 2024)
- `988afa4` fix: configure gh user in action for pushing to gh-pages (janfb, Aug 8, 2024)
7 changes: 0 additions & 7 deletions docs/docs/contribute.md
@@ -216,10 +216,3 @@ mkdocs serve
and open a browser on the page proposed by `mkdocs`. Now, whenever you
make changes to the markdown files of the documentation, you can see the results
almost immediately in the browser.

Note that the tutorials and examples are initially written in jupyter notebooks
and then converted to markdown programatically. To do so locally, you should run
```
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorial/
jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
```
1 change: 1 addition & 0 deletions docs/docs/examples
146 changes: 72 additions & 74 deletions docs/docs/index.md
@@ -1,39 +1,53 @@
-# `sbi`: simulation-based inference
+# `sbi`: simulation-based inference toolkit

-`sbi`: A Python toolbox for simulation-based inference.
+`sbi` provides access to simulation-based inference methods via a user-friendly
+interface:

-![using sbi](static/infer_demo.gif)
-
-Inference can be run in a single line of code
-
-```python
-posterior = infer(simulator, prior, method='SNPE', num_simulations=1000)
-```
+```python
+# simulation
+theta = prior.sample((1000,))
+x = simulator(theta)
+
+# training
+inference = SNPE(prior).append_simulations(theta, x)
+inference.train()
+
+# inference
+posterior = inference.build_posterior()
+posterior_samples = posterior.sample((1000,), x=x_o)
+```

-or in a few lines for more flexibility:
+## Overview

-```python
-inference = SNPE(prior=prior)
-_ = inference.append_simulations(theta, x).train()
-posterior = inference.build_posterior()
-```
+**To get started, install the `sbi` package with:**

-`sbi` lets you choose from a variety of _amortized_ and _sequential_ SBI methods:
+```commandline
+pip install sbi
+```

-Amortized methods return a posterior that can be applied to many different observations without retraining,
-whereas sequential methods focus the inference on one particular observation to be more simulation-efficient.
-For an overview of implemented methods see below, or checkout or [GitHub page](https://github.com/mackelab/sbi).
+For more advanced install options, see our [Install Guide](install.md).

-## Overview
+Then, check out our material:

-- To learn about the general motivation behind simulation-based inference, and the
-inference methods included in `sbi`, read on below.
+<div class="grid cards" markdown>

-- For example applications to canonical problems in neuroscience, browse the recent
-research article [Training deep neural density estimators to identify mechanistic models of neural dynamics](https://doi.org/10.7554/eLife.56261).
+- :dart: [__Motivation and approach__](#motivation-and-approach)
+<br/><br/>
+*General motivation for the SBI framework and methods included in `sbi`.*

-- If you want to get started using `sbi` on your own problem, jump to
-[installation](install.md) and then check out the [tutorial](tutorial/00_getting_started.md).
+- :rocket: [__Tutorials__](tutorials/)
+<br/><br/>
+*Various examples illustrating how to use the `sbi` package.*
+
+- :building_construction: [__Reference API__](reference/)
+<br/><br/>
+*The detailed description of the package classes and functions.*
+
+- :book: [__Citation__](citation.md)
+<br/><br/>
+*How to cite the `sbi` package.*
+
+</div>

## Motivation and approach

@@ -42,9 +56,9 @@ numerical simulations to describe the structure and dynamics of the processes being
investigated.

A key challenge in simulation-based science is constraining these simulation models'
-parameters, which are intepretable quantities, with observational data. Bayesian
+parameters, which are interpretable quantities, with observational data. Bayesian
inference provides a general and powerful framework to invert the simulators, i.e.
-describe the parameters which are consistent both with empirical data and prior
+describe the parameters that are consistent both with empirical data and prior
knowledge.

In the case of simulators, a key quantity required for statistical inference, the
@@ -63,71 +77,55 @@ parameter space and the observation space, one of the methods will be more suitable

![](./static/goal.png)

-**Goal: Algorithmically identify mechanistic models which are consistent with data.**
+**Goal: Algorithmically identify mechanistic models that are consistent with data.**

-Each of the methods above needs three inputs: A candidate mechanistic model, prior
-knowledge or constraints on model parameters, and observational data (or summary statistics
-thereof).
+Each of the methods above needs three inputs: A candidate mechanistic model,
+prior knowledge or constraints on model parameters, and observational data (or
+summary statistics thereof).

The methods then proceed by

1. sampling parameters from the prior followed by simulating synthetic data from
these parameters,
-2. learning the (probabilistic) association between data (or
-data features) and underlying parameters, i.e. to learn statistical inference from
-simulated data. The way in which this association is learned differs between the
-above methods, but all use deep neural networks.
-3. This learned neural network is then applied to empirical data to derive the full
-space of parameters consistent with the data and the prior, i.e. the posterior
-distribution. High posterior probability is assigned to parameters which are
-consistent with both the data and the prior, low probability to inconsistent
-parameters. While SNPE directly learns the posterior distribution, SNLE and SNRE need
-an extra MCMC sampling step to construct a posterior.
-4. If needed, an initial estimate of the posterior can be used to adaptively generate
-additional informative simulations.
-
-## Publications
+2. learning the (probabilistic) association between data (or data features) and
+underlying parameters, i.e. to learn statistical inference from simulated
+data. How this association is learned differs between the above methods, but
+all use deep neural networks.
+3. This learned neural network is then applied to empirical data to derive the
+full space of parameters consistent with the data and the prior, i.e. the
+posterior distribution. The posterior assigns high probability to parameters
+that are consistent with both the data and the prior, and low probability to
+inconsistent parameters. While SNPE directly learns the posterior
+distribution, SNLE and SNRE need an extra MCMC sampling step to construct a
+posterior.
+4. If needed, an initial estimate of the posterior can be used to adaptively
+generate additional informative simulations.

See [Cranmer, Brehmer, Louppe (2020)](https://doi.org/10.1073/pnas.1912789117) for a recent
review on simulation-based inference.

-The following papers offer additional details on the inference methods implemented in `sbi`.
-You can find a tutorial on how to run each of these methods [here](https://sbi-dev.github.io/sbi/tutorial/16_implemented_methods/).
-
-### Posterior estimation (`(S)NPE`)
-
-- **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex)
-
-- **Flexible statistical inference for mechanistic models of neural dynamics** <br> by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS 2017) <br>[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex)
-
-- **Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A)
+## Getting started with the `sbi` package

-- **Truncated proposals for scalable and hassle-free simulation-based inference** <br> by Deistler, Goncalves & Macke (NeurIPS 2022) <br>[[Paper]](https://arxiv.org/abs/2210.04815)
+Once `sbi` is installed, inference can be run in a single line of code

+```python
+posterior = infer(simulator, prior, method='SNPE', num_simulations=1000)
+```

-### Likelihood-estimation (`(S)NLE`)
-
-- **Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**<br>by Papamakarios, Sterratt & Murray (AISTATS 2019) <br>[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib)
-
-- **Variational methods for simulation-based inference** <br> by Glöckler, Deistler, Macke (ICLR 2022) <br>[[Paper]](https://arxiv.org/abs/2203.04176)
-
-- **Flexible and efficient simulation-based inference for models of decision-making** <br> by Boelts, Lueckmann, Gao, Macke (Elife 2022) <br>[[Paper]](https://elifesciences.org/articles/77220)
-
-### Likelihood-ratio-estimation (`(S)NRE`)
-
-- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**<br>by Hermans, Begy & Louppe (ICML 2020) <br>[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf)
-
-- **On Contrastive Learning for Likelihood-free Inference**<br>Durkan, Murray & Papamakarios (ICML 2020) <br>[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf)
-
-- **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation**<br>by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022) <br>[[PDF]](https://arxiv.org/pdf/2208.13624.pdf)
+or in a few lines for more flexibility:

-- **Contrastive Neural Ratio Estimation**<br>Benjamin Kurt Miller, Christoph Weniger, Patrick Forré (NeurIPS 2022) <br>[[PDF]](https://arxiv.org/pdf/2210.06170.pdf)
+```python
+inference = SNPE(prior=prior)
+_ = inference.append_simulations(theta, x).train()
+posterior = inference.build_posterior()
+```

-### Utilities
+`sbi` lets you choose from a variety of _amortized_ and _sequential_ SBI methods:

-- **Restriction estimator**<br>by Deistler, Macke & Goncalves (PNAS 2022) <br>[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119)
+Amortized methods return a posterior that can be applied to many different
+observations without retraining, whereas sequential methods focus the inference
+on one particular observation to be more simulation-efficient.

-- **Simulation-based calibration**<br>by Talts, Betancourt, Simpson, Vehtari, Gelman (arxiv 2018) <br>[[Paper]](https://arxiv.org/abs/1804.06788))
-
-- **Expected coverage (sample-based)**<br>as computed in Deistler, Goncalves, Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe [[Paper]](https://matheo.uliege.be/handle/2268.2/12993)
+For an overview of implemented methods see [the Inference API's reference](reference/inference/),
+or check out our [GitHub page](https://github.com/mackelab/sbi).
42 changes: 42 additions & 0 deletions docs/docs/publications.md
@@ -0,0 +1,42 @@
# Publications

This page lists papers that provide additional details on the inference methods implemented in `sbi`.
You can find a tutorial on how to run each of these methods [here](../tutorial/16_implemented_methods/).

## Posterior estimation (`(S)NPE`)

- **Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation**<br> by Papamakarios & Murray (NeurIPS 2016) <br>[[PDF]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation.pdf) [[BibTeX]](https://papers.nips.cc/paper/6084-fast-free-inference-of-simulation-models-with-bayesian-conditional-density-estimation/bibtex)

- **Flexible statistical inference for mechanistic models of neural dynamics** <br> by Lueckmann, Goncalves, Bassetto, Öcal, Nonnenmacher & Macke (NeurIPS 2017) <br>[[PDF]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics.pdf) [[BibTeX]](https://papers.nips.cc/paper/6728-flexible-statistical-inference-for-mechanistic-models-of-neural-dynamics/bibtex)

- **Automatic posterior transformation for likelihood-free inference**<br>by Greenberg, Nonnenmacher & Macke (ICML 2019) <br>[[PDF]](http://proceedings.mlr.press/v97/greenberg19a/greenberg19a.pdf) [[BibTeX]](data:text/plain;charset=utf-8,%0A%0A%0A%0A%0A%0A%40InProceedings%7Bpmlr-v97-greenberg19a%2C%0A%20%20title%20%3D%20%09%20%7BAutomatic%20Posterior%20Transformation%20for%20Likelihood-Free%20Inference%7D%2C%0A%20%20author%20%3D%20%09%20%7BGreenberg%2C%20David%20and%20Nonnenmacher%2C%20Marcel%20and%20Macke%2C%20Jakob%7D%2C%0A%20%20booktitle%20%3D%20%09%20%7BProceedings%20of%20the%2036th%20International%20Conference%20on%20Machine%20Learning%7D%2C%0A%20%20pages%20%3D%20%09%20%7B2404--2414%7D%2C%0A%20%20year%20%3D%20%09%20%7B2019%7D%2C%0A%20%20editor%20%3D%20%09%20%7BChaudhuri%2C%20Kamalika%20and%20Salakhutdinov%2C%20Ruslan%7D%2C%0A%20%20volume%20%3D%20%09%20%7B97%7D%2C%0A%20%20series%20%3D%20%09%20%7BProceedings%20of%20Machine%20Learning%20Research%7D%2C%0A%20%20address%20%3D%20%09%20%7BLong%20Beach%2C%20California%2C%20USA%7D%2C%0A%20%20month%20%3D%20%09%20%7B09--15%20Jun%7D%2C%0A%20%20publisher%20%3D%20%09%20%7BPMLR%7D%2C%0A%20%20pdf%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a%2Fgreenberg19a.pdf%7D%2C%0A%20%20url%20%3D%20%09%20%7Bhttp%3A%2F%2Fproceedings.mlr.press%2Fv97%2Fgreenberg19a.html%7D%2C%0A%20%20abstract%20%3D%20%09%20%7BHow%20can%20one%20perform%20Bayesian%20inference%20on%20stochastic%20simulators%20with%20intractable%20likelihoods%3F%20A%20recent%20approach%20is%20to%20learn%20the%20posterior%20from%20adaptively%20proposed%20simulations%20using%20neural%20network-based%20conditional%20density%20estimators.%20However%2C%20existing%20methods%20are%20limited%20to%20a%20narrow%20range%20of%20proposal%20distributions%20or%20require%20importance%20weighting%20that%20can%20limit%20performance%20in%20practice.%20Here%20we%20present%20automatic%20posterior%20transformation%20(APT)%2C%20a%20new%20sequential%20neural%20posterior%20estimation%20method%20for%20simulation-based%20inference.%20APT%20can%20modify%20the%20posterior%20estimate%20using%20arbitrary%2C%20dynamically%20updated%20proposals%2C%20and%20is%20compatible%20with%20powerful%20flow-based%20density%20estimators.%20It%20is%20more%20flexible%2C%20scalable%20and%20efficient%20than%20previous%20simulation-based%20inference%20techniques.%20APT%20can%20operate%20directly%20on%20high-dimensional%20time%20series%20and%20image%20data%2C%20opening%20up%20new%20applications%20for%20likelihood-free%20inference.%7D%0A%7D%0A)

- **Truncated proposals for scalable and hassle-free simulation-based inference** <br> by Deistler, Goncalves & Macke (NeurIPS 2022) <br>[[Paper]](https://arxiv.org/abs/2210.04815)


## Likelihood-estimation (`(S)NLE`)

- **Sequential neural likelihood: Fast likelihood-free inference with autoregressive flows**<br>by Papamakarios, Sterratt & Murray (AISTATS 2019) <br>[[PDF]](http://proceedings.mlr.press/v89/papamakarios19a/papamakarios19a.pdf) [[BibTeX]](https://gpapamak.github.io/bibtex/snl.bib)

- **Variational methods for simulation-based inference** <br> by Glöckler, Deistler, Macke (ICLR 2022) <br>[[Paper]](https://arxiv.org/abs/2203.04176)

- **Flexible and efficient simulation-based inference for models of decision-making** <br> by Boelts, Lueckmann, Gao, Macke (Elife 2022) <br>[[Paper]](https://elifesciences.org/articles/77220)


## Likelihood-ratio-estimation (`(S)NRE`)

- **Likelihood-free MCMC with Amortized Approximate Likelihood Ratios**<br>by Hermans, Begy & Louppe (ICML 2020) <br>[[PDF]](http://proceedings.mlr.press/v119/hermans20a/hermans20a.pdf)

- **On Contrastive Learning for Likelihood-free Inference**<br>Durkan, Murray & Papamakarios (ICML 2020) <br>[[PDF]](http://proceedings.mlr.press/v119/durkan20a/durkan20a.pdf)

- **Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation**<br>by Delaunoy, Hermans, Rozet, Wehenkel & Louppe (NeurIPS 2022) <br>[[PDF]](https://arxiv.org/pdf/2208.13624.pdf)

- **Contrastive Neural Ratio Estimation**<br>Benjamin Kurt Miller, Christoph Weniger, Patrick Forré (NeurIPS 2022) <br>[[PDF]](https://arxiv.org/pdf/2210.06170.pdf)

## Utilities

- **Restriction estimator**<br>by Deistler, Macke & Goncalves (PNAS 2022) <br>[[Paper]](https://www.pnas.org/doi/10.1073/pnas.2207632119)

- **Simulation-based calibration**<br>by Talts, Betancourt, Simpson, Vehtari, Gelman (arxiv 2018) <br>[[Paper]](https://arxiv.org/abs/1804.06788)

- **Expected coverage (sample-based)**<br>as computed in Deistler, Goncalves, Macke [[Paper]](https://arxiv.org/abs/2210.04815) and in Rozet, Louppe [[Paper]](https://matheo.uliege.be/handle/2268.2/12993)
21 changes: 21 additions & 0 deletions docs/docs/reference/index.md
@@ -0,0 +1,21 @@
# API Reference

<div class="grid cards" markdown>

- [Inference](inference.md)
<br/>
*XXX*
- [Neural Networks](models.md)
<br/>
*Models to perform posterior approximation and signal embeddings.*
- [Posteriors](posteriors.md)
<br/>
*XXX*
- [Potentials](potentials.md)
<br/>
*XXX*
- [Analysis](analysis.md)
<br/>
*XXX*

</div>
1 change: 1 addition & 0 deletions docs/docs/tutorials