
Commit d1bf8cf

docs: refactor readme.
1 parent 5584f13 commit d1bf8cf

File tree: 1 file changed (+121, −49 lines)

README.md

Lines changed: 121 additions & 49 deletions
@@ -5,48 +5,80 @@
 [![GitHub license](https://img.shields.io/github/license/sbi-dev/sbi)](https://github.com/sbi-dev/sbi/blob/master/LICENSE.txt)
 [![DOI](https://joss.theoj.org/papers/10.21105/joss.02505/status.svg)](https://doi.org/10.21105/joss.02505)
 
-## sbi: simulation-based inference
+## `sbi`: Simulation-Based Inference
 
-[Getting Started](https://sbi-dev.github.io/sbi/latest/tutorial/00_getting_started/) | [Documentation](https://sbi-dev.github.io/sbi/)
+[Getting Started](https://sbi-dev.github.io/sbi/latest/tutorial/00_getting_started/) |
+[Documentation](https://sbi-dev.github.io/sbi/)
 
-`sbi` is a PyTorch package for simulation-based inference. Simulation-based inference is
-the process of finding parameters of a simulator from observations.
+`sbi` is a Python package for simulation-based inference, designed to meet the needs of
+both researchers and practitioners. Whether you need fine-grained control or an
+easy-to-use interface, `sbi` has you covered.
 
-`sbi` takes a Bayesian approach and returns a full posterior distribution over the
-parameters of the simulator, conditional on the observations. The package implements a
-variety of inference algorithms, including _amortized_ and _sequential_ methods.
-Amortized methods return a posterior that can be applied to many different observations
-without retraining; sequential methods focus the inference on one particular observation
-to be more simulation-efficient. See below for an overview of implemented methods.
+With `sbi`, you can perform simulation-based inference (SBI) using a Bayesian approach:
+given a simulator that models a real-world process, SBI estimates the full posterior
+distribution over the simulator's parameters based on observed data. This distribution
+indicates the most likely parameter values while also quantifying uncertainty and
+revealing potential interactions between parameters.
 
-`sbi` offers a simple interface for posterior inference in a few lines of code
+### Key Features of `sbi`
+
+`sbi` offers a blend of flexibility and ease of use:
+
+- **Low-Level Interfaces**: For those who require maximum control over the inference
+  process, `sbi` provides low-level interfaces that allow you to fine-tune many aspects
+  of your workflow.
+- **High-Level Interfaces**: If you prefer simplicity and efficiency, `sbi` also offers
+  high-level interfaces that enable quick and easy implementation of complex inference
+  tasks.
+
+In addition, `sbi` supports a wide range of state-of-the-art inference algorithms (see
+below for a list of implemented methods):
+
+- **Amortized Methods**: These methods enable the reuse of posterior estimators across
+  multiple observations without the need to retrain.
+- **Sequential Methods**: These methods focus on individual observations, optimizing the
+  number of simulations required.
+
+Beyond inference, `sbi` also provides:
+
+- **Validation Tools**: Built-in methods to validate and verify the accuracy of your
+  inferred posteriors.
+- **Plotting and Analysis Tools**: Comprehensive functions for visualizing and analyzing
+  results, helping you interpret posterior distributions with ease.
+
+Getting started with `sbi` is straightforward, requiring only a few lines of code:
 
 ```python
 from sbi.inference import SNPE
-# import your simulator, define your prior over the parameters
-# sample parameters theta and observations x
+# Given: parameters theta and corresponding simulations x
 inference = SNPE(prior=prior)
-_ = inference.append_simulations(theta, x).train()
+inference.append_simulations(theta, x).train()
 posterior = inference.build_posterior()
 ```
 
-## Installation
+### Installation
 
-`sbi` requires Python 3.8 or higher. A GPU is not required, but can lead to speed-up in some cases. We recommend to use a [`conda`](https://docs.conda.io/en/latest/miniconda.html) virtual
-environment ([Miniconda installation instructions](https://docs.conda.io/en/latest/miniconda.html)). If `conda` is installed on the system, an environment for installing `sbi` can be created as follows:
+`sbi` requires Python 3.9 or higher. While a GPU isn't necessary, it can improve
+performance in some cases. We recommend using a virtual environment with
+[`conda`](https://docs.conda.io/en/latest/miniconda.html) for an easy setup.
 
-```commandline
-# Create an environment for sbi (indicate Python 3.8 or higher); activate it
-$ conda create -n sbi_env python=3.10 && conda activate sbi_env
-```
+To install `sbi`, follow these steps:
 
-Independent of whether you are using `conda` or not, `sbi` can be installed using `pip`:
+1. **Create a Conda Environment** (if using Conda):
 
-```commandline
-pip install sbi
-```
+   ```bash
+   conda create -n sbi_env python=3.9 && conda activate sbi_env
+   ```
+
+2. **Install `sbi`**: Independent of whether you are using `conda` or not, `sbi` can be
+   installed using `pip`:
 
-To test the installation, drop into a python prompt and run
+   ```bash
+   pip install sbi
+   ```
+
+3. **Test the installation**:
+   Open a Python prompt and run
 
 ```python
 from sbi.examples.minimal import simple
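The workflow the refactored README describes (draw parameters from a prior, simulate, then recover a posterior over parameters given an observation) can be illustrated without `sbi` itself. Below is a plain-Python rejection-sampling sketch of the same idea; the simulator, the uniform prior range, and the tolerance `eps` are made up for illustration and are not part of `sbi`.

```python
import random
import statistics

def simulator(theta, rng):
    """Toy simulator: observation = parameter plus Gaussian noise."""
    return theta + rng.gauss(0.0, 0.5)

def rejection_abc(x_o, n_draws=50_000, eps=0.05, seed=0):
    """Draw theta from a uniform prior, simulate x, and keep only the
    draws whose simulation lands within eps of the observation x_o.
    The accepted draws approximate the posterior p(theta | x_o)."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-3.0, 3.0)  # prior sample
        x = simulator(theta, rng)
        if abs(x - x_o) < eps:
            accepted.append(theta)
    return accepted

posterior_samples = rejection_abc(x_o=1.0)
print(len(posterior_samples), statistics.mean(posterior_samples))
```

Tightening `eps` makes the accepted samples approximate the true posterior more closely but discards more simulations; that simulation-efficiency trade-off is what motivates the neural methods the README lists below.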
@@ -59,48 +91,83 @@ print(posterior)
 If you're new to `sbi`, we recommend starting with our [Getting
 Started](https://sbi-dev.github.io/sbi/latest/tutorial/00_getting_started/) tutorial.
 
-You can easily access and run these tutorials by opening a
-[Codespace](https://docs.github.com/en/codespaces/overview) on this repo. To do
-so, click on the green "Code" button and select "Open with Codespaces". This will
-provide you with a fully functional environment where you can run the tutorials as
-Jupyter notebooks.
+You can also access and run these tutorials directly in your browser by opening a
+[Codespace](https://docs.github.com/en/codespaces/overview). To do so, click the green
+"Code" button on the GitHub repository and select "Open with Codespaces". This provides
+a fully functional environment where you can explore `sbi` through Jupyter notebooks.
 
 ## Inference Algorithms
 
-The following inference algorithms are currently available. You can find instructions on how to run each of these methods [here](https://sbi-dev.github.io/sbi/latest/tutorial/16_implemented_methods/).
+The following inference algorithms are currently available. You can find instructions on
+how to run each of these methods
+[here](https://sbi-dev.github.io/sbi/latest/tutorial/16_implemented_methods/).
 
 ### Neural Posterior Estimation: amortized (NPE) and sequential (SNPE)
 
-* [`SNPE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snpe.snpe_a.SNPE_A) (including amortized single-round `NPE`) from Papamakarios G and Murray I [_Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation_](https://proceedings.neurips.cc/paper/2016/hash/6aca97005c68f1206823815f66102863-Abstract.html) (NeurIPS 2016).
+* [`SNPE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snpe.snpe_a.SNPE_A)
+  (including amortized single-round `NPE`) from Papamakarios G and Murray I [_Fast
+  ε-free Inference of Simulation Models with Bayesian Conditional Density
+  Estimation_](https://proceedings.neurips.cc/paper/2016/hash/6aca97005c68f1206823815f66102863-Abstract.html)
+  (NeurIPS 2016).
+
+* [`SNPE_C`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snpe.snpe_c.SNPE_C)
+  or `APT` from Greenberg D, Nonnenmacher M, and Macke J [_Automatic Posterior
+  Transformation for likelihood-free inference_](https://arxiv.org/abs/1905.07488) (ICML
+  2019).
+
+* `TSNPE` from Deistler M, Goncalves P, and Macke J [_Truncated proposals for scalable
+  and hassle-free simulation-based inference_](https://arxiv.org/abs/2210.04815)
+  (NeurIPS 2022).
 
-* [`SNPE_C`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snpe.snpe_c.SNPE_C) or `APT` from Greenberg D, Nonnenmacher M, and Macke J [_Automatic
-Posterior Transformation for likelihood-free
-inference_](https://arxiv.org/abs/1905.07488) (ICML 2019).
+* [`FMPE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.fmpe.fmpe_base.FMPE)
+  from Wildberger J, Dax M, Buchholz S, Green S, Macke JH, and Schölkopf B [_Flow
+  matching for scalable simulation-based
+  inference_](https://proceedings.neurips.cc/paper_files/paper/2023/hash/3663ae53ec078860bb0b9c6606e092a0-Abstract-Conference.html)
+  (NeurIPS 2023).
 
-* `TSNPE` from Deistler M, Goncalves P, and Macke J [_Truncated proposals for scalable and hassle-free simulation-based inference_](https://arxiv.org/abs/2210.04815) (NeurIPS 2022).
+* [`NPSE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.npse.npse.NPSE) from
+  Geffner T, Papamakarios G, and Mnih A [_Compositional score modeling for
+  simulation-based inference_](https://proceedings.mlr.press/v202/geffner23a.html)
+  (ICML 2023).
 
 ### Neural Likelihood Estimation: amortized (NLE) and sequential (SNLE)
 
-* [`SNLE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snle.snle_a.SNLE_A) or just `SNL` from Papamakarios G, Sterrat DC and Murray I [_Sequential
-Neural Likelihood_](https://arxiv.org/abs/1805.07226) (AISTATS 2019).
+* [`SNLE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snle.snle_a.SNLE_A)
+  or just `SNL` from Papamakarios G, Sterratt DC, and Murray I [_Sequential Neural
+  Likelihood_](https://arxiv.org/abs/1805.07226) (AISTATS 2019).
 
 ### Neural Ratio Estimation: amortized (NRE) and sequential (SNRE)
 
-* [`(S)NRE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_a.SNRE_A) or `AALR` from Hermans J, Begy V, and Louppe G. [_Likelihood-free Inference with Amortized Approximate Likelihood Ratios_](https://arxiv.org/abs/1903.04057) (ICML 2020).
+* [`(S)NRE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_a.SNRE_A)
+  or `AALR` from Hermans J, Begy V, and Louppe G [_Likelihood-free Inference with
+  Amortized Approximate Likelihood Ratios_](https://arxiv.org/abs/1903.04057) (ICML
+  2020).
 
-* [`(S)NRE_B`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_b.SNRE_B) or `SRE` from Durkan C, Murray I, and Papamakarios G. [_On Contrastive Learning for Likelihood-free Inference_](https://arxiv.org/abs/2002.03712) (ICML 2020).
+* [`(S)NRE_B`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_b.SNRE_B)
+  or `SRE` from Durkan C, Murray I, and Papamakarios G [_On Contrastive Learning for
+  Likelihood-free Inference_](https://arxiv.org/abs/2002.03712) (ICML 2020).
 
-* [`BNRE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.bnre.BNRE) from Delaunoy A, Hermans J, Rozet F, Wehenkel A, and Louppe G. [_Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation_](https://arxiv.org/abs/2208.13624) (NeurIPS 2022).
+* [`BNRE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.bnre.BNRE) from
+  Delaunoy A, Hermans J, Rozet F, Wehenkel A, and Louppe G [_Towards Reliable
+  Simulation-Based Inference with Balanced Neural Ratio
+  Estimation_](https://arxiv.org/abs/2208.13624) (NeurIPS 2022).
 
-* [`(S)NRE_C`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_c.SNRE_C) or `NRE-C` from Miller BK, Weniger C, Forré P. [_Contrastive Neural Ratio Estimation_](https://arxiv.org/abs/2210.06170) (NeurIPS 2022).
+* [`(S)NRE_C`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_c.SNRE_C)
+  or `NRE-C` from Miller BK, Weniger C, and Forré P [_Contrastive Neural Ratio
+  Estimation_](https://arxiv.org/abs/2210.06170) (NeurIPS 2022).
 
 ### Neural Variational Inference, amortized (NVI) and sequential (SNVI)
 
-* [`SNVI`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.posteriors.vi_posterior) from Glöckler M, Deistler M, Macke J, [_Variational methods for simulation-based inference_](https://openreview.net/forum?id=kZ0UYdhqkNY) (ICLR 2022).
+* [`SNVI`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.posteriors.vi_posterior)
+  from Glöckler M, Deistler M, and Macke J [_Variational methods for simulation-based
+  inference_](https://openreview.net/forum?id=kZ0UYdhqkNY) (ICLR 2022).
 
 ### Mixed Neural Likelihood Estimation (MNLE)
 
-* [`MNLE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snle.mnle.MNLE) from Boelts J, Lueckmann JM, Gao R, Macke J, [_Flexible and efficient simulation-based inference for models of decision-making_](https://elifesciences.org/articles/77220) (eLife 2022).
+* [`MNLE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snle.mnle.MNLE) from
+  Boelts J, Lueckmann JM, Gao R, and Macke J [_Flexible and efficient simulation-based
+  inference for models of decision-making_](https://elifesciences.org/articles/77220)
+  (eLife 2022).
 
 ## Feedback and Contributions
 
@@ -121,15 +188,18 @@ Durkan's `lfi`. `sbi` runs as a community project. See also
 
 `sbi` has been supported by the German Federal Ministry of Education and Research (BMBF)
 through project ADIMEM (FKZ 01IS18052 A-D), project SiMaLeSAM (FKZ 01IS21055A) and the
-Tübingen AI Center (FKZ 01IS18039A).
+Tübingen AI Center (FKZ 01IS18039A). Since 2024, `sbi` has been supported by the
+appliedAI Institute for Europe.
 
 ## License
 
 [Apache License Version 2.0 (Apache-2.0)](https://www.apache.org/licenses/LICENSE-2.0)
 
 ## Citation
 
-If you use `sbi` consider citing the [sbi software paper](https://doi.org/10.21105/joss.02505), in addition to the original research articles describing the specific sbi-algorithm(s) you are using.
+If you use `sbi`, consider citing the [sbi software
+paper](https://doi.org/10.21105/joss.02505), in addition to the original research
+articles describing the specific sbi-algorithm(s) you are using.
 
 ```latex
 @article{tejero-cantero2020sbi,
@@ -146,5 +216,7 @@ If you use `sbi` consider citing the [sbi software paper](https://doi.org/10.211
 }
 ```
 
-The above citation refers to the original version of the `sbi` project and has a persistent DOI.
-Additionally, new releases of `sbi` are citable via [Zenodo](https://zenodo.org/record/3993098), where we create a new DOI for every release.
+The above citation refers to the original version of the `sbi` project and has a
+persistent DOI. Additionally, new releases of `sbi` are citable via
+[Zenodo](https://zenodo.org/record/3993098), where we create a new DOI for every
+release.
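The amortized/sequential distinction that the refactored README introduces can be made concrete with a toy example: an amortized estimator is fit once on simulated pairs and then queried for many observations without retraining. The sketch below uses a linear least-squares fit as a stand-in for a neural posterior estimator; the simulator and the name `fit_amortized` are hypothetical and not part of the `sbi` API.

```python
import random

def simulate(n, seed=0):
    """Simulate n (theta, x) pairs: theta ~ U(0, 1), x = 2*theta + noise."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        theta = rng.uniform(0.0, 1.0)
        x = 2.0 * theta + rng.gauss(0.0, 0.05)
        pairs.append((theta, x))
    return pairs

def fit_amortized(pairs):
    """Least-squares fit of theta as a linear function of x. This plays
    the role of a trained amortized estimator: it is fit once and can
    then be queried for any observation without retraining."""
    n = len(pairs)
    mx = sum(x for _, x in pairs) / n
    mt = sum(t for t, _ in pairs) / n
    cov = sum((x - mx) * (t - mt) for t, x in pairs)
    var = sum((x - mx) ** 2 for _, x in pairs)
    slope = cov / var
    intercept = mt - slope * mx
    return lambda x_o: intercept + slope * x_o

estimator = fit_amortized(simulate(10_000))

# Reuse the same fitted estimator for several observations, no retraining.
for x_o in (0.4, 1.0, 1.6):
    print(x_o, estimator(x_o))
```

A sequential method would instead spend its simulation budget near one particular observation, trading reusability for simulation efficiency.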
