`sbi` is a Python package for simulation-based inference, designed to meet the needs of both researchers and practitioners. Whether you need fine-grained control or an easy-to-use interface, `sbi` has you covered.

With `sbi`, you can perform simulation-based inference (SBI) using a Bayesian approach: Given a simulator that models a real-world process, SBI estimates the full posterior distribution over the simulator's parameters based on observed data. This distribution indicates the most likely parameter values while additionally quantifying uncertainty and revealing potential interactions between parameters.

### Key Features of `sbi`

`sbi` offers a blend of flexibility and ease of use:

- **Low-Level Interfaces**: For those who require maximum control over the inference process, `sbi` provides low-level interfaces that allow you to fine-tune many aspects of your work.
- **High-Level Interfaces**: If you prefer simplicity and efficiency, `sbi` also offers high-level interfaces that enable quick and easy implementation of complex inference tasks; see the sketch after this list.
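
As a minimal sketch of the high-level route (the toy `simulator` below is a made-up stand-in for your own model; `infer` is `sbi`'s high-level convenience wrapper):

```python
import torch
from sbi.inference import infer
from sbi.utils import BoxUniform

# Toy simulator: any callable mapping parameters to simulated data works.
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

# One call runs the simulations and trains a neural posterior estimator.
posterior = infer(simulator, prior, method="SNPE", num_simulations=1000)
```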

In addition, `sbi` supports a wide range of state-of-the-art inference algorithms (see below for a list of implemented methods):

- **Amortized Methods**: These methods enable the reuse of posterior estimators across multiple observations without the need to retrain, as illustrated in the sketch after this list.
- **Sequential Methods**: These methods focus on individual observations, optimizing the number of simulations required.
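
To make the amortized workflow concrete, here is a self-contained sketch (the prior, toy simulator, and observations below are stand-ins for your own problem):

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Stand-in prior and pre-simulated training data (replace with your own).
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))
theta = prior.sample((2000,))
x = theta + 0.1 * torch.randn_like(theta)  # toy simulator: noisy identity

# Train a single amortized posterior estimator ...
inference = SNPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

# ... and reuse it for several observations without retraining.
for x_o in (torch.zeros(3), torch.ones(3)):
    samples = posterior.sample((1000,), x=x_o)
```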

Beyond inference, `sbi` also provides:

- **Validation Tools**: Built-in methods to validate and verify the accuracy of your inferred posteriors.
- **Plotting and Analysis Tools**: Comprehensive functions for visualizing and analyzing results, helping you interpret the posterior distributions with ease (see the example below).
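
For instance, posterior samples can be inspected with the built-in `pairplot` (a minimal sketch; the random `posterior_samples` merely stand in for samples drawn from a trained posterior):

```python
import torch
from sbi.analysis import pairplot

# Stand-in for real posterior samples, shape (num_samples, num_parameters).
posterior_samples = torch.randn(1000, 3)
fig, axes = pairplot(posterior_samples, figsize=(6, 6))
```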

Getting started with `sbi` is straightforward, requiring only a few lines of code:

```python
from sbi.inference import SNPE

# Given: parameters theta and corresponding simulations x
inference = SNPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()
```
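
Given an observation `x_o` (a stand-in for your measured data), the resulting posterior can then be sampled and evaluated, e.g.:

```python
samples = posterior.sample((1000,), x=x_o)     # parameter samples given x_o
log_prob = posterior.log_prob(samples, x=x_o)  # their posterior log-density
```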

### Installation

`sbi` requires Python 3.9 or higher. While a GPU isn't necessary, it can improve performance in some cases. We recommend using a virtual environment with [`conda`](https://docs.conda.io/en/latest/miniconda.html) for an easy setup.

```commandline
# Create an environment for sbi (indicate Python 3.9 or higher); activate it
conda create -n sbi_env python=3.9 && conda activate sbi_env
# Install sbi from PyPI inside the environment
pip install sbi
```

You can also access and run the `sbi` tutorials directly in your browser by opening a [Codespace](https://docs.github.com/en/codespaces/overview) on this repo. To do so, click the green "Code" button and select "Open with Codespaces". This provides a fully functional environment where you can explore `sbi` through Jupyter notebooks.

## Inference Algorithms

The following inference algorithms are currently available. You can find instructions on how to run each of these methods [here](https://sbi-dev.github.io/sbi/latest/tutorial/16_implemented_methods/).

### Neural Posterior Estimation: amortized (NPE) and sequential (SNPE)

* [`SNPE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snpe.snpe_a.SNPE_A) (including amortized single-round `NPE`) from Papamakarios G and Murray I [_Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation_](https://proceedings.neurips.cc/paper/2016/hash/6aca97005c68f1206823815f66102863-Abstract.html) (NeurIPS 2016).
* [`SNPE_C`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snpe.snpe_c.SNPE_C) or `APT` from Greenberg D, Nonnenmacher M, and Macke J [_Automatic Posterior Transformation for likelihood-free inference_](https://arxiv.org/abs/1905.07488) (ICML 2019).
* `TSNPE` from Deistler M, Goncalves P, and Macke J [_Truncated proposals for scalable and hassle-free simulation-based inference_](https://arxiv.org/abs/2210.04815) (NeurIPS 2022).
* [`NPSE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.npse.npse.NPSE) from Geffner T, Papamakarios G, and Mnih A [_Compositional score modeling for simulation-based inference_](https://arxiv.org/abs/2209.14249) (ICML 2023).

### Neural Likelihood Estimation: amortized (NLE) and sequential (SNLE)

* [`SNLE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snle.snle_a.SNLE_A) or just `SNL` from Papamakarios G, Sterratt DC, and Murray I [_Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows_](https://arxiv.org/abs/1805.07226) (AISTATS 2019).

### Neural Ratio Estimation: amortized (NRE) and sequential (SNRE)

* [`(S)NRE_A`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_a.SNRE_A) or `AALR` from Hermans J, Begy V, and Louppe G [_Likelihood-free Inference with Amortized Approximate Likelihood Ratios_](https://arxiv.org/abs/1903.04057) (ICML 2020).
* [`(S)NRE_B`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_b.SNRE_B) or `SRE` from Durkan C, Murray I, and Papamakarios G [_On Contrastive Learning for Likelihood-free Inference_](https://arxiv.org/abs/2002.03712) (ICML 2020).
* [`BNRE`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.bnre.BNRE) from Delaunoy A, Hermans J, Rozet F, Wehenkel A, and Louppe G [_Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation_](https://arxiv.org/abs/2208.13624) (NeurIPS 2022).
* [`(S)NRE_C`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.snre.snre_c.SNRE_C) or `NRE-C` from Miller BK, Weniger C, and Forré P [_Contrastive Neural Ratio Estimation_](https://arxiv.org/abs/2210.06170) (NeurIPS 2022).

### Neural Variational Inference, amortized (NVI) and sequential (SNVI)

* [`SNVI`](https://sbi-dev.github.io/sbi/reference/#sbi.inference.posteriors.vi_posterior) from Glöckler M, Deistler M, and Macke J [_Variational methods for simulation-based inference_](https://openreview.net/forum?id=kZ0UYdhqkNY) (ICLR 2022).

### Mixed Neural Likelihood Estimation (MNLE)

* `MNLE` from Boelts J, Lueckmann JM, Gao R, and Macke J [_Flexible and efficient simulation-based inference for models of decision-making_](https://elifesciences.org/articles/77220) (eLife 2022).

## Feedback and Contributions

`sbi` started as a fork of Conor M. Durkan's `lfi` and runs as a community project.

`sbi` has been supported by the German Federal Ministry of Education and Research (BMBF) through project ADIMEM (FKZ 01IS18052 A-D), project SiMaLeSAM (FKZ 01IS21055A) and the Tübingen AI Center (FKZ 01IS18039A). Since 2024, `sbi` has been supported by the appliedAI Institute for Europe.

## License

[Apache License Version 2.0 (Apache-2.0)](https://www.apache.org/licenses/LICENSE-2.0)

## Citation

If you use `sbi`, consider citing the [sbi software paper](https://doi.org/10.21105/joss.02505), in addition to the original research articles describing the specific sbi-algorithm(s) you are using.

```latex
@article{tejero-cantero2020sbi,
  doi = {10.21105/joss.02505},
  url = {https://doi.org/10.21105/joss.02505},
  year = {2020},
  publisher = {The Open Journal},
  volume = {5},
  number = {52},
  pages = {2505},
  author = {Alvaro Tejero-Cantero and Jan Boelts and Michael Deistler and Jan-Matthis Lueckmann and Conor Durkan and Pedro J. Gon\c{c}alves and David S. Greenberg and Jakob H. Macke},
  title = {sbi: A toolkit for simulation-based inference},
  journal = {Journal of Open Source Software}
}
```

The above citation refers to the original version of the `sbi` project and has a persistent DOI. Additionally, new releases of `sbi` are citable via [Zenodo](https://zenodo.org/record/3993098), where we create a new DOI for every release.