diff --git a/docs/README.md b/docs/README.md
index a4ba93584..d2dc10858 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -15,12 +15,11 @@ locally, follow these steps:
```
2. Convert the current version of the documentation notebooks to markdown and build the
- website locally using `mike`:
+ website locally using `mkdocs`:
```bash
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/
- jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
- mike serve
+ mkdocs serve
```
### Deployment
diff --git a/docs/docs/contribute.md b/docs/docs/contribute.md
index 092ac9e79..425b7d759 100644
--- a/docs/docs/contribute.md
+++ b/docs/docs/contribute.md
@@ -224,18 +224,16 @@ pip install -e ".[doc]"
Then, you can build the website locally by executing in the `docs` folder
```bash
-mike serve
+mkdocs serve
```
This will build the website on a local host address shown in the terminal. Changes to
the website files or a browser refresh will immediately rebuild the website.
-If you want to build the latest version of the tutorial notebooks, you need to convert
-them to markdown first:
+If you updated the tutorials or examples, you need to convert them to markdown first:
```bash
cd docs
-jupyter nbconvert --to markdown ../examples/*.ipynb --output-dir docs/examples/
jupyter nbconvert --to markdown ../tutorials/*.ipynb --output-dir docs/tutorials/
-mike serve
+mkdocs serve
```
diff --git a/docs/docs/examples/.gitignore b/docs/docs/examples/.gitignore
deleted file mode 100644
index ecce1be80..000000000
--- a/docs/docs/examples/.gitignore
+++ /dev/null
@@ -1,2 +0,0 @@
-*.md
-*.png
diff --git a/docs/docs/index.md b/docs/docs/index.md
index d9f7052a3..9c7c1d5d0 100644
--- a/docs/docs/index.md
+++ b/docs/docs/index.md
@@ -149,6 +149,14 @@ methods](tutorials/16_implemented_methods.md).
inference**
by Deistler, Goncalves & Macke (NeurIPS 2022)
[[Paper]](https://arxiv.org/abs/2210.04815)
+- **Flow matching for scalable simulation-based inference** by Dax, M., Wildberger,
+  J., Buchholz, S., Green, S. R., Macke, J. H., & Schölkopf, B. (NeurIPS 2023)
+  [[Paper]](https://arxiv.org/abs/2305.17161)
+
+- **Compositional Score Modeling for Simulation-Based Inference** by Geffner, T.,
+  Papamakarios, G., & Mnih, A. (ICML 2023)
+  [[Paper]](https://proceedings.mlr.press/v202/geffner23a.html)
+
### Likelihood-estimation (`(S)NLE`)
- **Sequential neural likelihood: Fast likelihood-free inference with
diff --git a/docs/docs/tutorials/index.md b/docs/docs/tutorials/index.md
index 785b646e2..1f06cea8e 100644
--- a/docs/docs/tutorials/index.md
+++ b/docs/docs/tutorials/index.md
@@ -23,37 +23,36 @@ inference.
## Advanced
-- [Multi-round inference](03_multiround_inference.md)
-- [Sampling algorithms in sbi](11_sampler_interface.md)
-- [Custom density estimators](04_density_estimators.md)
-- [Embedding nets for observations](05_embedding_net.md)
-- [SBI with trial-based data](14_iid_data_and_permutation_invariant_embeddings.md)
-- [Handling invalid simulations](08_restriction_estimator.md)
-- [Crafting summary statistics](10_crafting_summary_statistics.md)
-- [Importance sampling posteriors](17_importance_sampled_posteriors.md)
+- [Multi-round inference](02_multiround_inference.md)
+- [Sampling algorithms in sbi](09_sampler_interface.md)
+- [Custom density estimators](03_density_estimators.md)
+- [Embedding nets for observations](04_embedding_networks.md)
+- [SBI with trial-based data](12_iid_data_and_permutation_invariant_embeddings.md)
+- [Handling invalid simulations](06_restriction_estimator.md)
+- [Crafting summary statistics](08_crafting_summary_statistics.md)
+- [Importance sampling posteriors](15_importance_sampled_posteriors.md)
## Diagnostics
-- [Posterior predictive checks](12_diagnostics_posterior_predictive_check.md)
-- [Simulation-based calibration](13_diagnostics_simulation_based_calibration.md)
-- [Density plots and MCMC diagnostics with ArviZ](15_mcmc_diagnostics_with_arviz.md)
-- [Local-C2ST coverage checks](18_diagnostics_lc2st.md)
+- [Posterior predictive checks](10_diagnostics_posterior_predictive_checks.md)
+- [Simulation-based calibration](11_diagnostics_simulation_based_calibration.md)
+- [Local-C2ST coverage checks](13_diagnostics_lc2st.md)
+- [Density plots and MCMC diagnostics with ArviZ](14_mcmc_diagnostics_with_arviz.md)
-
## Analysis
-- [Conditional distributions](07_conditional_distributions.md)
-- [Posterior sensitivity analysis](09_sensitivity_analysis.md)
-- [Plotting functionality](19_plotting_functionality.md)
+- [Conditional distributions](05_conditional_distributions.md)
+- [Posterior sensitivity analysis](07_sensitivity_analysis.md)
+- [Plotting functionality](17_plotting_functionality.md)
## Examples
-- [Hodgkin-Huxley model](../examples/00_HH_simulator.md)
-- [Decision-making model](../examples/01_decision_making_model.md)
+- [Hodgkin-Huxley model](Example_00_HodgkinHuxleyModel.md)
+- [Decision-making model](Example_01_DecisionMakingModel.md)
diff --git a/tutorials/00_getting_started_flexible.ipynb b/tutorials/00_getting_started_flexible.ipynb
index 00732f1e9..350d30304 100644
--- a/tutorials/00_getting_started_flexible.ipynb
+++ b/tutorials/00_getting_started_flexible.ipynb
@@ -11,7 +11,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started_flexible.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started_flexible.ipynb) in the `sbi` repository."
+ "Note, you can find the original version of this notebook at [tutorials/00_getting_started_flexible.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/00_getting_started_flexible.ipynb) in the `sbi` repository."
]
},
{
@@ -423,10 +423,18 @@
"source": [
"## Next steps\n",
"\n",
- "To learn more about the capabilities of `sbi`, you can head over to the tutorial on [inferring parameters for multiple observations ](https://sbi-dev.github.io/sbi/tutorial/01_gaussian_amortized/) which introduces the concept of amortization. \n",
+ "To learn more about the capabilities of `sbi`, you can head over to the tutorial\n",
+ "[01_gaussian_amortized](01_gaussian_amortized.md), which covers inferring parameters for multiple\n",
+ "observations without retraining.\n",
"\n",
- "Alternatively, for an example with an __actual__ simulator, you can read our [example for a scientific simulator from neuroscience](https://sbi-dev.github.io/sbi/examples/00_HH_simulator/)."
+ "Alternatively, for an example with an __actual__ simulator, you can read our example\n",
+ "for a scientific simulator from neuroscience under [Example_00_HodgkinHuxleyModel](Example_00_HodgkinHuxleyModel.md)."
]
- }
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": []
}
],
"metadata": {
diff --git a/tutorials/01_gaussian_amortized.ipynb b/tutorials/01_gaussian_amortized.ipynb
index fe999176c..f2a613aa2 100644
--- a/tutorials/01_gaussian_amortized.ipynb
+++ b/tutorials/01_gaussian_amortized.ipynb
@@ -11,7 +11,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
+ "Note, you can find the original version of this notebook at [tutorials/01_gaussian_amortized.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/01_gaussian_amortized.ipynb) in the `sbi` repository."
]
},
{
@@ -258,8 +258,8 @@
"source": [
"# Next steps\n",
"\n",
- "Now that you got familiar with amortization and are probably good to go and have a first shot at applying `sbi` to your own inference problem. If you want to learn more, we recommend checking out our tutorial on\n",
- "[multiround inference ](https://sbi-dev.github.io/sbi/tutorial/03_multiround_inference/) which aims to make inference for a single observation more sampling efficient."
+ "Now that you are familiar with amortization, you are probably ready to have a first shot at applying `sbi` to your own inference problem. If you want to learn more, we recommend checking out our tutorial\n",
+ "[02_multiround_inference](02_multiround_inference.md), which aims to make inference for a single observation more simulation-efficient."
]
}
],
diff --git a/tutorials/03_multiround_inference.ipynb b/tutorials/02_multiround_inference.ipynb
similarity index 99%
rename from tutorials/03_multiround_inference.ipynb
rename to tutorials/02_multiround_inference.ipynb
index 002835f7a..2fe4011ab 100644
--- a/tutorials/03_multiround_inference.ipynb
+++ b/tutorials/02_multiround_inference.ipynb
@@ -17,7 +17,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/03_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/03_multiround_inference.ipynb) in the `sbi` repository.\n"
+ "Note, you can find the original version of this notebook at [tutorials/02_multiround_inference.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/02_multiround_inference.ipynb) in the `sbi` repository.\n"
]
},
{
@@ -358,7 +358,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.8.19"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/tutorials/04_density_estimators.ipynb b/tutorials/03_density_estimators.ipynb
similarity index 96%
rename from tutorials/04_density_estimators.ipynb
rename to tutorials/03_density_estimators.ipynb
index c180513a3..4c64ffbf1 100644
--- a/tutorials/04_density_estimators.ipynb
+++ b/tutorials/03_density_estimators.ipynb
@@ -101,7 +101,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "It is also possible to pass an `embedding_net` to `posterior_nn()` which learn summary statistics from high-dimensional simulation outputs. You can find a more detailed tutorial on this [here](https://sbi-dev.github.io/sbi/tutorial/05_embedding_net/).\n"
+ "It is also possible to pass an `embedding_net` to `posterior_nn()`, which learns summary\n",
+ "statistics from high-dimensional simulation outputs. You can find a more detailed\n",
+ "tutorial on this in [04_embedding_networks](04_embedding_networks.md).\n"
]
},
{
diff --git a/tutorials/05_embedding_net.ipynb b/tutorials/04_embedding_networks.ipynb
similarity index 99%
rename from tutorials/05_embedding_net.ipynb
rename to tutorials/04_embedding_networks.ipynb
index 1f7aa54c0..5abfd99ce 100644
--- a/tutorials/05_embedding_net.ipynb
+++ b/tutorials/04_embedding_networks.ipynb
@@ -7,7 +7,7 @@
"# Embedding nets for observations\n",
"\n",
"!!! note\n",
- " You can find the original version of this notebook at [tutorials/05_embedding_net.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/05_embedding_net.ipynb) in the `sbi` repository.\n",
+ " You can find the original version of this notebook at [tutorials/04_embedding_networks.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/04_embedding_networks.ipynb) in the `sbi` repository.\n",
"\n",
"## Introduction\n",
"\n",
@@ -481,7 +481,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.10.14"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/tutorials/07_conditional_distributions.ipynb b/tutorials/05_conditional_distributions.ipynb
similarity index 99%
rename from tutorials/07_conditional_distributions.ipynb
rename to tutorials/05_conditional_distributions.ipynb
index a98549f88..6694e0988 100644
--- a/tutorials/07_conditional_distributions.ipynb
+++ b/tutorials/05_conditional_distributions.ipynb
@@ -15,7 +15,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Note, you can find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/tutorials/07_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/07_conditional_distributions.ipynb) in the `sbi` repository.\n"
+ "Note, you can find the original version of this notebook at [tutorials/05_conditional_distributions.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/05_conditional_distributions.ipynb) in the `sbi` repository.\n"
]
},
{
@@ -26182,7 +26182,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Now we want to build the conditional potential (please read throught the [sampler interface tutorial](https://www.mackelab.org/sbi/tutorial/11_sampler_interface/) for an explanation of potential functions). For this, we have to pass a `condition`. In our case, we want to condition the forth parameter on $\\theta_4=0.2$. Regardless of how many parameters one wants to condition on, in `sbi`, one has to pass a `condition` value for all parameters. The first three values will simply be ignored. We can tell the algorithm which parameters should be kept fixed and which ones should be sampled with the argument `dims_to_sample`.\n"
+ "Now we want to build the conditional potential (please read through the tutorial [09_sampler_interface](09_sampler_interface.md) for an explanation of potential functions). For this, we have to pass a `condition`. In our case, we want to condition the fourth parameter on $\theta_4=0.2$. Regardless of how many parameters one wants to condition on, in `sbi`, one has to pass a `condition` value for all parameters. The first three values will simply be ignored. We can tell the algorithm which parameters should be kept fixed and which ones should be sampled with the argument `dims_to_sample`.\n"
]
},
{
@@ -26322,7 +26322,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.11.8"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/tutorials/08_restriction_estimator.ipynb b/tutorials/06_restriction_estimator.ipynb
similarity index 99%
rename from tutorials/08_restriction_estimator.ipynb
rename to tutorials/06_restriction_estimator.ipynb
index c0e3774f7..7610c3cfa 100644
--- a/tutorials/08_restriction_estimator.ipynb
+++ b/tutorials/06_restriction_estimator.ipynb
@@ -378,7 +378,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.8.18"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/tutorials/09_sensitivity_analysis.ipynb b/tutorials/07_sensitivity_analysis.ipynb
similarity index 99%
rename from tutorials/09_sensitivity_analysis.ipynb
rename to tutorials/07_sensitivity_analysis.ipynb
index 5065f306f..5f40789f3 100644
--- a/tutorials/09_sensitivity_analysis.ipynb
+++ b/tutorials/07_sensitivity_analysis.ipynb
@@ -471,7 +471,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.8.18"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/tutorials/10_crafting_summary_statistics.ipynb b/tutorials/08_crafting_summary_statistics.ipynb
similarity index 99%
rename from tutorials/10_crafting_summary_statistics.ipynb
rename to tutorials/08_crafting_summary_statistics.ipynb
index a7d09503e..557a319b1 100644
--- a/tutorials/10_crafting_summary_statistics.ipynb
+++ b/tutorials/08_crafting_summary_statistics.ipynb
@@ -11,7 +11,12 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Many simulators produce outputs that are high-dimesional. For example, a simulator might generate a time series or an image. In a [previous tutorial](https://sbi-dev.github.io/sbi/tutorial/05_embedding_net/), we discussed how a neural networks can be used to learn summary statistics from such data. In this notebook, we will instead focus on hand-crafting summary statistics. We demonstrate that the choice of summary statistics can be crucial for the performance of the inference algorithm.\n"
+ "Many simulators produce outputs that are high-dimensional. For example, a simulator might\n",
+ "generate a time series or an image. In the tutorial on [04_embedding_networks](04_embedding_networks.md), we discussed how\n",
+ "neural networks can be used to learn summary statistics from such data. In this\n",
+ "notebook, we will instead focus on hand-crafting summary statistics. We demonstrate that\n",
+ "the choice of summary statistics can be crucial for the performance of the inference\n",
+ "algorithm.\n"
]
},
{
@@ -781,7 +786,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.11.7"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/tutorials/11_sampler_interface.ipynb b/tutorials/09_sampler_interface.ipynb
similarity index 98%
rename from tutorials/11_sampler_interface.ipynb
rename to tutorials/09_sampler_interface.ipynb
index bcbb5a3ef..7b0a6c3c9 100644
--- a/tutorials/11_sampler_interface.ipynb
+++ b/tutorials/09_sampler_interface.ipynb
@@ -6,8 +6,6 @@
"source": [
"# Sampling algorithms in `sbi`\n",
"\n",
- "Note: this tutorial requires that the user is already familiar with the [flexible interface](https://sbi-dev.github.io/sbi/tutorial/02_flexible_interface/).\n",
- "\n",
"`sbi` implements three methods: SNPE, SNLE, and SNRE. When using SNPE, the trained neural network directly approximates the posterior. Thus, sampling from the posterior can be done by sampling from the trained neural network. The neural networks trained in SNLE and SNRE approximate the likelihood(-ratio). Thus, in order to draw samples from the posterior, one has to perform additional sampling steps, e.g. Markov-chain Monte-Carlo (MCMC). In `sbi`, the implemented samplers are:\n",
"\n",
"- Markov-chain Monte-Carlo (MCMC)\n",
@@ -358,7 +356,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.8.18"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/tutorials/12_diagnostics_posterior_predictive_check.ipynb b/tutorials/10_diagnostics_posterior_predictive_checks.ipynb
similarity index 100%
rename from tutorials/12_diagnostics_posterior_predictive_check.ipynb
rename to tutorials/10_diagnostics_posterior_predictive_checks.ipynb
diff --git a/tutorials/13_diagnostics_simulation_based_calibration.ipynb b/tutorials/11_diagnostics_simulation_based_calibration.ipynb
similarity index 99%
rename from tutorials/13_diagnostics_simulation_based_calibration.ipynb
rename to tutorials/11_diagnostics_simulation_based_calibration.ipynb
index ed08198a6..68ea49b29 100644
--- a/tutorials/13_diagnostics_simulation_based_calibration.ipynb
+++ b/tutorials/11_diagnostics_simulation_based_calibration.ipynb
@@ -6,7 +6,16 @@
"source": [
"# Simulation-based Calibration in SBI\n",
"\n",
- "After a density estimator has been trained with simulated data to obtain a posterior, the estimator should be made subject to several **diagnostic tests**. This needs to be performed before being used for inference given the actual observed data. _Posterior Predictive Checks_ (see [previous tutorial](https://sbi-dev.github.io/sbi/tutorial/12_diagnostics_posterior_predictive_check/)) provide one way to \"critique\" a trained estimator based on its predictive performance. Another important approach to such diagnostics is simulation-based calibration as developed by [Cook et al, 2006](https://www.tandfonline.com/doi/abs/10.1198/106186006X136976) and [Talts et al, 2018](https://arxiv.org/abs/1804.06788). This tutorial will demonstrate and teach you this technique with sbi.\n",
+ "After a density estimator has been trained with simulated data to obtain a posterior,\n",
+ "the estimator should be made subject to several **diagnostic tests**. This needs to be\n",
+ "performed before being used for inference given the actual observed data. _Posterior\n",
+ "Predictive Checks_ (see [10_diagnostics_posterior_predictive_checks\n",
+ "tutorial](10_diagnostics_posterior_predictive_checks.md)) provide one way to \"critique\" a trained\n",
+ "estimator based on its predictive performance. Another important approach to such\n",
+ "diagnostics is simulation-based calibration as developed by [Cook et al,\n",
+ "2006](https://www.tandfonline.com/doi/abs/10.1198/106186006X136976) and [Talts et al,\n",
+ "2018](https://arxiv.org/abs/1804.06788). This tutorial will demonstrate and teach you\n",
+ "this technique with sbi.\n",
"\n",
"**Simulation-based calibration** (SBC) provides a (qualitative) view and a quantitive measure to check, whether the variances of the posterior are balanced, i.e., neither over-confident nor under-confident. As such, SBC can be viewed as a necessary condition (but not sufficient) for a valid inference algorithm: If SBC checks fail, this tells you that your inference is invalid. If SBC checks pass, this is no guarantee that the posterior estimation is working.\n"
]
@@ -38,7 +47,7 @@
"\n",
"**SBC can inform us whether we are not wrong.** However, it cannot tell us whether we are right, i.e., SBC checks a necessary condition. For example, imagine you run SBC using the prior as a posterior. The ranks would be perfectly uniform. But the inference would be wrong as this scenario would only occur if the posterior is uninformative.\n",
"\n",
- "**The Posterior Predictive Checks (see [tutorial 12](https://sbi-dev.github.io/sbi/tutorial/12_diagnostics_posterior_predictive_check/)) can be seen as the complementary sufficient check** for the posterior (only as a methaphor, no theoretical guarantees here). Using the prior as a posterior and then doing predictive checks would clearly show that inference failed.\n",
+ "**Posterior Predictive Checks can be seen as the complementary sufficient check** for the posterior (only as a metaphor, no theoretical guarantees here). Using the prior as a posterior and then doing predictive checks would clearly show that inference failed.\n",
"\n",
"To summarize, SBC can:\n",
"\n",
@@ -1115,11 +1124,6 @@
"interpreting TARP like SBC and complementing coverage checks with posterior predictive\n",
"checks."
]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": []
}
],
"metadata": {
diff --git a/tutorials/14_iid_data_and_permutation_invariant_embeddings.ipynb b/tutorials/12_iid_data_and_permutation_invariant_embeddings.ipynb
similarity index 99%
rename from tutorials/14_iid_data_and_permutation_invariant_embeddings.ipynb
rename to tutorials/12_iid_data_and_permutation_invariant_embeddings.ipynb
index 65d4b2386..64303599d 100644
--- a/tutorials/14_iid_data_and_permutation_invariant_embeddings.ipynb
+++ b/tutorials/12_iid_data_and_permutation_invariant_embeddings.ipynb
@@ -881,13 +881,6 @@
" fontsize=12,\n",
");"
]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
}
],
"metadata": {
@@ -906,7 +899,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.10.14"
+ "version": "3.12.0"
},
"toc": {
"base_numbering": 1,
diff --git a/tutorials/18_diagnostics_lc2st.ipynb b/tutorials/13_diagnostics_lc2st.ipynb
similarity index 99%
rename from tutorials/18_diagnostics_lc2st.ipynb
rename to tutorials/13_diagnostics_lc2st.ipynb
index bcf96caf2..f7476c41e 100644
--- a/tutorials/18_diagnostics_lc2st.ipynb
+++ b/tutorials/13_diagnostics_lc2st.ipynb
@@ -9,9 +9,9 @@
"\n",
" After a density estimator has been trained with simulated data to obtain a posterior, the estimator should be made subject to several diagnostic tests. This diagnostic should be performed before the posterior is used for inference given the actual observed data. \n",
" \n",
- "*Posterior Predictive Checks* (see [previous tutorial](https://sbi-dev.github.io/sbi/tutorial/12_diagnostics_posterior_predictive_check/)) provide one way to \"critique\" a trained estimator via its predictive performance. \n",
+ "*Posterior Predictive Checks* (see [tutorial 10](10_diagnostics_posterior_predictive_checks.md)) provide one way to \"critique\" a trained estimator via its predictive performance. \n",
" \n",
- "Another approach is *Simulation-Based Calibration* (SBC, see [previous tutorial](https://sbi-dev.github.io/sbi/tutorial/13_diagnostics_simulation_based_calibration/)). SBC evaluates whether the estimated posterior is balanced, i.e., neither over-confident nor under-confident. These checks are performed ***in expectation (on average) over the observation space***, i.e. they are performed on a set of $(\\theta,x)$ pairs sampled from the joint distribution over simulator parameters $\\theta$ and corresponding observations $x$. As such, SBC is a ***global validation method*** that can be viewed as a necessary condition (but not sufficient) for a valid inference algorithm: If SBC checks fail, this tells you that your inference is invalid. If SBC checks pass, *this is no guarantee that the posterior estimation is working*.\n",
+ "Another approach is *Simulation-Based Calibration* (SBC, see [tutorial 11](11_diagnostics_simulation_based_calibration.md)). SBC evaluates whether the estimated posterior is balanced, i.e., neither over-confident nor under-confident. These checks are performed ***in expectation (on average) over the observation space***, i.e. they are performed on a set of $(\\theta,x)$ pairs sampled from the joint distribution over simulator parameters $\\theta$ and corresponding observations $x$. As such, SBC is a ***global validation method*** that can be viewed as a necessary condition (but not sufficient) for a valid inference algorithm: If SBC checks fail, this tells you that your inference is invalid. If SBC checks pass, *this is no guarantee that the posterior estimation is working*.\n",
"\n",
"**Local Classifier Two-Sample Tests** ($\\ell$-C2ST) as developed by [Linhart et al, 2023](https://arxiv.org/abs/2306.03580) present a new ***local validation method*** that allows to evaluate the correctness of the posterior estimator ***at a fixed observation***, i.e. they work on a single $(\\theta,x)$ pair. They provide necessary *and sufficient* conditions for the validity of the SBI algorithm, as well as easy-to-interpret qualitative and quantitative diagnostics. \n",
" \n",
diff --git a/tutorials/15_mcmc_diagnostics_with_arviz.ipynb b/tutorials/14_mcmc_diagnostics_with_arviz.ipynb
similarity index 99%
rename from tutorials/15_mcmc_diagnostics_with_arviz.ipynb
rename to tutorials/14_mcmc_diagnostics_with_arviz.ipynb
index ab3a11379..e79ba2784 100644
--- a/tutorials/15_mcmc_diagnostics_with_arviz.ipynb
+++ b/tutorials/14_mcmc_diagnostics_with_arviz.ipynb
@@ -65,7 +65,8 @@
"source": [
"## Train MNLE to approximate the likelihood\n",
"\n",
- "For this tutorial, we will use a simple simulator with two parameters. For details see the [example on the decision making model](https://sbi-dev.github.io/sbi/examples/01_decision_making_model/).\n",
+ "For this tutorial, we will use a simple simulator with two parameters. For details see\n",
+ "the [example on the decision making model](Example_01_DecisionMakingModel.md).\n",
"\n",
"Here, we pass `mcmc_method=\"nuts\"` in order to use the underlying [`pyro` No-U-turn sampler](https://docs.pyro.ai/en/1.8.1/mcmc.html#nuts), but it would work as well with other samplers (e.g. \"slice_np_vectorized\", \"hmc\").\n",
"\n",
@@ -438,13 +439,6 @@
" figsize=(10, 10),\n",
")"
]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
}
],
"metadata": {
@@ -463,7 +457,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.8.18"
+ "version": "3.12.0"
},
"vscode": {
"interpreter": {
diff --git a/tutorials/17_importance_sampled_posteriors.ipynb b/tutorials/15_importance_sampled_posteriors.ipynb
similarity index 99%
rename from tutorials/17_importance_sampled_posteriors.ipynb
rename to tutorials/15_importance_sampled_posteriors.ipynb
index 7c17ebf5b..dc40d3205 100644
--- a/tutorials/17_importance_sampled_posteriors.ipynb
+++ b/tutorials/15_importance_sampled_posteriors.ipynb
@@ -1,15 +1,11 @@
{
"cells": [
{
- "cell_type": "code",
- "execution_count": null,
+ "cell_type": "markdown",
"id": "cf9f505b-f478-4da4-9ccd-5208aa806825",
"metadata": {},
- "outputs": [],
"source": [
- "\n",
- "\n",
- "### TLDR:"
+ "# Refining posterior estimates with importance sampling"
]
},
{
diff --git a/tutorials/19_plotting_functionality.ipynb b/tutorials/17_plotting_functionality.ipynb
similarity index 99%
rename from tutorials/19_plotting_functionality.ipynb
rename to tutorials/17_plotting_functionality.ipynb
index 486603272..7adad60f8 100644
--- a/tutorials/19_plotting_functionality.ipynb
+++ b/tutorials/17_plotting_functionality.ipynb
@@ -1,24 +1,5 @@
{
"cells": [
- {
- "cell_type": "code",
- "execution_count": 1,
- "metadata": {},
- "outputs": [
- {
- "name": "stderr",
- "output_type": "stream",
- "text": [
- "WARNING (pytensor.tensor.blas): Using NumPy C-API based implementation for BLAS functions.\n"
- ]
- }
- ],
- "source": [
- "import torch\n",
- "\n",
- "from sbi.analysis.plot import marginal_plot, pairplot"
- ]
- },
{
"cell_type": "markdown",
"metadata": {},
@@ -41,6 +22,8 @@
"metadata": {},
"outputs": [],
"source": [
+ "import torch\n",
+ "from sbi.analysis import pairplot\n",
"from toy_posterior_for_07_cc import ExamplePosterior\n",
"\n",
"posterior = ExamplePosterior()\n",
diff --git a/examples/00_HH_simulator.ipynb b/tutorials/Example_00_HodgkinHuxleyModel.ipynb
similarity index 99%
rename from examples/00_HH_simulator.ipynb
rename to tutorials/Example_00_HodgkinHuxleyModel.ipynb
index 6a731aa38..e343f08a8 100644
--- a/examples/00_HH_simulator.ipynb
+++ b/tutorials/Example_00_HodgkinHuxleyModel.ipynb
@@ -19,7 +19,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Note, you find the original version of this notebook at [https://github.com/sbi-dev/sbi/blob/main/examples/00_HH_simulator.ipynb](https://github.com/sbi-dev/sbi/blob/main/examples/00_HH_simulator.ipynb) in the `sbi` repository.\n"
+ "Note, you can find the original version of this notebook in the `sbi` repository under\n",
+ "[tutorials/Example_00_HodgkinHuxleyModel.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/Example_00_HodgkinHuxleyModel.ipynb).\n"
]
},
{
@@ -664,7 +665,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
- "version": "3.8.18"
+ "version": "3.12.0"
}
},
"nbformat": 4,
diff --git a/examples/01_decision_making_model.ipynb b/tutorials/Example_01_DecisionMakingModel.ipynb
similarity index 99%
rename from examples/01_decision_making_model.ipynb
rename to tutorials/Example_01_DecisionMakingModel.ipynb
index b6d8a0c12..645e97e5c 100644
--- a/examples/01_decision_making_model.ipynb
+++ b/tutorials/Example_01_DecisionMakingModel.ipynb
@@ -15,6 +15,14 @@
"inference in such models with the `MNLE` method.\n"
]
},
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Note, you can find the original version of this notebook in the `sbi` repository under\n",
+ "[tutorials/Example_01_DecisionMakingModel.ipynb](https://github.com/sbi-dev/sbi/blob/main/tutorials/Example_01_DecisionMakingModel.ipynb)."
+ ]
+ },
{
"cell_type": "markdown",
"metadata": {},
diff --git a/examples/HH_helper_functions.py b/tutorials/HH_helper_functions.py
similarity index 100%
rename from examples/HH_helper_functions.py
rename to tutorials/HH_helper_functions.py