Commit 5bcabc1
docs: update README.md
1 parent 03311ee commit 5bcabc1

File tree

1 file changed: +15 −15 lines changed

README.md

Lines changed: 15 additions & 15 deletions
````diff
@@ -20,14 +20,15 @@ net = BayesianModule(net) # 'net' is now a BNN
 
 The resulting model behaves exactly like any standard `nn.Module`, but instead of learning fixed weight values, your model now learns distributions from which weight values are sampled during training and inference, allowing it to capture uncertainty in its parameters and predictions.
 
-Figure
-
+<p align="center">
+<img src="https://raw.githubusercontent.com/raphbrodeur/torchbayesian/main/docs/images/bnn_1d_regression.png" width="50%">
+</p>
 
 ## Key Features
 
 - **One line to "BNN-ize" any model** — Turn any already existing PyTorch model into a BNN with a single line of code. No need to rewrite your model, redefine layers, or modify your existing architecture.
 - **Truly compatible with all layers** — Unlike other "BNN-izers" that swap specific supported layers for variational versions, torchbayesian converts every trainable parameter in your model into a variational posterior module, actually making the entire model Bayesian, not just parts of it.
-- **PyTorch-native design** — Works entirely within the PyTorch framework; training, inference, evaluation remain unchanged. Fully compatible with other PyTorch-based tools such as [Lightning](https://lightning.ai/docs/pytorch/stable/), [TorchMetrics](https://lightning.ai/docs/torchmetrics/stable/), and [MONAI](https://monai.io/).
+- **PyTorch-native design** — Works entirely within PyTorch's framework; training, inference, evaluation remain unchanged. Fully compatible with other PyTorch-based tools such as [Lightning](https://lightning.ai/docs/pytorch/stable/), [TorchMetrics](https://lightning.ai/docs/torchmetrics/stable/), and [MONAI](https://monai.io/).
 - **Custom priors and variational posteriors** — Specify priors and variational posteriors directly as arguments. You can also define your own custom priors and variational posteriors and register them with the API using a simple decorator logic. This allows both plug-and-play use and deep customization without having to touch the core library.
 - **KL divergence easily accessible** — Retrieve the model's KL divergence at any point using the `.kl_divergence()` method of `bnn.BayesianModule`.
 - **Flexible KL computation** — When analytic computation is not available for some pair of variational posterior and prior, falls back to an estimation using Monte-Carlo sampling. This ensures generality and support for arbitrary user-defined distributions.
````
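The "learns distributions from which weight values are sampled" idea above can be illustrated outside the library. In the common mean-field Gaussian setup, each weight has a learnable mean and standard deviation, samples are drawn with the reparameterization trick, and the KL term against a standard-normal prior has a closed form. The sketch below is illustrative only; the helper names are hypothetical and this is not torchbayesian's actual implementation:

```python
import numpy as np

def sample_weight(mu, rho, rng):
    # Reparameterization trick: w = mu + sigma * eps, eps ~ N(0, 1),
    # with sigma = softplus(rho) to keep the standard deviation positive.
    sigma = np.log1p(np.exp(rho))
    return mu + sigma * rng.standard_normal(np.shape(mu))

def kl_to_std_normal(mu, rho):
    # Closed-form KL( N(mu, sigma^2) || N(0, 1) ), summed over all weights.
    sigma = np.log1p(np.exp(rho))
    return float(np.sum(-np.log(sigma) + 0.5 * (sigma ** 2 + mu ** 2) - 0.5))

rng = np.random.default_rng(0)
mu = np.zeros(3)
rho = np.full(3, np.log(np.e - 1.0))  # softplus(rho) = 1, i.e. q = N(0, 1)
w = sample_weight(mu, rho, rng)       # a fresh weight sample on every call
kl = kl_to_std_normal(mu, rho)        # 0 here, since q equals the prior
```

During variational training this KL term is added to the data loss; torchbayesian exposes the model-wide value through the `.kl_divergence()` method mentioned in the feature list.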
````diff
@@ -40,27 +41,26 @@ torchbayesian works with Python 3.10+ and has a direct dependency on [PyTorch](h
 
 To install the current release, run:
 
-...
+```bash
+pip install torchbayesian
+```
 
 ## Getting started
 
-How to use it
-
-How it works
+> This `README.md` is still a work in progress. Further details will be added.
 
-KL divergence
+A working example is available at [torchbayesian/examples](https://github.com/raphbrodeur/torchbayesian/tree/main/examples).
 
-Example .py
+The [Kullback-Leibler divergence](https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence) of the model can be retrieved at any point using the `.kl_divergence()` method of `bnn.BayesianModule`.
 
-Factories
+Different priors and variational posteriors can be used.
 
 ## Motivation
 
-...
-
-## License
-
-...
+Modern deep learning models are remarkably powerful, but they often make predictions with high confidence even when they're wrong.
+In safety-critical domains such as healthcare, finance, or autonomous systems, this overconfidence makes it difficult to trust model outputs and impedes automation.
+`torchbayesian` was created to make Bayesian Neural Networks (BNNs) and uncertainty quantification in PyTorch as simple as possible.
+The goal is to lower the barrier to practical Bayesian deep learning, enabling researchers and practitioners to integrate principled uncertainty estimation directly into their existing workflows.
 
 ## Citation
 
````
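The Monte-Carlo fallback mentioned under "Flexible KL computation" is the standard estimator KL(q || p) ≈ (1/N) Σ [log q(w_i) - log p(w_i)] with w_i drawn from q, which only needs the two log densities. A self-contained sketch under that assumption (hypothetical helper names, not the library's code), compared against the analytic Gaussian-vs-Gaussian value:

```python
import numpy as np

def log_normal_pdf(w, mu, sigma):
    # Log density of N(mu, sigma^2) evaluated at w.
    return -0.5 * np.log(2.0 * np.pi) - np.log(sigma) - 0.5 * ((w - mu) / sigma) ** 2

def mc_kl(mu_q, sigma_q, mu_p, sigma_p, n_samples=200_000, seed=0):
    # KL(q || p) ~= (1/N) * sum_i [log q(w_i) - log p(w_i)], w_i sampled from q.
    # Works for any pair of distributions whose log densities can be evaluated.
    rng = np.random.default_rng(seed)
    w = mu_q + sigma_q * rng.standard_normal(n_samples)
    return float(np.mean(log_normal_pdf(w, mu_q, sigma_q) - log_normal_pdf(w, mu_p, sigma_p)))

def analytic_kl(mu_q, sigma_q, mu_p, sigma_p):
    # Closed-form KL between two univariate Gaussians, for comparison.
    return float(np.log(sigma_p / sigma_q)
                 + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2.0 * sigma_p ** 2) - 0.5)

estimate = mc_kl(0.3, 0.8, 0.0, 1.0)
exact = analytic_kl(0.3, 0.8, 0.0, 1.0)
```

With enough samples the estimate tracks the closed form closely, which is why such a fallback can support arbitrary user-defined priors and posteriors at the cost of some estimator variance.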
