Release 2.0.4 #510


Merged: 77 commits into main on Jun 17, 2025

Conversation

@stefanradev93 (Contributor) commented Jun 13, 2025

Summary

This PR introduces the next set of improvements and qualitative changes for the upcoming v2.0.4 release.


Major Changes

  • Added several diffusion model implementations
  • Rewrote and refactored all Approximator classes
  • Introduced new adapter transforms, including support for nested tensors (e.g., dictionaries of tensors) to better handle multimodal simulations
  • Added a new fusion network for multimodal amortized Bayesian inference (ABI)
  • Integrated standardization layers into each Approximator, eliminating the need for stateful adapters
    • Note: this paves the way for fully stateless adapters; all stateful adapter functionality will be removed in v2.0.5.
  • Enabled automatic standardization of point estimates
  • Enhanced support for tracking multiple loss terms
  • Enabled sampling of tensors with arbitrary shapes
  • Extended and improved tests
  • Added support for arbitrary data augmentation functions for datasets
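To illustrate the data-augmentation feature, a minimal sketch of what such a function can look like. The function name, batch keys ("theta", "x"), and signature are illustrative assumptions, not the BayesFlow API:

```python
import numpy as np

def noise_aug(batch, scale=0.05, rng=None):
    """Hypothetical augmentation callable: perturbs the "x" entry of a
    batch dictionary with Gaussian noise and returns a modified copy."""
    rng = rng or np.random.default_rng(0)
    out = dict(batch)
    out["x"] = batch["x"] + scale * rng.standard_normal(batch["x"].shape)
    return out

batch = {"theta": np.zeros((4, 2)), "x": np.ones((4, 8))}
augmented = noise_aug(batch)
```

A callable of this shape can, in principle, perturb only parts of the data (here "x") while leaving the rest (here "theta") untouched.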

Bug Fixes

  • Resolved serialization issues in edge cases and when using model comparison approximators
  • Fixed incorrect aggregation of validation losses
  • Corrected pairplot rendering issues

eodole and others added 30 commits May 6, 2025 13:30
* made initial backend functions for adapter subsetting; still need to write the squeeze function and link it to the front end

* added subsample functionality; to do: add it to the testing procedures

* made the take function and ran the linter

* changed name of subsampling function

* changed documentation, to be consistent with external notation, rather than internal shorthand

* small formatting change to documentation

* changed subsample to have sample size and axis in the constructor

* moved transforms in the adapter.py so they're in alphabetical order like the other transforms

* changed random_subsample to a MapTransform rather than a FilterTransform

* updated documentation with new naming convention

* added arguments of take to the constructor

* added feature to specify a percentage of the data to subsample rather than only integer input

* changed subsample in adapter.py to allow float as an input for the sample size

* renamed subsample_array and associated classes/functions to RandomSubsample and random_subsample respectively

* included TypeError to force users to only subsample one dataset at a time

* ran linter

* rerun formatter

* clean up random subsample transform and docs

* clean up take transform and docs

* nitpick clean-up

* skip shape check for subsampled adapter transform inverse

* fix serialization of new transforms

* skip randomly subsampled key in serialization consistency check

---------
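The commits above describe a subsample transform that accepts either an integer sample size or a float interpreted as a fraction. A minimal sketch of that idea (not the actual RandomSubsample implementation):

```python
import numpy as np

def random_subsample(x, sample_size, axis=0, rng=None):
    """Subsample one axis of an array without replacement. A float
    sample_size is treated as a fraction of the axis length."""
    rng = rng or np.random.default_rng(42)
    n = x.shape[axis]
    k = int(round(sample_size * n)) if isinstance(sample_size, float) else sample_size
    idx = rng.choice(n, size=k, replace=False)
    return np.take(x, idx, axis=axis)

data = np.arange(100).reshape(20, 5)
sub = random_subsample(data, 0.5, axis=0)  # keep half of the 20 rows
```

Note that such a transform has no exact inverse, which is why the commits above skip the shape check for the inverse and exclude the subsampled key from the serialization consistency check.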

Co-authored-by: LarsKue <lars@kuehmichel.de>
…ositiveDefinite link (#469)

* Refactor fill_triangular_matrix

* stable positive definite link, fix for #468

* Minor changes to docstring

* Remove self.built=True that prevented registering layer norm in build()

* np -> keras.ops
* Remove old rounds data set, add documentation, and augmentation options to data sets

* Enable augmentation to parts of the data or the whole data

* Improve doc

* Enable augmentations in workflow

* Fix silly type check and improve readability of for loop

* Bring back num_batches
Fixed Jacobian computation of standardize transform

y = (x - mu) / sigma
log p(x) = log p(y) - log(sigma)

The two lines were switched, leading to performance degradation.
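The change-of-variables identity behind the standardize transform can be sanity-checked numerically. A minimal check (standard probability, not the library's implementation): if y = (x - mu) / sigma and y is standard normal, then x is Normal(mu, sigma), and the densities relate via log p_x(x) = log p_y(y) - log(sigma).

```python
import numpy as np

def log_normal_pdf(z, mean=0.0, std=1.0):
    """Log-density of a univariate normal distribution."""
    return -0.5 * np.log(2 * np.pi * std**2) - 0.5 * ((z - mean) / std) ** 2

mu, sigma = 2.0, 3.0
x = 4.5
y = (x - mu) / sigma

lhs = log_normal_pdf(x, mu, sigma)       # log p_x(x) computed directly
rhs = log_normal_pdf(y) - np.log(sigma)  # via the change of variables
```

Both routes give the same log-density, so getting the sign or placement of the log(sigma) term wrong silently corrupts the likelihood.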
This commit contains the following changes (see PR #408 for discussions)

- DiffusionModel following the formalism in Kingma et al. (2023) [1]
- Stochastic sampler to solve SDEs
- Tests for the diffusion model

[1] https://arxiv.org/abs/2303.00848
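The stochastic sampler mentioned above solves an SDE to draw samples. As a generic sketch of the idea (a plain Euler-Maruyama scheme on a toy Ornstein-Uhlenbeck process, not BayesFlow's actual sampler):

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_grid, rng=None):
    """Basic Euler-Maruyama integrator for dx = drift(x, t) dt + diffusion(t) dW."""
    rng = rng or np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        dt = t1 - t0
        noise = rng.standard_normal(x.shape)
        x = x + drift(x, t0) * dt + diffusion(t0) * np.sqrt(dt) * noise
    return x

# Toy example: dx = -x dt + 0.1 dW, integrated from t=0 to t=1
samples = euler_maruyama(lambda x, t: -x, lambda t: 0.1,
                         x0=np.ones(1000), t_grid=np.linspace(0.0, 1.0, 100))
```

The stochastic noise term also illustrates why strict cycle consistency is hard to test for such samplers: discretization and noise errors accumulate across steps, as discussed in the review comments below.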

---------

Co-authored-by: arrjon <jonas.arruda@uni-bonn.de>
Co-authored-by: Jonas Arruda <69197639+arrjon@users.noreply.github.com>
Co-authored-by: LarsKue <lars@kuehmichel.de>
- From the table in the `bayesflow.networks` module overview, one cannot
  tell which network belongs to which group. This commit adds short
  labels to indicate inference networks (IN) and summary networks (SN)
…simulators (#452)

Adds option to drop, fill or error when different keys are encountered in the outputs of different simulators. Fixes #441.
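A minimal sketch of the drop/fill/error idea for merging simulator outputs with differing keys (the function and strategy names are illustrative, not the exact bayesflow API):

```python
import numpy as np

def merge_outputs(dicts, strategy="error", fill_value=np.nan):
    """Align a list of output dictionaries with possibly differing keys."""
    all_keys = set().union(*(d.keys() for d in dicts))
    shared = set.intersection(*(set(d.keys()) for d in dicts))
    if strategy == "error" and shared != all_keys:
        raise KeyError(f"Mismatched keys: {all_keys - shared}")
    # "drop" keeps only keys present everywhere; "fill" pads missing ones
    keys = shared if strategy == "drop" else all_keys
    return [{k: d.get(k, fill_value) for k in keys} for d in dicts]

a = {"x": 1.0, "y": 2.0}
b = {"x": 3.0}
merged = merge_outputs([a, b], strategy="drop")  # keeps only "x"
```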

---------

Co-authored-by: Valentin Pratz <git@valentinpratz.de>
* Add classes and transforms to simplify multimodal training

- Add class `MultimodalSummaryNetwork` to combine multiple summary
  networks, each for one modality.
- Add transforms `Group` and `Ungroup`, to gather the multimodal inputs
  in one variable (usually "summary_variables")
- Add tests for new behavior

* [no ci] add tutorial notebook for multimodal data

* [no ci] add missing training argument

* rename MultimodalSummaryNetwork to FusionNetwork

* [no ci] clarify that the network implements late fusion
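The late-fusion idea behind the FusionNetwork can be sketched in plain NumPy: each modality is summarized by its own function and the summaries are concatenated. The names below are illustrative; the actual FusionNetwork composes Keras summary networks:

```python
import numpy as np

def fuse_summaries(data, summary_fns):
    """Late fusion: summarize each modality separately, then concatenate."""
    parts = [summary_fns[name](x) for name, x in data.items()]
    return np.concatenate(parts, axis=-1)

data = {
    "time_series": np.random.default_rng(1).normal(size=(8, 50)),
    "static": np.random.default_rng(2).normal(size=(8, 3)),
}
summary_fns = {
    "time_series": lambda x: np.stack([x.mean(-1), x.std(-1)], axis=-1),
    "static": lambda x: x,  # pass static features through unchanged
}
fused = fuse_summaries(data, summary_fns)  # shape (8, 2 + 3)
```

The Group/Ungroup transforms mentioned above serve exactly this pattern: they gather the per-modality inputs into one variable (usually "summary_variables") so the fusion network receives them together.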
Very basic transform, just the inverse of expand_dims
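In NumPy terms, the relationship between the two operations is:

```python
import numpy as np

x = np.zeros((3, 4))
expanded = np.expand_dims(x, axis=1)     # shape (3, 1, 4)
squeezed = np.squeeze(expanded, axis=1)  # back to shape (3, 4)
```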
…erialization in keras (#493)

* add custom sequential to fix #491

* revert using Sequential in classifier_two_sample_test.py

* Add docstring to custom Sequential

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* fix copilot docstring

* remove mlp override methods

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Make docs optional dependencies compatible with python 3.10
* Add NNPE adapter

* Add NNPE adapter tests

* Only apply NNPE during training

* Integrate stage differentiation into tests

* Improve test coverage

* Fix inverse and add to tests

* Adjust class name and add docstring to forward method

* Enable compatibility with #486 by adjusting scales automatically

* Add dimensionwise noise application

* Update exception handling

* Fix tests
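The commits above emphasize that the NNPE adapter applies noise only during training. A minimal sketch of that stage-dependent behavior (illustrative only; the actual adapter is more elaborate, e.g. automatic scale adjustment and dimensionwise noise, per the commits above):

```python
import numpy as np

def nnpe_forward(x, stage, scale=0.1, rng=None):
    """Inject noise during training only; pass data through otherwise."""
    if stage != "training":
        return x
    rng = rng or np.random.default_rng(0)
    return x + scale * rng.standard_normal(x.shape)

x = np.ones((5, 3))
inference_out = nnpe_forward(x, stage="inference")  # unchanged
training_out = nnpe_forward(x, stage="training")    # perturbed
```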
@vpratz (Collaborator) commented Jun 14, 2025

I will do some local experiments. The trained diffusion model is performing quite well. The strange thing is that the test only fails with the JAX backend.

@stefanradev93 I cannot reproduce the failure locally, can you? I think skipping this specific test for the diffusion models (e.g. via a marker) would be acceptable, as we do not expect cycle consistency to strictly hold for the number of steps we want to put in here, and the errors can accumulate in more or less arbitrary ways. We might want to use (at least somewhat) trained models in the future, where we can expect more consistent results.

@LarsKue (Contributor) commented Jun 14, 2025

@vpratz I added a review to #503. I think we should not include it here, yet.

@stefanradev93 (Contributor, Author) commented Jun 14, 2025

> @stefanradev93 I cannot reproduce the failure locally, can you? I think skipping this specific test for the diffusion models (e.g. via a marker) would be acceptable, as we do not expect cycle consistency to strictly hold for the number of steps we want to put in here, and the errors can accumulate in more or less arbitrary ways. We might want to use (at least somewhat) trained models in the future, where we can expect more consistent results.

The error is stochastic and I think you are right. We should not rely on this test for now. It is still a bit strange that it's only the case for JAX.

@LarsKue (Contributor) left a review

I left a detailed review. See individual comments.

@stefanradev93 stefanradev93 requested a review from LarsKue June 17, 2025 07:19
@LarsKue (Contributor) commented Jun 17, 2025

Thanks everyone for the massive work in contributing these features, and for cleaning up the parts I commented on so quickly!

Rename approximator.summaries to summarize with deprecation
@stefanradev93 stefanradev93 merged commit 0cdd16a into main Jun 17, 2025
16 checks passed
10 participants