Releases: meta-pytorch/botorch
API updates, more robust model fitting
Breaking changes
- Rename `botorch.qmc` to `botorch.sampling`, and move MC samplers from `acquisition.sampler` to `botorch.sampling.samplers` (#172)
New Features
- Add `condition_on_observations` and `fantasize` to the Model-level API (#173)
- Support pending observations generically for all `MCAcquisitionFunction`s (#176)
- Add fidelity kernel for training iterations/training data points (#178)
- Support optimization constraints across q-batches (to support things like sample budget constraints) (2a95a6c)
- Add ModelList <-> batched model converter (#187)
- New test functions
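To make the q-batch constraint feature concrete, here is a minimal NumPy sketch (not BoTorch's actual API) of a constraint that couples all q candidates in a batch, such as a total sample-budget limit; the function name, cost feature, and budget value are all hypothetical.

```python
import numpy as np

def budget_constraint(X, cost_index=-1, budget=10.0):
    """Hypothetical cross-q-batch constraint (illustrative only).

    X has shape (q, d). Returns the slack (>= 0 means feasible).
    The constraint sums a cost feature over the whole q-batch, so it
    cannot be expressed per-candidate -- it must see all q points at once.
    """
    return budget - X[:, cost_index].sum()

# A q-batch of 3 candidates, each with a trailing "cost" feature.
X = np.array([[0.2, 3.0],
              [0.5, 4.0],
              [0.9, 2.0]])
slack = budget_constraint(X, budget=10.0)  # 10 - (3 + 4 + 2) = 1.0
print(slack >= 0)  # feasible
```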
Improved functionality
- More robust model fitting
- Introduce optional batch limit in `joint_optimize` to increase scalability of parallel optimization (baab578)
- Change constructor of `ModelListGP` to comply with GPyTorch's `IndependentModelList` constructor (a6cf739)
- Use `torch.random` to set the default seed for samplers (rather than `random`) to make sampling reproducible when setting `torch.manual_seed` (ae507ad)
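A quick sketch of the reproducibility this change enables (plain PyTorch, not BoTorch's sampler code): because default seeds now come from `torch.random` rather than Python's `random` module, a single `torch.manual_seed` call makes subsequent draws repeatable.

```python
import torch

# Seed once, draw.
torch.manual_seed(1234)
draw_a = torch.rand(4)

# Re-seed with the same value: the draw is identical.
torch.manual_seed(1234)
draw_b = torch.rand(4)

print(torch.equal(draw_a, draw_b))  # True
```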
Performance Improvements
- Use `einsum` in `LinearMCObjective` (22ca295)
- Change the default Sobol sample size for `MCAcquisitionFunction`s to be a power of 2 for better MC integration performance (5d8e818)
- Add the ability to fit models in `SumMarginalLogLikelihood` sequentially (and make that the default setting) (#183)
- Do not construct the full covariance matrix when computing the posterior of a single-output `BatchedMultiOutputGPyTorchModel` (#185)
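As a rough illustration of the `einsum` item above, here is what a linear MC objective computes, sketched in NumPy rather than BoTorch: a weighted sum over the output dimension of posterior samples, expressed as a single `einsum` contraction (the function name and shapes here are illustrative, not BoTorch's implementation).

```python
import numpy as np

def linear_mc_objective(samples, weights):
    """samples: (..., q, o) MC samples; weights: (o,). Returns (..., q)."""
    # Contract the trailing output dimension against the weight vector.
    return np.einsum("...o,o->...", samples, weights)

samples = np.random.rand(128, 5, 2)   # 128 MC samples, q=5, 2 outputs
weights = np.array([0.7, 0.3])
obj = linear_mc_objective(samples, weights)
print(obj.shape)  # (128, 5)
```

The einsum form avoids an explicit matmul followed by a squeeze and works for arbitrary leading batch shapes.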
Bug fixes
- Properly handle the `observation_noise` kwarg for `BatchedMultiOutputGPyTorchModel`s (#182)
- Fix an issue where `f_best` was always the max for `NoisyExpectedImprovement` (410de58)
- Fix a bug and numerical issues in `initialize_q_batch` (844dcd1)
- Fix numerical issues with `inv_transform` for qMC sampling (#162)
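For context on the `inv_transform` fix, here is an illustrative sketch (using SciPy, not BoTorch's code) of inverse-transform qMC sampling: low-discrepancy uniform points are mapped to Gaussian samples via the inverse CDF. Naively, points at exactly 0 or 1 map to -inf/+inf, which is the kind of numerical issue such code must guard against, e.g. by clamping; the clamping bounds below are one plausible choice, not necessarily what BoTorch uses.

```python
import numpy as np
from scipy import stats

# Draw quasi-random uniform points from a scrambled Sobol sequence.
sobol = stats.qmc.Sobol(d=1, scramble=True, seed=0)
u = sobol.random(n=8).ravel()

# Clamp away from the endpoints before applying the inverse normal CDF,
# since norm.ppf(0) = -inf and norm.ppf(1) = +inf.
u_safe = np.clip(u, np.finfo(float).tiny, 1 - np.finfo(float).epsneg)
z = stats.norm.ppf(u_safe)  # standard normal qMC samples

print(np.all(np.isfinite(z)))  # True
```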
Other
- Bump GPyTorch minimum requirement to 0.3.3
Initial beta release
First public release.