Releases: meta-pytorch/botorch
Maintenance Release, Updated Community Contributions
New Features
- Introduce updated guidelines and a new directory for community contributions (#2167).
- Add `qEUBO` preferential acquisition function (#2192).
- Add Multi Information Source Augmented GP (#2152).
Bug Fixes
- Fix `condition_on_observations` in fully Bayesian models (#2151).
- Fix a bug that occurs when splitting single-element bins; use the default BoTorch kernel for BAxUS (#2165).
- Fix a bug when non-linear constraints are used with `q > 1` (#2168).
- Remove unsupported `X_pending` from the `qMultiFidelityLowerBoundMaxValueEntropy` constructor (#2193).
- Don't allow `data_fidelities=[]` in `SingleTaskMultiFidelityGP` (#2195).
- Fix `EHVI`, `qEHVI`, and `qLogEHVI` input constructors (#2196).
- Fix input constructor for `qMultiFidelityMaxValueEntropy` (#2198).
- Add ability to not deduplicate points in `_is_non_dominated_loop` (#2203).
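The deduplication toggle above can be illustrated with a minimal 2D non-dominated filter. This is a pure-Python sketch, not BoTorch's `_is_non_dominated_loop`; the function name and signature here are hypothetical:

```python
def non_dominated(points, deduplicate=True):
    """Return the non-dominated (Pareto) points under maximization.

    With deduplicate=False, repeated copies of a Pareto-optimal point
    are all kept rather than collapsed to one.
    """
    def dominates(p, q):
        # p dominates q if it is at least as good everywhere and
        # strictly better somewhere.
        return all(a >= b for a, b in zip(p, q)) and any(a > b for a, b in zip(p, q))

    kept = []
    for p in points:
        if any(dominates(q, p) for q in points):
            continue  # p is dominated by some other point
        if deduplicate and p in kept:
            continue  # drop exact duplicates when requested
        kept.append(p)
    return kept
```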
Other Changes
- Minor improvements to `MVaR` risk measure (#2150).
- Add support for multitask models to `ModelListGP` (#2154).
- Support unspecified noise in `ContextualDataset` (#2155).
- Update `HVKG` sampler to reflect the number of model outputs (#2160).
- Release the restriction in `OneHotToNumeric` that the categoricals are the trailing dimensions (#2166).
- Standardize broadcasting logic of `q(Log)EI`'s `best_f` and `compute_best_feasible_objective` (#2171).
- Use regular inheritance instead of dispatcher to special-case `PairwiseGP` logic (#2176).
- Support `PBO` in `EUBO`'s input constructor (#2178).
- Add `posterior_transform` to `qMaxValueEntropySearch`'s input constructor (#2181).
- Do not normalize or standardize a dimension if all values are equal (#2185).
- Reap deprecated support for objective with 1 arg in `GenericMCObjective` (#2199).
- Consistent signature for `get_objective_weights_transform` (#2200).
- Update context order handling in `ContextualDataset` (#2205).
- Update contextual models for use in MBM (#2206).
- Remove `(Identity)AnalyticMultiOutputObjective` (#2208).
- Reap deprecated support for `soft_eval_constraint` (#2223). Please use `botorch.utils.sigmoid` instead.
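The guard behind #2185 ("do not normalize a dimension if all values are equal") can be sketched in a few lines. This is illustrative only; `normalize_column` is a hypothetical helper, not BoTorch's `Normalize` transform, which operates on tensors:

```python
def normalize_column(col):
    """Min-max normalize one dimension to [0, 1], skipping degenerate ones."""
    lo, hi = min(col), max(col)
    if hi == lo:
        # All values equal: scaling would divide by zero, so map to a
        # constant instead of normalizing (the spirit of #2185).
        return [0.5 for _ in col]
    return [(v - lo) / (hi - lo) for v in col]
```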
Compatibility
- Pin `mpmath <= 1.3.0` to avoid CI breakages due to removed modules in the latest alpha release (#2222).
Hypervolume Knowledge Gradient (HVKG)
New features
Hypervolume Knowledge Gradient (HVKG):
- Add `qHypervolumeKnowledgeGradient`, which seeks to maximize the difference in hypervolume of the hypervolume-maximizing set of a fixed size after conditioning on the unknown observation(s) that would be received if `X` were evaluated (#1950, #1982, #2101).
- Add a tutorial on decoupled Multi-Objective Bayesian Optimization (MOBO) with HVKG (#2094).
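The hypervolume that HVKG reasons about can be made concrete with a tiny 2D example. This is a pure-Python sketch under the assumption of maximization with a reference point dominated by all Pareto points; it is not BoTorch's hypervolume code:

```python
def hypervolume_2d(pareto_pts, ref):
    """Area dominated by a 2D Pareto set (maximization) above `ref`."""
    # Sweep points by decreasing first objective; each point adds the
    # rectangle between its own y and the previously accumulated y.
    pts = sorted(pareto_pts, key=lambda p: p[0], reverse=True)
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        hv += (x - ref[0]) * (y - prev_y)
        prev_y = y
    return hv
```

HVKG scores a candidate by the expected increase in this quantity after conditioning the model on the candidate's as-yet-unknown outcome.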
Other new features:
- Add `MultiOutputFixedCostModel`, which is useful for decoupled scenarios where the objectives have different costs (#2093).
- Enable `q > 1` in acquisition function optimization when nonlinear constraints are present (#1793).
- Support different noise levels for different outputs in test functions (#2136).
Bug fixes
- Fix fantasization with a `FixedNoiseGaussianLikelihood` when `noise` is known and `X` is empty (#2090).
- Make `LearnedObjective` compatible with constraints in acquisition functions regardless of `sample_shape` (#2111).
- Make input constructors for `qExpectedImprovement`, `qLogExpectedImprovement`, and `qProbabilityOfImprovement` compatible with `LearnedObjective` regardless of `sample_shape` (#2115).
- Fix handling of constraints in `qSimpleRegret` (#2141).
Other changes
- Increase default sample size for `LearnedObjective` (#2095).
- Allow passing in `X` with or without fidelity dimensions in `project_to_target_fidelity` (#2102).
- Use full-rank task covariance matrix by default in SAAS MTGP (#2104).
- Rename `FullyBayesianPosterior` to `GaussianMixturePosterior`; add `_is_ensemble` and `_is_fully_bayesian` attributes to `Model` (#2108).
- Various improvements to tutorials, including speedups, improved explanations, and compatibility with newer versions of libraries.
Bugfix release
Compatibility
- Re-establish compatibility with PyTorch 1.13.1 (#2083).
Multi-Objective "Log" acquisition functions
Highlights
- Additional "Log" acquisition functions for multi-objective optimization with better numerical behavior, which often leads to significantly improved BO performance over their non-"Log" counterparts.
- `FixedNoiseGP` and `FixedNoiseMultiFidelityGP` have been deprecated; their functionality has been merged into `SingleTaskGP` and `SingleTaskMultiFidelityGP`, respectively (#2052, #2053).
- Removed deprecated legacy model fitting functions: `numpy_converter`, `fit_gpytorch_scipy`, `fit_gpytorch_torch`, `_get_extra_mll_args` (#1995, #2050).
New Features
- Support multiple data fidelity dimensions in `SingleTaskMultiFidelityGP` and (deprecated) `FixedNoiseMultiFidelityGP` models (#1956).
- Add `logsumexp` and `fatmax` to handle infinities and control asymptotic behavior in "Log" acquisition functions (#1999).
- Add outcome and feature names to datasets; implement `MultiTaskDataset` (#2015, #2019).
- Add constrained Hartmann and constrained Gramacy synthetic test problems (#2022, #2026, #2027).
- Support observed noise in `MixedSingleTaskGP` (#2054).
- Add `PosteriorStandardDeviation` acquisition function (#2060).
Bug fixes
- Fix input constructors for `qMaxValueEntropy` and `qMultiFidelityKnowledgeGradient` (#1989).
- Fix precision issue that arises from inconsistent data types in `LearnedObjective` (#2006).
- Fix fantasization with `FixedNoiseGP` and outcome transforms and use `FantasizeMixin` (#2011).
- Fix `LearnedObjective` base sample shape (#2021).
- Apply constraints in `prune_inferior_points` (#2069).
- Support non-batch evaluation of `PenalizedMCObjective` (#2073).
- Fix `Dataset` equality checks (#2077).
Other changes
- Don't allow unused `**kwargs` in input_constructors except for a defined set of exceptions (#1872, #1985).
- Merge inferred and fixed noise LCE-M models (#1993).
- Fix import structure in `botorch.acquisition.utils` (#1986).
- Remove deprecated functionality: `weights` argument of `RiskMeasureMCObjective` and `squeeze_last_dim` (#1994).
- Make `X`, `Y`, `Yvar` into properties in datasets (#2004).
- Make synthetic constrained test functions subclass from `SyntheticTestFunction` (#2029).
- Add `construct_inputs` to contextual GP models `LCEAGP` and `SACGP` (#2057).
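The property change in #2004 follows the standard read-only property pattern; a minimal sketch (the class below is illustrative, not BoTorch's dataset classes):

```python
class MiniDataset:
    """Toy dataset exposing stored data through read-only properties."""

    def __init__(self, X, Y):
        # Store under private names; access goes through the properties.
        self._X, self._Y = X, Y

    @property
    def X(self):
        return self._X

    @property
    def Y(self):
        return self._Y
```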
Bug fix release
This release fixes bugs that affected Ax's modular `BotorchModel` and caused outcome constraints to be silently ignored due to naming mismatches.
Bug fixes
- Hot fix (#1973) for a few issues:
  - A naming mismatch between Ax's modular `BotorchModel` and BoTorch's acquisition input constructors, leading to outcome constraints in Ax not being used with single-objective acquisition functions in Ax's modular `BotorchModel`. The naming has been updated in Ax, and consistent naming is now used in input constructors for single- and multi-objective acquisition functions in BoTorch.
  - A naming mismatch in the acquisition input constructor `constraints` in `qLogNoisyExpectedImprovement`, which kept constraints from being used.
  - A bug in `compute_best_feasible_objective` that could lead to `-inf` incumbent values.
- Fix setting seed in `get_polytope_samples` (#1968).
Other changes
Dependency fix release
This is a very minor release; the only change from v0.9.0 is that the linear_operator dependency was bumped to 0.5.1 (#1963). This was needed since a bug in linear_operator 0.5.0 caused failures with some BoTorch models.
LogEI acquisition functions, L0 regularization & homotopy optimization, PiBO, orthogonal additive kernel, nonlinear constraints
Compatibility
- Require Python >= 3.9.0 (#1924).
- Require PyTorch >= 1.13.1 (#1960).
- Require linear_operator == 0.5.0 (#1961).
- Require GPyTorch == 1.11 (#1961).
Highlights
- Introduce `OrthogonalAdditiveKernel` (#1869).
- Speed up LCE-A kernel by over an order of magnitude (#1910).
- Introduce `optimize_acqf_homotopy`, for optimizing acquisition functions with homotopy (#1915).
- Introduce `PriorGuidedAcquisitionFunction` (PiBO) (#1920).
- Introduce `qLogExpectedImprovement`, which provides more accurate numerics than `qExpectedImprovement` and can lead to significant optimization improvements (#1936).
- Similarly, introduce `qLogNoisyExpectedImprovement`, which is analogous to `qNoisyExpectedImprovement` (#1937).
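Why the "Log" variants help numerically can be seen with a stdlib-only sketch. This is the standard logsumexp trick, not BoTorch's implementation:

```python
import math

def logsumexp(vals):
    """Numerically stable log(sum(exp(v) for v in vals)) via the max-shift trick."""
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

# Log-improvements this negative underflow to 0.0 in plain space, so a
# plain Monte Carlo EI estimate (and its gradient) is exactly zero and
# gives the optimizer no signal...
log_imp = [-800.0, -805.0, -810.0]
plain = sum(math.exp(v) for v in log_imp) / len(log_imp)

# ...while the log-space estimate stays finite and informative.
stable = logsumexp(log_imp) - math.log(len(log_imp))
```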
New Features
- Add constrained synthetic test functions `PressureVesselDesign`, `WeldedBeam`, `SpeedReducer`, and `TensionCompressionString` (#1832).
- Support decoupled fantasization (#1853) and decoupled evaluations in cost-aware utilities (#1949).
- Add `PairwiseBayesianActiveLearningByDisagreement`, an active learning acquisition function for PBO and BOPE (#1855).
- Support custom mean and likelihood in `MultiTaskGP` (#1909).
- Enable candidate generation (via `optimize_acqf`) with both `non_linear_constraints` and `fixed_features` (#1912).
- Introduce `L0PenaltyApproxObjective` to support L0 regularization (#1916).
- Enable batching in `PriorGuidedAcquisitionFunction` (#1925).
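The idea behind an approximate L0 penalty can be sketched with a Gaussian-kernel relaxation. This form is an assumption for illustration; BoTorch's `L0PenaltyApproxObjective` may differ in detail and operates on tensors:

```python
import math

def l0_approx(x, a=0.01):
    """Smooth surrogate for the number of non-zero entries of x.

    Each coordinate contributes ~0 when x_i == 0 and ~1 when |x_i| >> a,
    so this differentiable sum approximates the (non-differentiable)
    L0 "norm" and can be used as a sparsity-inducing penalty.
    """
    return sum(1.0 - math.exp(-((xi / a) ** 2)) for xi in x)
```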
Other changes
- Deprecate `FixedNoiseMultiTaskGP`; allow `train_Yvar` optionally in `MultiTaskGP` (#1818).
- Implement `load_state_dict` for SAAS multi-task GP (#1825).
- Improvements to `LinearEllipticalSliceSampler` (#1859, #1878, #1879, #1883).
- Allow passing in task features as part of `X` in `MTGP.posterior` (#1868).
- Improve numerical stability of log densities in pairwise GPs (#1919).
- Python 3.11 compliance (#1927).
- Enable using constraints with `SampleReducingMCAcquisitionFunction`s when using `input_constructor`s and `get_acquisition_function` (#1932).
- Enable use of `qLogExpectedImprovement` and `qLogNoisyExpectedImprovement` with Ax (#1941).
Bug Fixes
- Enable pathwise sampling modules to be converted to GPU (#1821).
- Allow `Standardize` modules to be loaded once trained (#1874).
- Fix memory leak in Inducing Point Allocators (#1890).
- Correct einsum computation in `LCEAKernel` (#1918).
- Properly whiten bounds in MVNXPB (#1933).
- Make `FixedFeatureAcquisitionFunction` convert floats to double-precision tensors rather than single-precision (#1944).
- Fix memory leak in `FullyBayesianPosterior` (#1951).
- Make `AnalyticExpectedUtilityOfBestOption` input constructor work correctly with multi-task GPs (#1955).
Maintenance Release
Compatibility
- Require GPyTorch == 1.10 and linear_operator == 0.4.0 (#1803).
New Features
- Polytope sampling for linear constraints along the q-dimension (#1757).
- Single-objective joint entropy search with additional conditioning, various improvements to entropy-based acquisition functions (#1738).
Other changes
- Various updates to improve numerical stability of
PairwiseGP(#1754, #1755). - Change batch range for
FullyBayesianPosterior(1176a38, #1773). - Make
gen_batch_initial_conditionsmore flexible (#1779). - Deprecate
objectivein favor ofposterior_transformforMultiObjectiveAnalyticAcquisitionFunction(#1781). - Use
prune_baseline=Trueas default forqNoisyExpectedImprovement(#1796). - Add
batch_shapeproperty toSingleTaskVariationalGP(#1799). - Change minimum inferred noise level for
SaasFullyBayesianSingleTaskGP(#1800).
Bug fixes
Maintenance Release
New Features
- Add BAxUS tutorial (#1559).
Other changes
- Various improvements to tutorials (#1703, #1706, #1707, #1708, #1710, #1711, #1718, #1719, #1739, #1740, #1742).
- Allow tensor input for `integer_indices` in `Round` transform (#1709).
- Expose `cache_root` in qNEHVI input constructor (#1730).
- Add `get_init_args` helper to `Normalize` & `Round` transforms (#1731).
- Allow custom dimensionality and improved gradient stability in `ModifiedFixedSingleSampleModel` (#1732).