pytorch / botorch

Bayesian optimization in PyTorch
https://botorch.org/
MIT License

Expose all kwargs to optimize_acqf in optimize_acqf_homotopy #2580

Closed CompRhys closed 1 month ago

CompRhys commented 1 month ago

Motivation

This PR seeks to address https://github.com/pytorch/botorch/issues/2579

Have you read the Contributing Guidelines on pull requests?

Yes

Test Plan

Added test incorporating a linear inequality constraint

Related PRs

If this is merged as-is with breaking changes, a follow-on PR will be needed in Ax.

codecov[bot] commented 1 month ago

Codecov Report

Attention: Patch coverage is 67.85714% with 9 lines in your changes missing coverage. Please review.

Project coverage is 99.93%. Comparing base (d2c6f5e) to head (85805d9). Report is 1 commit behind head on main.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| botorch/optim/optimize_homotopy.py | 67.85% | 9 Missing :warning: |
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #2580      +/-   ##
==========================================
- Coverage   99.98%   99.93%   -0.06%
==========================================
  Files         196      196
  Lines       17333    17357      +24
==========================================
+ Hits        17331    17346      +15
- Misses          2       11       +9
```


CompRhys commented 1 month ago

I am really loath to contribute a PR that increases the maintenance burden going forward for the sake of documentation/type hinting. This method is missing these constraint options, and has the apparent fixed_features bug, precisely because hard-coding the args to the function it wraps leaves the surface area for such issues to exist. There will 100% be divergences again when more functionality is added to the base method.
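To illustrate the divergence problem being described, here is a minimal sketch of the two wrapper styles. The function and parameter names are hypothetical stand-ins, not BoTorch's actual signatures; the point is only that a hard-coded wrapper silently drops any option it does not re-declare, while a `**kwargs` pass-through tracks the base method automatically.

```python
def base_method(x, constraints=None, fixed_features=None):
    """Stand-in for the wrapped optimizer; names are hypothetical."""
    return {"x": x, "constraints": constraints, "fixed_features": fixed_features}

# Hard-coded wrapper: it only forwards the args it knows about, so any
# option added to base_method later (here, fixed_features) is unreachable
# through the wrapper until someone remembers to re-declare it.
def wrapper_hardcoded(x, constraints=None):
    return base_method(x, constraints=constraints)

# Pass-through wrapper: forwards everything, so new base_method options
# work without any change to the wrapper.
def wrapper_passthrough(x, **kwargs):
    return base_method(x, **kwargs)
```

With the pass-through style, `wrapper_passthrough(x, fixed_features=...)` just works, whereas the hard-coded wrapper raises a `TypeError` for the same call.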

However, pragmatically, since I would like this functionality: if you get back to me with decisions on (a) the need for final_options (I think removing it is most consistent with the arg unification you propose, since it is not used here or in Ax) and (b) whether the apparent issues with fixed_features are indeed bugs or were intended (I can't see how they could be intended, as the test would be flaky if the test optimization were harder), then I can implement it the other way.

The test failures all say that 2.000 is not >= 2.0, which I can debug after rewriting the wrapper function.
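A failure message like "2.000 is not >= 2.0" usually means the value is a hair below the bound but gets rounded up in the printed representation. This is speculation about the cause, not a diagnosis of the actual test, but a small sketch shows how the misleading message can arise and how a tolerance-based check sidesteps it:

```python
import math

# A value fractionally below 2.0 that still *prints* as 2.000
# at three decimal places, producing a confusing assertion message.
val = 2.0 - 1e-9
formatted = f"{val:.3f}"  # "2.000", despite val < 2.0

# The strict comparison fails even though the display suggests equality.
strict_ok = val >= 2.0  # False

# A tolerance-based comparison treats the value as meeting the bound.
tolerant_ok = val >= 2.0 or math.isclose(val, 2.0, abs_tol=1e-6)  # True
```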

CompRhys commented 1 month ago

Closed in favour of https://github.com/pytorch/botorch/pull/2588