pytorch / botorch

Bayesian optimization in PyTorch
https://botorch.org/
MIT License

Fix hard-coded `double` precision in test_functions to default dtype #2597

Closed: AVHopp closed this pull request 3 weeks ago

AVHopp commented 4 weeks ago

Motivation

This PR replaces the hard-coded double precision used in the initialization in `test_functions/base.py` with `torch.get_default_dtype()`.

See #2596 for more details.
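
For context, a minimal sketch of the kind of change involved, using an illustrative stand-in class and bounds rather than the actual botorch code:

```python
import torch


class ToyTestFunction:
    # Illustrative stand-in for the base class in test_functions/base.py;
    # the actual botorch code differs.
    _bounds = [(-5.0, 5.0), (-5.0, 5.0)]

    def __init__(self) -> None:
        # Before this PR: the dtype was hard-coded, e.g.
        #   self.bounds = torch.tensor(self._bounds, dtype=torch.double).T
        # With this PR: respect the globally configured default dtype.
        self.bounds = torch.tensor(self._bounds, dtype=torch.get_default_dtype()).T


# With the fix, tensors follow the user's default dtype.
torch.set_default_dtype(torch.float32)
assert ToyTestFunction().bounds.dtype == torch.float32
```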

Have you read the Contributing Guidelines on pull requests?

Yes, I have read them.

Test Plan

I ran code formatting via `ufmt` and checked the code via `pytest -ra`. All tests related to the changes here were adjusted in the second commit of the branch. Locally, two tests failed for me, but they do not appear to be related to the fix implemented here. If it turns out they are, I'd be more than happy to adjust further.
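
For illustration, a hypothetical test along these lines (not the actual botorch test suite) would exercise the dtype behavior:

```python
import pytest
import torch


class ToyTestFunction:
    # Hypothetical stand-in for a botorch test function.
    _bounds = [(-5.0, 5.0), (-5.0, 5.0)]

    def __init__(self) -> None:
        self.bounds = torch.tensor(self._bounds, dtype=torch.get_default_dtype()).T


@pytest.mark.parametrize("dtype", [torch.float32, torch.float64])
def test_bounds_follow_default_dtype(dtype):
    # Temporarily switch the global default dtype, then restore it.
    prev = torch.get_default_dtype()
    torch.set_default_dtype(dtype)
    try:
        assert ToyTestFunction().bounds.dtype == dtype
    finally:
        torch.set_default_dtype(prev)
```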

Related PRs

None, but #2596 is related.

facebook-github-bot commented 4 weeks ago

Hi @AVHopp!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient, and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!

AVHopp commented 4 weeks ago

I am aware that I still need to sign the CLA and will do so soon.

codecov[bot] commented 4 weeks ago

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.98%. Comparing base (9d37e90) to head (6ddd51f). Report is 6 commits behind head on main.

Additional details and impacted files

```diff
@@           Coverage Diff           @@
##             main   #2597   +/-   ##
=======================================
  Coverage   99.98%   99.98%
=======================================
  Files         196      196
  Lines       17367    17367
=======================================
  Hits        17365    17365
  Misses          2        2
```


AVHopp commented 4 weeks ago

I have signed the CLA, but it seems the check was not retriggered. Does this just take some more time, or did something go wrong with my signing?

Balandat commented 4 weeks ago

> I have signed the CLA, but it seems the check was not retriggered. Does this just take some more time, or did something go wrong with my signing?

No need to re-test. I can now import the PR and land it even if GitHub says that check failed.

facebook-github-bot commented 4 weeks ago

@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

facebook-github-bot commented 4 weeks ago

Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!

AVHopp commented 3 weeks ago

@Balandat @esantorella Pinging both of you, since we already started discussions on #2596.

TL;DR: Changed the implementation to use an explicit `dtype` argument in `__init__`. Looking forward to feedback. Also: how should I deal with the branch history?

As proposed there, I have re-implemented this so that there is an explicit `dtype` argument in all of the test functions. I simply added it as an argument to every explicit `__init__` function, always defaulting it to `torch.double`. Not sure if there is a more elegant way; if so, feel free to point me to it :)
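
A minimal sketch of that pattern, again with an illustrative class and bounds rather than the exact botorch signatures:

```python
import torch


class ToyTestFunction:
    # Illustrative only; the real botorch test functions take more arguments.
    _bounds = [(-5.0, 5.0), (-5.0, 5.0)]

    def __init__(self, dtype: torch.dtype = torch.double) -> None:
        # dtype is now an explicit argument, defaulting to torch.double
        # so that the previous behavior is preserved.
        self.bounds = torch.tensor(self._bounds, dtype=dtype).T


# Callers can now opt into single precision explicitly.
f = ToyTestFunction(dtype=torch.float32)
assert f.bounds.dtype == torch.float32
```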

For easier reviewing, each commit touches just a single file. I'd be more than happy to rebase/squash the commits at a later point, but since I could not find any guidelines on whether that is done here, I thought I'd keep a clean but "true" history for now 😄

I tested the code again locally using `pytest`, and also in my own code where I originally observed the problem; everything seems to work.

Looking forward to your feedback/comments :)

Balandat commented 3 weeks ago

> For easier reviewing, each commit touches just a single file. I'd be more than happy to rebase/squash the commits at a later point, but since I could not find any guidelines on whether that is done here, I thought I'd keep a clean but "true" history for now 😄

I think it's fine to have these as one commit. But it doesn't really matter for the branch history here: the way we merge changes into the main branch in botorch, our GitHub bot will auto-squash everything from this PR into a single commit.

facebook-github-bot commented 3 weeks ago

@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

Balandat commented 3 weeks ago

Thanks!

facebook-github-bot commented 3 weeks ago

@Balandat merged this pull request in pytorch/botorch@10f634201c917f2010be80b6d18003075198826c.