Closed AVHopp closed 3 weeks ago
Hi @AVHopp!
Thank you for your pull request and welcome to our community.
In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.
In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.
Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.
If you have received this in error or have any questions, please contact us at cla@meta.com. Thanks!
I am aware that I still need to sign the CLA and will do so soon.
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 99.98%. Comparing base (9d37e90) to head (6ddd51f). Report is 6 commits behind head on main.
:umbrella: View full report in Codecov by Sentry.
I have signed the CLA, but it seems the test was not retriggered. Does this just take some more time, or did something go wrong with my signing?
No need to re-test. I can now import the PR and land it even if GitHub says that check failed.
@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!
@Balandat @esantorella Pinging both of you since we already started discussions on #2596.
TL;DR: Changed to use a dtype argument in __init__. Looking forward to feedback; how should we deal with the branch history?
As proposed there, I have re-implemented this so that there is an explicit dtype argument in all of the test functions. I added it as an argument to every explicit __init__ function, always setting the default to torch.double. Not sure if there is a more elegant way; if so, feel free to point me to it :)
For easier reviewing, each commit touches only a single file. I'd be more than happy to rebase/squash the commits at a later point, but since I could not find any guidelines on whether this is done here, I thought I'd better keep a clean but "true" history for now 😄
I tested the code again locally using pytest, and also in my own code where I originally observed this problem, and everything seems to work.
Looking forward to your feedback/comments :)
I think it's fine to have these as one commit. But it doesn't really matter for the branch history here; the way we merge things into the main branch in botorch is that our GitHub bot will auto-squash every change from this PR into a single commit.
@Balandat has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Thanks!
@Balandat merged this pull request in pytorch/botorch@10f634201c917f2010be80b6d18003075198826c.
Motivation
This PR replaces the hard-coded double precision that was used in the initialization of test_functions/base.py with torch.get_default_dtype(). See #2596 for more details.
Have you read the Contributing Guidelines on pull requests?
Yes, I have read it.
Test Plan
I ran code formatting via ufmt and checked the code via pytest -ra. All tests related to the changes here were adjusted in the second commit of the branch. Locally, two tests failed for me, but it seems these are not related to the fix implemented here. If it turns out they are, I'd be more than happy to further adjust.
Related PRs
None, but #2596 is related.