Homotopy explicit all kwargs #2588
Conversation
Force-pushed from df8ffad to 81320a2.
Force-pushed from 81320a2 to b3e8aaf.
Thanks for the update! This is much more in line with the rest of the optimize_acqf signatures.
botorch/optim/optimize_homotopy.py
Outdated
from collections.abc import Callable

from botorch.generation.gen import TGenCandidates
from botorch.optim.initializers import (
    TGenInitialConditions,
)
Could you run ufmt format . on the PR? Otherwise this will fail the linter.
Sure. Can I add a pre-commit YAML so this runs as an automatic hook? It doesn't work out of the box with ufmt.
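For illustration, a minimal sketch of what such a hook could look like; this is an assumption, not part of the PR. A local hook simply shells out to ufmt, so it relies on ufmt (and its usort/black dependencies) already being installed in the environment that runs pre-commit:

```yaml
# Hypothetical .pre-commit-config.yaml sketch (not from this PR).
repos:
  - repo: local
    hooks:
      - id: ufmt
        name: ufmt
        entry: ufmt format   # pre-commit appends the staged file paths
        language: system
        types: [python]
```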
Codecov Report
All modified and coverable lines are covered by tests ✅
Additional details and impacted files:

@@ Coverage Diff @@
##             main    #2588   +/-   ##
=======================================
  Coverage   99.98%   99.98%
=======================================
  Files         196      196
  Lines       17333    17345    +12
=======================================
+ Hits        17330    17343    +13
+ Misses          3        2     -1

☔ View full report in Codecov by Sentry.
@saitcakmak has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
test/optim/test_homotopy.py
Outdated
    inequality_constraints=constraints,
)
self.assertEqual(candidate.shape, torch.Size([1, 2]))
self.assertGreaterEqual(candidate.sum(), 2.0)
Suggested change:
- self.assertGreaterEqual(candidate.sum(), 2.0)
+ self.assertGreaterEqual(candidate.sum().item(), 2.0)
Weird that this is failing. I was going to try comparing floats, but it looks like I don't have commit access to the PR.
If I make this change, it at least makes the failure explicit:
FAILED test/optim/test_homotopy.py::TestHomotopy::test_optimize_acqf_homotopy - AssertionError: 1.999999999999982 not greater than or equal to 2.0
Would you be okay if I changed the test to >= 2.0 - 1e-6?
That sounds reasonable to me
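For reference, a minimal self-contained sketch of the tolerance-based comparison agreed on above; the class name and the hard-coded candidate value are illustrative, not the actual BoTorch test:

```python
import unittest

import torch


class HomotopyToleranceExample(unittest.TestCase):
    """Standalone illustration (not the real test) of the agreed fix."""

    def test_sum_with_tolerance(self) -> None:
        # A candidate whose coordinates should sum to 2.0 but carry optimizer
        # round-off, mirroring the failure quoted above.
        candidate = torch.tensor([[1.0, 0.999999999999982]], dtype=torch.double)
        # Compare the Python float with a small absolute tolerance.
        self.assertGreaterEqual(candidate.sum().item(), 2.0 - 1e-6)


if __name__ == "__main__":
    unittest.main()
```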
… floating point errors
Can we hold off on merging this? I signed the CLA for a previous commit in my personal time, and someone at my new company wants their lawyer to check.
Sure, let us know when it is safe to merge.
We're good to go.
@saitcakmak has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
@saitcakmak merged this pull request in 24f659c.
Summary: Following pytorch/botorch#2588 being merged, there is now no reason to stop people from using inequality constraints with SEBO. This PR removes the ValueError check and adjusts the tests accordingly. Connected to #2790.
Pull Request resolved: #2938
Reviewed By: Balandat
Differential Revision: D64835824
Pulled By: saitcakmak
fbshipit-source-id: d12a80851bb815497bae42f617c82a93cd88b3bf
This PR seeks to address #2579.
optimize_acqf_homotopy is a fairly minimal wrapper around optimize_acqf but didn't have all of the constraint functionality. This PR copies over all of the arguments that we could in principle want to use into optimize_acqf_homotopy. For the time being, final_options has been kept. The apparent bug with fixed features not being passed to the final optimization has been fixed. A simple dict rather than the OptimizeAcqfInputs dataclass is used to store the shared parameters.
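To illustrate the intended usage, here is a hedged sketch of calling optimize_acqf_homotopy with an inequality constraint after this change. The model, acquisition function, homotopy schedule, and argument values are illustrative assumptions rather than code from this PR; only the constraint kwarg is meant to mirror what optimize_acqf already accepts:

```python
import torch

from botorch.acquisition import ExpectedImprovement
from botorch.models import SingleTaskGP
from botorch.optim.homotopy import (
    Homotopy,
    HomotopyParameter,
    LogLinearHomotopySchedule,
)
from botorch.optim.optimize_homotopy import optimize_acqf_homotopy

# Toy model and analytic acquisition function (illustrative only).
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)
acqf = ExpectedImprovement(model=model, best_f=train_Y.max())

# A dummy homotopy parameter swept over a log-linear schedule; in practice this
# would be a parameter of the acquisition function (e.g. an L0 relaxation).
p = torch.nn.Parameter(torch.tensor(1.0, dtype=torch.double))
homotopy = Homotopy(
    homotopy_parameters=[
        HomotopyParameter(
            parameter=p,
            schedule=LogLinearHomotopySchedule(start=1.0, end=0.01, num_steps=5),
        )
    ]
)

# With this PR, constraint kwargs such as inequality_constraints are forwarded
# to optimize_acqf. The constraint below encodes x_0 + x_1 >= 1.
candidate, value = optimize_acqf_homotopy(
    acq_function=acqf,
    bounds=torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double),
    q=1,
    homotopy=homotopy,
    num_restarts=2,
    raw_samples=16,
    inequality_constraints=[
        (torch.tensor([0, 1]), torch.ones(2, dtype=torch.double), 1.0)
    ],
)
print(candidate, value)
```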
Related PRs
The original approach in #2580 made use of kwargs, which was opposed for being less explicit.