How can I dynamically change the parameter space during the optimization process? #1213
How to dynamically change the model's sampling space during the optimization process: some configurations may already exhibit good performance, and I want to narrow the sampling space to the vicinity of these configurations. How can I write the code to achieve this?

Comments
Hi! Problem: SMAC uses LocalAndRandomSearch by default. There, previously evaluated configurations, which may then lie outside the new space, are used as starting points for LocalSearch. Possible solution: to narrow the space anyway, you need to create a new, modified ConfigurationSpace (an existing ConfigurationSpace cannot be adapted in place due to caching) and set it at every affected position, e.g., the AcquisitionFunctionMaximizer, the search strategies, and the acquisition functions (see the sketch after this comment).
Should you have any issues, or good ideas on how to fix this, feel free to reach out again :)
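To make the suggestion above concrete, here is a minimal sketch of building a fresh, narrower ConfigurationSpace around the current incumbent. The helper `shrink_around`, the hyperparameter names, the narrowing factor, and the attribute path at the end are illustrative assumptions, not SMAC API; the exact internals differ between SMAC versions.

```python
# A minimal sketch, assuming ConfigSpace and SMAC3 v2.x; only positive
# numeric hyperparameters are handled here.
from ConfigSpace import ConfigurationSpace
from ConfigSpace.hyperparameters import UniformFloatHyperparameter


def shrink_around(incumbent: dict[str, float], factor: float = 0.5) -> ConfigurationSpace:
    """Build a fresh, narrower space centered on the incumbent's values."""
    cs = ConfigurationSpace(seed=0)
    for name, value in incumbent.items():
        # Narrow each hyperparameter to +/- factor around the incumbent value.
        lower = value * (1.0 - factor)
        upper = value * (1.0 + factor)
        cs.add_hyperparameter(UniformFloatHyperparameter(name, lower=lower, upper=upper))
    return cs


new_space = shrink_around({"learning_rate": 0.01, "momentum": 0.9})
# The new space must then be set at every affected position, e.g. the
# acquisition-function maximizer; the attribute path below is an assumption
# about SMAC internals and may differ between versions:
# smac.intensifier.config_selector._acquisition_maximizer._configspace = new_space
```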
I have another question: how can we define complex parameter-space constraints? For example, how can we ensure that the product of two parameters falls within a certain interval, such as constant_A < param1 × param2 < constant_B? I've looked around and found that the internal Condition and Conjunction classes seem to support only simple constraints and cannot solve my problem. If SMAC currently doesn't support this, where would you recommend I add this constraint? I would really appreciate a reply.
I think this issue should generally be raised with ConfigSpace. In case this is not possible with ConfigSpace (which I am unsure of), you could filter the proposed configurations yourself:

```python
# Imports assume SMAC3 v2.x; the exact module path and class name for the
# maximizer may differ between SMAC versions.
from ConfigSpace import Configuration
from smac.acquisition.maximizer import LocalAndRandomSearch


class FilteredSearch(LocalAndRandomSearch):
    def _maximize(
        self,
        previous_configs: list[Configuration],
        n_points: int,
    ) -> list[tuple[float, Configuration]]:
        candidates: list[tuple[float, Configuration]] = []
        # Keep sampling until enough configurations satisfy the constraint.
        while len(candidates) < n_points:
            for score, config in super()._maximize(previous_configs, n_points):
                # Disregard configs that do not fall under your budget/constraint.
                if self._satisfies_constraint(config):
                    candidates.append((score, config))
        return candidates[:n_points]

    def _satisfies_constraint(self, config: Configuration) -> bool:
        ...  # implement your constraint check here
```

Should you struggle with this, feel free to reach out directly.
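For instance, the product constraint from the question above could be plugged in via a hypothetical subclass; the parameter names `param1`/`param2` and the bounds are placeholders taken from the question, not values from SMAC:

```python
constant_A, constant_B = 0.1, 10.0  # illustrative bounds

class ProductConstrainedSearch(FilteredSearch):
    def _satisfies_constraint(self, config: Configuration) -> bool:
        # Accept only configurations with constant_A < param1 * param2 < constant_B.
        return constant_A < config["param1"] * config["param2"] < constant_B
```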