
How can I dynamically change the parameter space during the optimization process? #1213

Open
Mr-yang12345 opened this issue Mar 4, 2025 · 3 comments

Comments

@Mr-yang12345

How can I dynamically change the model's sampling space during the optimization process? Some configurations may already exhibit good performance, and I want to narrow the sampling space down to the vicinity of these configurations. How can I write code to achieve this?

@LukasFehring
Collaborator

LukasFehring commented Mar 5, 2025

Hi,
Unfortunately, this is currently not supported in the standard setup.

Problem:

SMAC, by default, uses LocalAndRandomSearch. Here, previously evaluated configurations, which would then lie outside of the new space, are used as starting points for LocalSearch.

Possible Solutions:

In order to still do this, you need to create a new, modified ConfigurationSpace (you can't adapt an existing ConfigurationSpace in place due to caching) and set it at every affected position, e.g., the AcquisitionFunctionMaximizer, the search strategies, and the acquisition functions.
Additionally, you need to do one of the following:

  1. Adapt LocalSearch or only use RandomSearch.
  2. Set strong distributions (priors) on the ConfigurationSpace such that samples outside the desired region are very unlikely.
  3. Use forbidden clauses in the ConfigurationSpace (I am not sure whether this works)

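As an illustration of the narrowing idea behind the first two options, the bounds of a numerical hyperparameter could be recomputed around an incumbent value before building the new, modified ConfigurationSpace. The helper below and its shrink parameter are hypothetical, not part of SMAC or ConfigSpace:

```python
def narrowed_bounds(lower: float, upper: float, incumbent: float,
                    shrink: float = 0.1) -> tuple[float, float]:
    """Return new (lower, upper) bounds around the incumbent, clipped to the
    original range. `shrink` is the fraction of the original width kept on
    each side of the incumbent (an assumed choice, not a SMAC setting)."""
    half_width = (upper - lower) * shrink
    return (max(lower, incumbent - half_width),
            min(upper, incumbent + half_width))

# Example: narrow a [0, 10] range around an incumbent at 9.5
print(narrowed_bounds(0.0, 10.0, 9.5))  # (8.5, 10.0)
```

The narrowed bounds would then be used to construct the fresh ConfigurationSpace that gets set at the positions listed above.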
Should you have any issues, or good ideas how to fix this, feel free to reach out again :)

@Mr-yang12345
Author

I have another question: How can we define complex parameter space constraints? For example, how can we ensure that the product of two parameters falls within a certain interval, such as constant_A < param1 × param2 < constant_B? I've looked around and found that the internal Condition and Conjunction classes seem to support only simple constraints and cannot solve my problem. If SMAC currently doesn't support this, where would you recommend I add this constraint? I would really appreciate a reply.

@LukasFehring
Collaborator

I think that issue should generally be raised with ConfigSpace.

In case this is not possible with ConfigSpace (which I am unsure of), you could preprocess configurations:
Simply create a new class and wrap _maximize:

class FilteredSearch(LocalAndRandomSearch):

    def _maximize(
        self,
        previous_configs: list[Configuration],
        n_points: int,
    ) -> list[tuple[float, Configuration]]:
        kept: list[tuple[float, Configuration]] = []
        while self._not_enough_configs(kept, n_points):
            candidates = super()._maximize(previous_configs, n_points)
            # Disregard configs that do not satisfy your constraint
            kept.extend((acq, config) for acq, config in candidates
                        if self._is_valid(config))
        return kept[:n_points]

    def _not_enough_configs(self, kept: list, n_points: int) -> bool:
        return len(kept) < n_points

    def _is_valid(self, config: Configuration) -> bool:
        # Your constraint check, e.g. constant_A < param1 * param2 < constant_B
        ...

Should you struggle with this, feel free to reach out directly.
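For the product constraint specifically, the filtering predicate can be sketched independently of SMAC and ConfigSpace as plain rejection sampling. The constant names, bounds, and the random sampler below are placeholders standing in for configurations drawn from a ConfigurationSpace:

```python
import random

CONSTANT_A, CONSTANT_B = 1.0, 4.0  # assumed interval bounds

def satisfies_product_constraint(param1: float, param2: float) -> bool:
    # Keep only configurations with CONSTANT_A < param1 * param2 < CONSTANT_B
    return CONSTANT_A < param1 * param2 < CONSTANT_B

# Stand-in for sampling configurations: draw candidate (param1, param2) pairs
rng = random.Random(0)
candidates = [(rng.uniform(0.1, 10.0), rng.uniform(0.1, 10.0))
              for _ in range(100)]

# Rejection step: every surviving pair satisfies the product constraint
valid = [c for c in candidates if satisfies_product_constraint(*c)]
```

The same predicate could serve as the _is_valid check in the FilteredSearch wrapper above.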
