SMAC 2.0

Key Concepts

| Concept | Meaning |
|---------|---------|
| Incumbent | Best configuration found so far |
| Surrogate | Model that predicts performance given parameters |
| Acquisition Function | Balances exploration (try unknown) vs exploitation (trust surrogate), e.g., EI, LCB |
| Runhistory | Log of all evaluated configs + costs |
| Multi-fidelity | Use cheap approximations (e.g., 10% of data) to discard bad configs early |
| Conditional Space | If hyperparameter A = X, then hyperparameter B appears |

Advanced Features (SMAC 2.0 Unlocks)

1. Multi-fidelity (budget):
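SMAC 2.0 exposes multi-fidelity through `MultiFidelityFacade`, which passes a budget argument to the target function. The core idea (discard bad configs on cheap budgets, as the concept table says) can be sketched without SMAC as successive halving; the config values and objective below are made up for illustration:

```python
def successive_halving(configs, evaluate, min_budget=10, max_budget=80):
    """Toy successive halving: score all configs on a cheap budget,
    keep the best half, double the budget, repeat."""
    budget = min_budget
    survivors = list(configs)
    while len(survivors) > 1 and budget <= max_budget:
        scored = sorted(survivors, key=lambda c: evaluate(c, budget))
        survivors = scored[: max(1, len(scored) // 2)]  # keep best half
        budget *= 2
    return survivors[0]

# Hypothetical objective: true cost is |lr - 0.1|; cheap budgets add noise
def evaluate(config, budget):
    proxy_noise = 1.0 / budget  # low budgets are only rough proxies
    return abs(config["lr"] - 0.1) + proxy_noise

configs = [{"lr": lr} for lr in (0.001, 0.01, 0.1, 0.5)]
best = successive_halving(configs, evaluate)
print(best)  # → {'lr': 0.1}
```

SMAC's actual schedule (Hyperband-style) is more sophisticated, but the budget-based pruning principle is the same.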

2. Parallelization:

```python
# n_workers is set on the Scenario; SMAC distributes trials via Dask
scenario = Scenario(cs, n_trials=100, n_workers=4)
smac = HPOFacade(scenario, train_model, overwrite=True)
smac.optimize()
```

Common Pitfalls

| Pitfall | Fix |
|---------|-----|
| SMAC gets stuck in one region | Increase acquisition-function exploration (e.g., LCB with a high kappa) |
| Too slow for large spaces | Use multi-fidelity or lower n_trials |
| Conditional parameters not handled | Use ConfigSpace conditions (e.g., EqualsCondition); see the ConfigSpace docs |
| Reproducibility issues | Set seed in Scenario |
| Memory blowup | Reduce runhistory size or use extensive=False in facade |

Comparison vs Other Tuners (TL;DR)

| Tool | Best for |
|------|----------|
| SMAC 2.0 | Conditional spaces, multi-objective, moderate cost |
| Optuna | Simpler spaces, TPE + CMA-ES, good defaults |
| Hyperopt | Quick TPE experiments, older codebases |
| BayesianOptimization | Low-dim (<20) continuous spaces |
| Grid/Random | Debugging, cheap functions |

Final Tip

Start with HPOFacade – it hides most complexity. Only drop down to BlackBoxFacade or AlgorithmConfigurationFacade if you need full control (e.g., a custom surrogate).

Quickstart

1. Define the target function:

```python
from smac import HyperparameterOptimizationFacade as HPOFacade
from smac import Scenario

def train_model(config, seed: int = 0):
    lr = config["learning_rate"]
    batch_size = config["batch_size"]
    # ... train your model ...
    return validation_error  # lower is better
```

2. Define the hyperparameter space:

```python
from ConfigSpace import ConfigurationSpace, Float, Integer

cs = ConfigurationSpace()
cs.add(Float("learning_rate", (1e-5, 1.0), log=True))
cs.add(Integer("batch_size", (16, 256), log=True))
```

3. Set the scenario:

```python
scenario = Scenario(cs, n_trials=100, walltime_limit=3600)
```

4. Optimize:

```python
smac = HPOFacade(scenario, train_model)
incumbent = smac.optimize()
print(f"Best config: {incumbent}")
print(f"Best cost: {smac.runhistory.get_cost(incumbent)}")
```
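Stripped of the surrogate model, the bookkeeping that `optimize()` does matches the runhistory/incumbent concepts above. A toy stand-in (plain random search, not SMAC code; the objective and sampler are made up):

```python
import random

def optimize(objective, sample_config, n_trials=50, seed=0):
    """Toy optimize loop: evaluate sampled configs, log every result
    in a runhistory, and track the incumbent (best config so far)."""
    rng = random.Random(seed)
    runhistory = []                      # (config, cost) for every trial
    incumbent, best_cost = None, float("inf")
    for _ in range(n_trials):
        config = sample_config(rng)
        cost = objective(config)
        runhistory.append((config, cost))
        if cost < best_cost:             # lower is better
            incumbent, best_cost = config, cost
    return incumbent, best_cost, runhistory

# Hypothetical objective: the best learning rate is 0.1
objective = lambda c: abs(c["learning_rate"] - 0.1)
sample = lambda rng: {"learning_rate": 10 ** rng.uniform(-5, 0)}  # log-uniform
inc, cost, history = optimize(objective, sample)
print(inc, cost, len(history))
```

SMAC replaces the random sampler with surrogate-guided proposals, which is where the speedup over plain random search comes from.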

Resources

Docs: https://automl.github.io/SMAC3/main/
Paper: "SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization" (Lindauer et al., JMLR 2022)

3. Multi-objective:

```python
# Minimize both error and latency: declare the objectives in the Scenario;
# the target function then returns a dict with one cost per objective,
# e.g. {"val_loss": ..., "inference_ms": ...}
scenario = Scenario(cs, objectives=["val_loss", "inference_ms"], n_trials=100)
smac = HPOFacade(scenario, train_model)
```
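With two objectives there is usually no single best config; the result is a Pareto front of non-dominated trade-offs. A minimal dominance filter (illustration only; the (val_loss, inference_ms) values are made up):

```python
def pareto_front(points):
    """Keep points not dominated by any other. Point a dominates b if a is
    <= b in every objective and strictly < in at least one (minimization)."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# (val_loss, inference_ms) for four hypothetical configs
results = [(0.10, 30.0), (0.12, 12.0), (0.09, 45.0), (0.15, 40.0)]
print(pareto_front(results))
# → [(0.1, 30.0), (0.12, 12.0), (0.09, 45.0)] — (0.15, 40.0) is dominated
```

Each surviving point is a defensible choice: picking between them means trading accuracy for latency.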