HyperbandSuccessive
HyperbandSuccessive is a proposed variant of the Hyperband algorithm for hyperparameter optimization that aims to combine the broad exploration of Hyperband with a more aggressive and unified use of successive halving. It retains the core idea of evaluating many configurations with small budgets and gradually increasing resource allocation to the best performers, but it emphasizes a tighter integration of successive halving across and within brackets to improve efficiency and reuse information.
- Initialize a diverse set of configurations sampled from the search space.
- Within each bracket, apply successive halving: evaluate all configurations with a small initial budget, prune a fixed fraction of the worst performers, and promote the survivors to a larger budget (a minimal sketch of this step follows the list).
- Cross-bracket coordination: share performance signals across brackets to steer allocations toward configurations showing consistent promise.
- Adaptation: allow eta (the halving factor that determines the fraction of configurations kept each round) and the budget increments to adjust based on observed performance, rather than being fixed in advance.
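As a concrete illustration of the per-bracket step, here is a minimal successive-halving sketch in Python. It assumes a user-supplied `evaluate(config, budget)` function that returns a validation score (higher is better); the function names, budgets, and toy objective are illustrative assumptions, not part of any reference implementation of HyperbandSuccessive.

```python
import random

def successive_halving(configs, evaluate, min_budget=1, max_budget=81, eta=3):
    """Evaluate all configs at a small budget, keep roughly the top 1/eta,
    and repeat with eta-times more resources until the budget cap is reached.

    `evaluate(config, budget)` is assumed to return a score (higher is better).
    """
    budget = min_budget
    survivors = list(configs)
    while survivors and budget <= max_budget:
        scored = [(evaluate(c, budget), c) for c in survivors]
        scored.sort(key=lambda sc: sc[0], reverse=True)
        # Keep roughly the top 1/eta configurations for the next, larger budget.
        keep = max(1, len(survivors) // eta)
        survivors = [c for _, c in scored[:keep]]
        budget *= eta
    return survivors[0] if survivors else None

if __name__ == "__main__":
    # Toy search space and a dummy objective standing in for model training.
    space = [{"lr": 10 ** random.uniform(-4, -1)} for _ in range(27)]
    def evaluate(config, budget):
        return -abs(config["lr"] - 0.01) + 0.001 * budget
    print("best config:", successive_halving(space, evaluate))
```

With eta = 3, a bracket that starts with 27 configurations at budget 1 keeps 9 at budget 3, 3 at budget 9, and 1 at budget 27, matching the geometric prune-and-promote schedule described above.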
- Compared with standard Hyperband, it emphasizes a continuous, nested successive halving process rather than strictly separate brackets with fixed schedules.
- Seeks to reuse partial evaluations across brackets to accelerate decision making.
- Supports dynamic adjustment of halving steps and budgets in response to runtime signals rather than relying on a fixed, precomputed schedule (one possible adaptation rule is sketched after this list).
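The section does not prescribe a concrete adaptation rule, so the following is a purely hypothetical sketch of how eta might be adjusted from runtime signals: when scores at the current budget are tightly clustered (hard to rank reliably), pruning is softened; when they are well separated, pruning is made more aggressive. The `adapt_eta` name and the thresholds are assumptions for illustration only.

```python
import statistics

def adapt_eta(scores, base_eta=3, min_eta=2, max_eta=4):
    """Hypothetical rule for adjusting the halving factor from runtime signals.

    If scores observed at the current budget are tightly clustered, relative
    ranks are unreliable, so prune less aggressively (smaller eta); if they
    are well separated, prune more aggressively (larger eta).
    """
    if len(scores) < 2:
        return base_eta
    spread = statistics.pstdev(scores)
    mean = statistics.fmean(scores)
    rel_spread = spread / (abs(mean) + 1e-12)
    if rel_spread < 0.05:   # nearly indistinguishable scores: soften pruning
        return min_eta
    if rel_spread > 0.5:    # clearly separated scores: prune harder
        return max_eta
    return base_eta
```

The same signal could equally drive budget increments; the key design choice is that the schedule reacts to observed score separation rather than being fixed when the bracket is created.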
- Pros: can reduce wasted computation, speed up the identification of strong configurations, and provide robust early-stopping decisions.
- Cons: increased algorithmic complexity, potential sensitivity to evaluation noise, and a possible need for careful tuning of eta and the budget schedule.
- Typical use cases: hyperparameter tuning for deep learning and other resource-intensive models, as well as neural architecture search, where partial training runs provide informative early signals.
See also: Hyperband, Successive Halving, Neural Architecture Search.