NOTE: Following the release of v0.1.2, this project is maintained for stability and compatibility but is no longer under active development. New features will not be added, but the repository will continue to receive the updates needed to keep it functional. For feature requests, please feel free to fork the repository and submit a pull request.
Welcome to DEHB, an algorithm for Hyperparameter Optimization (HPO). DEHB uses Differential Evolution (DE) under the hood as the evolutionary algorithm powering the black-box optimization that HPO problems pose.
`dehb` is a Python package implementing the DEHB algorithm. It offers an intuitive interface to optimize user-defined problems using DEHB.
pip install dehb
DEHB allows users either to use the Ask & Tell interface for manual task distribution or to leverage the built-in functionality (`run()`) to set up a Dask cluster autonomously. The following snippet offers a small look into how to use DEHB. For further information, please refer to the getting-started examples in our documentation.
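Before constructing the optimizer, you need a search space and a target function. Below is a minimal sketch; the two hyperparameters, the toy objective, and the fidelity range are illustrative placeholders, and the returned dictionary follows DEHB's convention of reporting a "fitness" to minimize along with the "cost" of the evaluation:

import ConfigSpace as CS

# Hypothetical toy search space with two continuous hyperparameters
config_space = CS.ConfigurationSpace()
config_space.add_hyperparameter(CS.UniformFloatHyperparameter("x0", lower=-5, upper=5))
config_space.add_hyperparameter(CS.UniformFloatHyperparameter("x1", lower=-5, upper=5))
dimensions = len(config_space.get_hyperparameters())

# Fidelity range, e.g., number of training epochs or subsample sizes
min_fidelity, max_fidelity = 1, 10

def your_target_function(config, fidelity, **kwargs):
    # Toy objective; a real target function would train and evaluate a model
    # at the given fidelity and report the wall-clock cost of doing so
    fitness = (config["x0"] - 1.0) ** 2 + (config["x1"] + 0.5) ** 2
    return {"fitness": fitness, "cost": fidelity}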
optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity)
##### Using Ask & Tell
# Ask for next configuration to run
job_info = optimizer.ask()
# Run the configuration for the given fidelity. Here you can freely distribute the computation to any worker you'd like.
result = your_target_function(config=job_info["config"], fidelity=job_info["fidelity"])
# Once you have received the result, feed it back to the optimizer
optimizer.tell(job_info, result)
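In practice, Ask & Tell is usually wrapped in a loop, and each asked configuration can be evaluated on whatever worker is convenient before being told back. A minimal sequential sketch, reusing the setup from above:

# Repeatedly ask, evaluate and tell; the budget of 20 evaluations is arbitrary
for _ in range(20):
    job_info = optimizer.ask()
    result = your_target_function(config=job_info["config"], fidelity=job_info["fidelity"])
    optimizer.tell(job_info, result)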
##### Using run()
# Run optimization for 1 bracket. Output files will be saved to ./logs
traj, runtime, history = optimizer.run(brackets=1)
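The return values can be used to inspect the optimization afterwards; the sketch below assumes `traj` tracks the incumbent (best-so-far) fitness per function evaluation and `history` holds one record per evaluated configuration:

# Lowest fitness observed so far and the number of evaluations performed
print("Incumbent fitness:", traj[-1])
print("Evaluations:", len(history))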
For a more in-depth look at how to run DEHB in a parallel setting, please have a look at our documentation.
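As a quick taste, the sketch below assumes the `n_workers` constructor argument described in the documentation, which lets DEHB manage a local Dask cluster for you:

# Same setup as above, but DEHB distributes evaluations across 4 Dask workers
optimizer = DEHB(
    f=your_target_function,
    cs=config_space,
    dimensions=dimensions,
    min_fidelity=min_fidelity,
    max_fidelity=max_fidelity,
    n_workers=4)  # assumed constructor argument; see the docs for details
traj, runtime, history = optimizer.run(brackets=1)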
- 00 - A generic template to use DEHB for multi-fidelity Hyperparameter Optimization
- 01.1 - Using DEHB to optimize 4 hyperparameters of Scikit-learn's Random Forest on a classification dataset
- 01.2 - Using DEHB to optimize 4 hyperparameters of Scikit-learn's Random Forest on a classification dataset via the Ask & Tell interface
- 02 - Optimizing Scikit-learn's Random Forest without using ConfigSpace to represent the hyperparameter space
- 03 - Hyperparameter Optimization for MNIST in PyTorch
To run the PyTorch example (note the additional requirements):
python examples/03_pytorch_mnist_hpo.py \
--min_fidelity 1 \
--max_fidelity 3 \
--runtime 60 \
--verbose
For more details and features, please have a look at our documentation.
Any contribution is greatly appreciated! Please take the time to check out our contributing guidelines.
@inproceedings{awad-ijcai21,
author = {N. Awad and N. Mallik and F. Hutter},
title = {{DEHB}: Evolutionary Hyperband for Scalable, Robust and Efficient Hyperparameter Optimization},
pages = {2147--2153},
booktitle = {Proceedings of the Thirtieth International Joint Conference on
Artificial Intelligence, {IJCAI-21}},
publisher = {ijcai.org},
editor = {Z. Zhou},
year = {2021}
}