Building upon JAX and Ray, EvoX offers a comprehensive suite of 50+ Evolutionary Algorithms (EAs) and a wide range of 100+ Benchmark Problems/Environments, all benefiting from distributed GPU acceleration. It facilitates efficient exploration of complex optimization landscapes, effective tackling of black-box optimization challenges, and deep dives into neuroevolution with Brax. Built on functional programming and hierarchical state management, EvoX provides a user-friendly and modular experience. For more details, please refer to our Paper and Documentation / 文档.
🚀 Fast Performance:
- Experience GPU-Accelerated optimization, achieving speeds over 100x faster than traditional methods.
- Leverage the power of Distributed Workflows for even more rapid optimization.
🌐 Versatile Optimization Suite:
- Cater to all your needs with both Single-objective and Multi-objective optimization capabilities.
- Dive into a comprehensive library of Benchmark Problems/Environments, ensuring robust testing and evaluation.
- Explore the frontier of AI with extensive tools for Neuroevolution/RL tasks.
🛠️ Designed for Simplicity:
- Embrace the elegance of Functional Programming, simplifying complex algorithmic compositions.
- Benefit from Hierarchical State Management, ensuring modular and clean programming (see the sketch below).
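To make the functional style and explicit state handling concrete, here is a minimal sketch of a custom algorithm. It assumes the `setup`/`ask`/`tell` interface and the `State` container described in the documentation; the base-class names, method signatures, and `State` semantics shown here are assumptions, so please confirm them against the API Documentation.

```python
import jax
import jax.numpy as jnp

# Assumed imports: Algorithm, State, and jit_class as exposed by evox;
# verify against the API Documentation.
from evox import Algorithm, State, jit_class


@jit_class
class RandomSearch(Algorithm):
    """Toy algorithm: sample a fresh uniform population every generation."""

    def __init__(self, lb, ub, pop_size):
        super().__init__()
        self.lb = lb
        self.ub = ub
        self.pop_size = pop_size

    def setup(self, key):
        # All mutable data lives in an explicit, immutable State object.
        return State(key=key)

    def ask(self, state):
        key, subkey = jax.random.split(state.key)
        pop = jax.random.uniform(
            subkey,
            (self.pop_size, self.lb.shape[0]),
            minval=self.lb,
            maxval=self.ub,
        )
        # Return the new population together with the updated state.
        return pop, state.update(key=key)

    def tell(self, state, fitness):
        # Plain random search keeps nothing between generations.
        return state
```

Because every method takes a state in and returns a new state, the composed workflow remains a pure function that JAX can transform (jit, vmap, pmap) end to end.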
Category | Single-objective Algorithms |
---|---|
Differential Evolution | CoDE, JaDE, SaDE, SHADE, IMODE, ... |
Evolution Strategy | CMA-ES, PGPE, OpenES, CR-FM-NES, xNES, ... |
Particle Swarm Optimization | FIPS, CSO, CPSO, CLPSO, SL-PSO, ... |
Category | Multi-objective Algorithms |
---|---|
Dominance-based | NSGA-II, NSGA-III, SPEA2, BiGE, KnEA, ... |
Decomposition-based | MOEA/D, RVEA, t-DEA, MOEAD-M2M, EAG-MOEAD, ... |
Indicator-based | IBEA, HypE, SRA, MaOEA-IGD, AR-MOEA, ... |
For a comprehensive list and further details of all algorithms, please check the API Documentation.
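As a quick taste of the multi-objective side, the hedged sketch below pairs NSGA-II with the DTLZ2 benchmark. It assumes the classes are exposed as `algorithms.NSGA2` and `problems.numerical.DTLZ2` with the constructor arguments shown; the exact names and signatures should be confirmed in the API Documentation.

```python
import jax
import jax.numpy as jnp
from evox import algorithms, problems, workflows

# 3-objective DTLZ2 with the conventional m + k - 1 = 12 decision variables (k = 10).
nsga2 = algorithms.NSGA2(  # assumed class name and arguments
    lb=jnp.zeros(12),
    ub=jnp.ones(12),
    n_objs=3,
    pop_size=100,
)
dtlz2 = problems.numerical.DTLZ2(m=3)  # assumed constructor argument

workflow = workflows.StdWorkflow(nsga2, dtlz2)
state = workflow.init(jax.random.PRNGKey(0))
for _ in range(100):
    state = workflow.step(state)
```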
Category | Problems/Environments |
---|---|
Numerical | DTLZ, LSMOP, MaF, ZDT, CEC'22, ... |
Neuroevolution/RL | Brax, Gym, TorchVision Dataset, ... |
For a comprehensive list and further details of all benchmark problems/environments, please check the API Documentation.
Install `evox` effortlessly via `pip`:
pip install evox
Note: To set up EvoX with GPU acceleration, you will need to set up JAX first. For details, please refer to our comprehensive Installation Guide. Additionally, you can watch our instructional videos:
🎥 EvoX Installation Guide (Linux)
🎥 EvoX Installation Guide (Windows)
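Once JAX is installed, a quick way to confirm that GPU acceleration is available is to list the devices JAX can see (plain JAX, independent of EvoX):

```python
import jax

# Prints something like [cuda(id=0)] when a GPU backend is active,
# or [CpuDevice(id=0)] when JAX falls back to the CPU.
print(jax.devices())
```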
Kickstart your journey with EvoX in just a few simple steps:
- Import necessary modules:
import jax
import jax.numpy as jnp

import evox
from evox import algorithms, problems, workflows
- Configure an algorithm and define a problem:
pso = algorithms.PSO(
    lb=jnp.full(shape=(2,), fill_value=-32),
    ub=jnp.full(shape=(2,), fill_value=32),
    pop_size=100,
)
ackley = problems.numerical.Ackley()
- Compose and initialize the workflow:
workflow = workflows.StdWorkflow(pso, ackley)
key = jax.random.PRNGKey(42)
state = workflow.init(key)
- Run the workflow:
# Execute the workflow for 100 iterations
for i in range(100):
    state = workflow.step(state)
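Because the workflow step is a pure function of its state, it can typically be JIT-compiled with JAX for extra speed. The snippet below is a sketch of that common pattern, reusing the `workflow` and `state` from the steps above; whether the default step compiles out of the box may depend on the chosen problem and any attached monitors, so consult the documentation if compilation fails.

```python
import jax

# Compile the step function once, then reuse it across generations.
jit_step = jax.jit(workflow.step)
for _ in range(100):
    state = jit_step(state)
```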
Try out ready-to-play examples in your browser with Colab:
Example | Link |
---|---|
Basic Usage | |
Numerical Optimization | |
Neuroevolution with Gym | |
Neuroevolution with Brax | |
Custom Algorithm/Problem | |
For more use cases and applications, please check out the Example Directory.
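Beyond the built-in benchmarks, defining your own problem is typically a matter of subclassing `Problem`. The following is a minimal sketch assuming the state-threaded `evaluate(state, pop) -> (fitness, state)` pattern; treat the base-class and decorator names as assumptions and follow the Custom Algorithm/Problem notebook for the authoritative version.

```python
import jax.numpy as jnp

# Assumed imports: Problem and jit_class as exposed by evox.
from evox import Problem, jit_class


@jit_class
class Sphere(Problem):
    """Toy single-objective problem: f(x) = sum(x_i^2)."""

    def evaluate(self, state, pop):
        # pop has shape (pop_size, dim); return one fitness value per candidate.
        fitness = jnp.sum(pop**2, axis=1)
        return fitness, state
```

Such a class can then be dropped into `workflows.StdWorkflow` in place of `problems.numerical.Ackley()` from the quick-start above.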
- Engage in discussions and share your experiences on GitHub Discussion Board.
- Join our QQ group (ID: 297969717).
- Help with the translation of the documentation on Weblate. We currently support two languages: English and Chinese (中文).
- Official Website: https://evox.group/
- TensorNEAT: Tensorized NeuroEvolution of Augmenting Topologies (NEAT) for GPU Acceleration. Check out here.
- TensorRVEA: Tensorized Reference Vector Guided Evolutionary Algorithm (RVEA) for GPU Acceleration. Check out here.
- TensorACO: Tensorized Ant Colony Optimization (ACO) for GPU Acceleration. Check out here.
- EvoXBench: A benchmark platform for Neural Architecture Search (NAS) that runs without GPUs, PyTorch, or TensorFlow, and supports multiple programming languages such as Java, MATLAB, and Python. Check out here.
If you use EvoX in your research and want to cite it in your work, please use:
@article{evox,
  title   = {{EvoX}: {A} {Distributed} {GPU}-accelerated {Framework} for {Scalable} {Evolutionary} {Computation}},
  author  = {Huang, Beichen and Cheng, Ran and Li, Zhuozhao and Jin, Yaochu and Tan, Kay Chen},
  journal = {IEEE Transactions on Evolutionary Computation},
  year    = 2024,
  doi     = {10.1109/TEVC.2024.3388550}
}