![MIHA logo](miha_logo.png)

# MIHA

An optimizer for configuring the hyperparameters of neural networks.

**What does this library do?** The module optimizes the hyperparameters of a neural network with a pre-defined architecture.

**Which deep learning libraries does this module work with?** PyTorch.

**What algorithm is used for optimization?** An evolutionary algorithm with mutation and crossover operators. The neural network is trained continuously during the evolution process, as illustrated in the sketch below.
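
The following is a minimal, illustrative sketch of evolutionary hyperparameter search with mutation and crossover; it is not the MIHA API. The toy regression task, the hyperparameter set (`lr`, `hidden`), and the helper names (`fitness`, `mutate`, `crossover`) are assumptions made for this example only.

```python
# Sketch of evolutionary hyperparameter search (NOT the MIHA API).
import random
import torch
import torch.nn as nn

# Toy regression data: y = sum(x) + noise (assumed task for illustration).
torch.manual_seed(0)
X = torch.randn(512, 8)
y = X.sum(dim=1, keepdim=True) + 0.1 * torch.randn(512, 1)
X_train, y_train, X_val, y_val = X[:400], y[:400], X[400:], y[400:]

def build_model(hidden):
    # Pre-defined architecture: only its hyperparameters are evolved.
    return nn.Sequential(nn.Linear(8, hidden), nn.ReLU(), nn.Linear(hidden, 1))

def fitness(params, epochs=20):
    # Train briefly with the candidate hyperparameters; lower val loss is better.
    model = build_model(params["hidden"])
    opt = torch.optim.Adam(model.parameters(), lr=params["lr"])
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X_train), y_train).backward()
        opt.step()
    with torch.no_grad():
        return loss_fn(model(X_val), y_val).item()

def mutate(params):
    # Randomly perturb one hyperparameter.
    child = dict(params)
    if random.random() < 0.5:
        child["lr"] = min(max(child["lr"] * random.uniform(0.5, 2.0), 1e-5), 1e-1)
    else:
        child["hidden"] = max(4, child["hidden"] + random.choice([-8, 8]))
    return child

def crossover(a, b):
    # Take each hyperparameter from one of the two parents at random.
    return {k: random.choice([a[k], b[k]]) for k in a}

# Evolve a small population for a few generations.
population = [{"lr": 10 ** random.uniform(-4, -1),
               "hidden": random.choice([8, 16, 32, 64])}
              for _ in range(6)]
for generation in range(5):
    scored = sorted(population, key=fitness)
    parents = scored[:2]                                    # keep the two best
    children = [mutate(crossover(*parents)) for _ in range(len(population) - 2)]
    population = parents + children
    print(f"generation {generation}: best = {parents[0]}")
```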

## The main concept

![The main concept](main_concept.png)

## Requirements

- python >= 3.7
- numpy
- cudatoolkit == 10.2
- torchvision == 0.7.0
- pytorch == 1.6.0
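
Assuming a conda environment is used, the pinned versions above match the standard install command for PyTorch 1.6.0 (a sketch of one possible setup, not an official MIHA instruction):

```bash
conda install pytorch==1.6.0 torchvision==0.7.0 cudatoolkit=10.2 -c pytorch
```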

## Documentation

Description of the submodules: for now, all the necessary descriptions can be found in the docstrings.

## How to use

Examples of how to run the algorithm can be found in:

- Comparison with competing solutions (Jupyter notebooks)

## Contacts

Feel free to contact us: