KEX – Software library for easier management and result representation of Keras experiments

Authors

Ing. Pavol Harár

Download

The software is available for download here.

Description

This library simplifies the workflow of training multiple DNN models with the Keras and Theano frameworks and automatically presents the results. It is especially useful for hyperparameter tuning, when you need to train many models to find the best-performing one.

KEX wraps all defined models into one experiment, trains them, and saves all configuration files and results into a single folder, so you can easily compare which model performed best.
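
As a rough illustration of the pattern KEX automates, the sketch below builds several candidate Keras models in plain Python, trains each one, and stores every model's configuration, weights, and training history in a single experiment folder. It is written against the Keras 2 API; the model definitions, folder names, and file names are hypothetical and do not represent the KEX API itself.

  # Illustrative sketch only: the manual pattern that KEX automates.
  # Model definitions, folder and file names are hypothetical, not the KEX API.
  import os
  import json
  import numpy as np
  from keras.models import Sequential
  from keras.layers import Dense

  def build_small():
      model = Sequential([Dense(32, activation="relu", input_dim=20),
                          Dense(1, activation="sigmoid")])
      model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
      return model

  def build_large():
      model = Sequential([Dense(256, activation="relu", input_dim=20),
                          Dense(256, activation="relu"),
                          Dense(1, activation="sigmoid")])
      model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
      return model

  # Dummy data; in practice this would come from your dataset.
  x = np.random.rand(1000, 20)
  y = np.random.randint(0, 2, 1000)

  # One experiment folder collects the configuration and results of every model,
  # which makes it easy to compare them afterwards.
  experiment_dir = "experiment_01"
  for name, builder in [("small", build_small), ("large", build_large)]:
      model_dir = os.path.join(experiment_dir, name)
      os.makedirs(model_dir, exist_ok=True)
      model = builder()
      history = model.fit(x, y, epochs=5, validation_split=0.2, verbose=0)
      # Save the model configuration, weights and training history for later comparison.
      with open(os.path.join(model_dir, "config.json"), "w") as f:
          f.write(model.to_json())
      model.save_weights(os.path.join(model_dir, "weights.h5"))
      with open(os.path.join(model_dir, "history.json"), "w") as f:
          json.dump({k: [float(v) for v in vals] for k, vals in history.history.items()}, f)

KEX takes care of this bookkeeping for you: the models are configured in one place and all results end up in one experiment folder.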

KEX steps

Img. 1. Simplified KEX workflow.

Main features

  • The configuration of all models resides in a single file, so you can configure every model in one place.
  • Uses a data generator as input, so it can handle large amounts of data.
  • Data preparation is done beforehand on the CPU and training runs on the GPU, so the speed of your GPU is the only bottleneck.
  • Utilizes multiple threads to run models on multiple GPUs, so you can train several models at once.
    WARNING: it does not spread a single model across multiple GPUs; it only distributes different models across multiple GPUs.
  • It handles GPU memory errors. If you accidentally define a model that does not fit into your GPU memory,
    the program does not crash and continues with the next model, so you won't have to start over (see the sketch after this list).
  • It can resume the experiment after an external interruption, such as a machine shutdown, so you won't have to start over.
  • It ensures reproducible results as long as the model definition and initial weights have not changed.
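
The points about generator input, GPU memory errors, and reproducibility can be pictured with the following plain-Keras sketch. It is only an approximation of the underlying idea, not KEX code: the generator, the model sizes, the fixed seed, and the exception handling are all hypothetical.

  # Illustrative sketch, not the KEX API: a generator feeds training batches,
  # a fixed seed keeps runs reproducible, and a try/except prevents one
  # oversized model from aborting the whole experiment.
  import numpy as np
  from keras.models import Sequential
  from keras.layers import Dense

  np.random.seed(42)  # fixed seed so reruns with unchanged models stay reproducible

  def batch_generator(x, y, batch_size=32):
      """Yield random batches indefinitely, so large datasets can be streamed."""
      while True:
          idx = np.random.randint(0, len(x), batch_size)
          yield x[idx], y[idx]

  def build_model(hidden_units):
      model = Sequential([Dense(hidden_units, activation="relu", input_dim=20),
                          Dense(1, activation="sigmoid")])
      model.compile(optimizer="adam", loss="binary_crossentropy")
      return model

  # Dummy data; in practice this would come from your dataset.
  x = np.random.rand(1000, 20)
  y = np.random.randint(0, 2, 1000)

  for hidden_units in [64, 100000]:  # the second model may not fit into GPU memory
      try:
          model = build_model(hidden_units)
          model.fit_generator(batch_generator(x, y),
                              steps_per_epoch=10, epochs=2, verbose=0)
      except Exception as err:
          # In practice this would be narrowed to the backend's out-of-memory error;
          # the point is to log the failure and continue with the next model.
          print("Skipping model with %d hidden units: %s" % (hidden_units, err))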

KEX folder structure

Img. 2. Example of the experiment folder structure.

KEX results

Img. 3. Example of the results of one DNN model.

License

This project is licensed under the terms of the MIT license.