Heuristic hyperparameter optimization for multilayer perceptron with one hidden layer

Łukasz Neumann, Robert Marek Nowak


One of the crucial steps in preparing a neural network model is tuning its hyperparameters. This process can be time-consuming and difficult to do properly by hand. Well-tuned hyperparameters yield high classification accuracy as well as fast training. In this paper we explore the use of selected heuristic algorithms based on the evolutionary approach: Covariance Matrix Adaptation Evolution Strategy (CMA-ES), Differential Evolution Strategy (DES) and jSO for the hyperparameter tuning task. Results of Multilayer Perceptron (MLP) hyperparameter optimization for a real-life dataset are presented. An improvement in the models' performance is observed when the presented approach is used.
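The paper's exact optimizers (CMA-ES, DES, jSO) are not reproduced in this record. As a minimal illustration of the general idea — evolutionary search over an MLP's hyperparameters — the sketch below uses SciPy's `differential_evolution` (a related evolutionary method, not the paper's DES) to tune the hidden-layer size and learning rate of a one-hidden-layer MLP on a synthetic dataset. The dataset, search bounds, and budget are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch only: differential_evolution stands in for the
# paper's heuristics (CMA-ES, DES, jSO); bounds and budgets are assumed.
import warnings

import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.exceptions import ConvergenceWarning
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

warnings.filterwarnings("ignore", category=ConvergenceWarning)

# Synthetic stand-in for the paper's real-life dataset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)


def objective(params):
    """Negative cross-validated accuracy for a candidate hyperparameter set.

    params[0]: hidden-layer size (continuous, rounded to an integer)
    params[1]: log10 of the initial learning rate
    """
    hidden = int(round(params[0]))
    lr = 10.0 ** params[1]
    clf = MLPClassifier(hidden_layer_sizes=(hidden,),
                        learning_rate_init=lr,
                        max_iter=100, random_state=0)
    # Minimizing the negative mean accuracy maximizes accuracy.
    return -cross_val_score(clf, X, y, cv=3).mean()


# Search space: 2-50 hidden units, learning rate in [1e-4, 1e-1].
bounds = [(2, 50), (-4, -1)]
result = differential_evolution(objective, bounds,
                                maxiter=3, popsize=5, seed=0, tol=1e-3)

best_hidden = int(round(result.x[0]))
best_lr = 10.0 ** result.x[1]
print("best hidden units:", best_hidden)
print("best learning rate: %.4g" % best_lr)
print("cross-validated accuracy: %.3f" % -result.fun)
```

The continuous-to-integer rounding of the hidden-layer size is a common pragmatic choice when an evolutionary optimizer works over a real-valued search space; the paper may encode discrete hyperparameters differently.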
Authors:
- Łukasz Neumann (FEIT / ICS) — The Institute of Computer Science
- Robert Marek Nowak (FEIT / IN) — The Institute of Computer Science
Publication size in sheets: 0.3
Book Romaniuk Ryszard, Linczuk Maciej Grzegorz (eds.): Proceedings of SPIE: Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2018, vol. 10808, 2018, SPIE - the International Society for Optics and Photonics, ISBN 9781510622036, 2086 p.
Keywords in English: evolutionary algorithm, optimization, tuning, hyperparameter, neural network
URL https://www.spiedigitallibrary.org/conference-proceedings-of-spie/10808/108082A/Heuristic-hyperparameter-optimization-for-multilayer-perceptron-with-one-hidden-layer/10.1117/12.2501569.full
Language: en (English)
File: 108082A_neumann.pdf (312.08 KB)
Score (nominal): 15
Ministerial score = 15.0, 16-10-2018, BookChapterMatConf
Ministerial score (2013-2016) = 15.0, 16-10-2018, BookChapterMatConf
Citation count*
* The presented citation count is obtained through Internet information analysis and is close to the number calculated by the Publish or Perish system.