Heuristic hyperparameter optimization for multilayer perceptron with one hidden layer
Łukasz Neumann, Robert Marek Nowak
Abstract: One of the crucial steps in preparing a neural network model is tuning its hyperparameters. This process can be time-consuming and hard to perform properly by hand. Well-tuned hyperparameters yield high classification accuracy as well as fast training. In this paper we explore the use of selected heuristic algorithms based on the evolutionary approach: Covariance Matrix Adaptation Evolution Strategy (CMAES), Differential Evolution Strategy (DES) and jSO, for the hyperparameter tuning task. Results of Multilayer Perceptron (MLP) hyperparameter optimization for a real-life dataset are presented. An improvement in the models' performance is observed with the presented approach.
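The abstract describes tuning MLP hyperparameters with evolutionary strategies. As a rough illustration of the idea (not the paper's actual CMAES, DES, or jSO implementations), the following sketch runs a minimal (mu, lambda)-style evolution strategy over two hypothetical hyperparameters, a log learning rate and a hidden-layer width, against a stand-in objective; in the paper the objective would be the MLP's validation error on the real-life dataset.

```python
import random

def validation_error(log_lr, hidden_units):
    # Hypothetical smooth stand-in for MLP validation error;
    # assumed optimum at log_lr = -3 (lr ~ 0.001) and 64 hidden units.
    return (log_lr + 3.0) ** 2 + 0.01 * (hidden_units - 64.0) ** 2

def evolve(generations=50, pop=20, elite=5, seed=0):
    rng = random.Random(seed)
    # Search distribution: mean and per-dimension step size for
    # (log learning rate, hidden-layer width).
    mean = [0.0, 32.0]
    sigma = [1.0, 16.0]
    for _ in range(generations):
        # Sample a population around the current mean.
        cand = [(mean[0] + sigma[0] * rng.gauss(0, 1),
                 mean[1] + sigma[1] * rng.gauss(0, 1))
                for _ in range(pop)]
        # Select the elite candidates by (lower) validation error.
        cand.sort(key=lambda c: validation_error(*c))
        best = cand[:elite]
        # Move the mean toward the elite and shrink the step size.
        mean = [sum(c[i] for c in best) / elite for i in range(2)]
        sigma = [s * 0.95 for s in sigma]
    return mean

best = evolve()
print(best)  # should approach log_lr ~ -3, hidden_units ~ 64
```

Full CMAES additionally adapts a covariance matrix of the search distribution, and jSO adapts DE control parameters per generation; this sketch only keeps the sample-select-recombine skeleton they share.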
Publication size in sheets: 0.3
Book: Romaniuk Ryszard, Linczuk Maciej Grzegorz (eds.): Proceedings of SPIE: Photonics Applications in Astronomy, Communications, Industry, and High-Energy Physics Experiments 2018, vol. 10808, 2018, SPIE - the International Society for Optics and Photonics, ISBN 9781510622036, 2086 p.
Keywords in English: evolutionary algorithm, optimization, tuning, hyperparameter, neural network
Score: 15.0, 16-10-2018, BookChapterMatConf