Optimizing connection weights in neural networks using the whale optimization algorithm

Ibrahim Aljarah, Hossam Faris, Seyedali Mirjalili

Research output: Contribution to journal › Article › peer-review

607 Citations (Scopus)

Abstract

The learning process of artificial neural networks is considered one of the most difficult challenges in machine learning and has recently attracted many researchers. The main difficulty of training a neural network is its nonlinear nature and the unknown best set of main controlling parameters (weights and biases). The main disadvantages of conventional training algorithms are stagnation in local optima and slow convergence speed, which makes stochastic optimization algorithms a reliable alternative for alleviating these drawbacks. This work proposes a new training algorithm based on the recently proposed whale optimization algorithm (WOA). This algorithm has been shown to solve a wide range of optimization problems and outperform existing algorithms, which motivated our attempt to benchmark its performance in training feedforward neural networks. For the first time in the literature, a set of 20 datasets with different levels of difficulty is chosen to test the proposed WOA-based trainer. The results are verified by comparisons with the back-propagation algorithm and six evolutionary techniques. The qualitative and quantitative results prove that the proposed trainer outperforms the current algorithms on the majority of datasets in terms of both local optima avoidance and convergence speed.
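To make the approach concrete, the sketch below shows how a WOA-based trainer of this kind can be assembled: each whale is a flat vector encoding all weights and biases of a feedforward network, fitness is the training-set mean squared error, and the population is moved with the canonical WOA updates (encircling the best solution, random search, and the spiral bubble-net move) from Mirjalili and Lewis (2016). This is a minimal illustrative sketch, not the authors' implementation; the network shape (one sigmoid hidden layer), the function names (mlp_forward, woa_train), and all hyperparameter defaults are assumptions for illustration.

```python
import numpy as np

def mlp_forward(params, X, n_in, n_hidden, n_out):
    """Decode a flat parameter vector into weights/biases and run a forward pass."""
    i = 0
    W1 = params[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = params[i:i + n_hidden]; i += n_hidden
    W2 = params[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = params[i:i + n_out]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))      # sigmoid hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output layer

def mse_fitness(params, X, y, dims):
    """Fitness = mean squared error of the decoded network on the training set."""
    return np.mean((mlp_forward(params, X, *dims) - y) ** 2)

def woa_train(X, y, n_hidden, n_whales=30, n_iters=200, b=1.0, seed=0):
    """Train an MLP with WOA; population sizes and ranges here are assumed, not from the paper."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], y.shape[1]
    dims = (n_in, n_hidden, n_out)
    n_params = n_in * n_hidden + n_hidden + n_hidden * n_out + n_out
    whales = rng.uniform(-1, 1, size=(n_whales, n_params))  # each whale = one candidate weight vector
    fitness = np.array([mse_fitness(w, X, y, dims) for w in whales])
    best, best_fit = whales[fitness.argmin()].copy(), fitness.min()
    for t in range(n_iters):
        a = 2.0 - 2.0 * t / n_iters                  # 'a' decreases linearly from 2 to 0
        for i in range(n_whales):
            r1, r2 = rng.random(), rng.random()
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                if abs(A) < 1:                       # exploitation: encircle the best whale
                    D = np.abs(C * best - whales[i])
                    whales[i] = best - A * D
                else:                                # exploration: move relative to a random whale
                    rand = whales[rng.integers(n_whales)]
                    D = np.abs(C * rand - whales[i])
                    whales[i] = rand - A * D
            else:                                    # spiral (bubble-net) update around the best whale
                l = rng.uniform(-1, 1)
                D = np.abs(best - whales[i])
                whales[i] = D * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            f = mse_fitness(whales[i], X, y, dims)
            if f < best_fit:
                best, best_fit = whales[i].copy(), f
    return best, best_fit
```

Calling woa_train(X, y, n_hidden=10) on a dataset with targets scaled to [0, 1] would return the best weight vector found and its training MSE; in practice the initialization range, population size, and iteration budget would need tuning per dataset.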

Original language: English
Journal: Soft Computing
Volume: 22
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2018
Externally published: Yes

Keywords

  • Evolutionary algorithm
  • MLP
  • Multilayer perceptron
  • Optimization
  • Training neural network
  • Whale optimization algorithm
  • WOA
