Training feedforward neural networks using multi-verse optimizer for binary classification problems

Hossam Faris, Ibrahim Aljarah, Seyedali Mirjalili

Research output: Contribution to journal › Article › peer-review

184 Citations (Scopus)

Abstract

This paper employs the recently proposed nature-inspired algorithm called Multi-Verse Optimizer (MVO) for training the Multi-layer Perceptron (MLP) neural network. The new training approach is benchmarked and evaluated using nine different bio-medical datasets selected from the UCI machine learning repository. The results are compared to those of five classical and recent evolutionary metaheuristic algorithms: Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Differential Evolution (DE), Firefly Algorithm (FF), and Cuckoo Search (CS). In addition, the results are compared with two well-regarded conventional gradient-based training methods: Back-Propagation (BP) and the Levenberg-Marquardt (LM) algorithm. The comparative study demonstrates that MVO is very competitive and outperforms the other training algorithms on the majority of datasets in terms of local optima avoidance and convergence speed.
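To illustrate the general idea behind metaheuristic training of an MLP, the sketch below is a minimal, hypothetical example (not the authors' code or the MVO algorithm itself): it assumes a single hidden layer with a sigmoid output, encodes all weights and biases as one flat real-valued vector, and uses the training misclassification rate as the fitness to minimize. A plain random search stands in as a placeholder for the population-based optimizer (MVO, GA, PSO, etc.) that would search this vector space.

import numpy as np

def n_params(n_inputs, n_hidden):
    # One hidden layer and one output unit (binary classification):
    # input-to-hidden weights + hidden biases + hidden-to-output weights + output bias.
    return n_inputs * n_hidden + n_hidden + n_hidden + 1

def decode(vector, n_inputs, n_hidden):
    # Unpack the flat candidate vector into the MLP's weight matrices and bias vectors.
    i = 0
    W1 = vector[i:i + n_inputs * n_hidden].reshape(n_inputs, n_hidden); i += n_inputs * n_hidden
    b1 = vector[i:i + n_hidden]; i += n_hidden
    W2 = vector[i:i + n_hidden].reshape(n_hidden, 1); i += n_hidden
    b2 = vector[i:i + 1]
    return W1, b1, W2, b2

def fitness(vector, X, y, n_hidden):
    # Fitness = misclassification rate of the decoded MLP on the training data.
    W1, b1, W2, b2 = decode(vector, X.shape[1], n_hidden)
    hidden = np.tanh(X @ W1 + b1)
    output = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))  # sigmoid output
    predictions = (output.ravel() >= 0.5).astype(int)
    return np.mean(predictions != y)

def random_search(X, y, n_hidden=5, pop_size=30, iterations=200, seed=0):
    # Placeholder optimizer: any population-based metaheuristic would evaluate
    # candidate weight vectors with the same fitness function.
    rng = np.random.default_rng(seed)
    dim = n_params(X.shape[1], n_hidden)
    best_vec, best_fit = None, np.inf
    for _ in range(iterations):
        population = rng.uniform(-1.0, 1.0, size=(pop_size, dim))
        for candidate in population:
            f = fitness(candidate, X, y, n_hidden)
            if f < best_fit:
                best_vec, best_fit = candidate, f
    return best_vec, best_fit

The choice of fitness function (misclassification rate here; mean squared error is another common option) and the encoding of all network parameters into a single vector are what allow a gradient-free optimizer to be swapped in for back-propagation.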

Original language: English
Pages (from-to): 322-332
Number of pages: 11
Journal: Applied Intelligence
Volume: 45
Issue number: 2
DOIs
Publication status: Published - 1 Sept 2016
Externally published: Yes

Keywords

  • Evolutionary algorithm
  • MLP
  • Multi-verse optimizer
  • Multilayer perceptron
  • MVO
  • Training neural network
