Let a biogeography-based optimizer train your Multi-Layer Perceptron

Seyedali Mirjalili, Seyed Mohammad Mirjalili, Andrew Lewis

Research output: Contribution to journal › Article

159 Citations (Scopus)

Abstract

The Multi-Layer Perceptron (MLP), one of the most widely used Neural Networks (NNs), has been applied to many practical problems. An MLP must be trained for each specific application, and training often suffers from entrapment in local minima, slow convergence, and sensitivity to initialization. This paper proposes using the recently developed Biogeography-Based Optimization (BBO) algorithm to train MLPs and mitigate these problems. To investigate the efficiency of BBO in training MLPs, five classification datasets and six function-approximation datasets are employed. The results are compared with those of five well-known heuristic algorithms, Back Propagation (BP), and the Extreme Learning Machine (ELM) in terms of entrapment in local minima, accuracy, and convergence rate. The results show that training MLPs with BBO is significantly better than the current heuristic learning algorithms and BP. Moreover, BBO provides very competitive results in comparison with ELM.
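To make the idea concrete, here is a minimal sketch of BBO training a tiny MLP. It is not the paper's experimental setup (architectures, datasets, and parameter settings there differ); the 2-4-1 network, the XOR task, and all rates below are illustrative assumptions. Each "habitat" is a flattened weight vector, habitats are ranked by mean squared error, and features migrate from high-emigration (fit) habitats to high-immigration (unfit) ones, with occasional random mutation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR, solved by an assumed 2-4-1 MLP with tanh hidden units.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
DIM = 2 * 4 + 4 + 4 * 1 + 1  # 17 trainable parameters, flattened

def mse(w):
    """Habitat suitability: MSE of the MLP encoded by weight vector w."""
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16].reshape(4, 1), w[16]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-((h @ W2).ravel() + b2)))
    return np.mean((out - y) ** 2)

def bbo(pop_size=30, generations=200, p_mut=0.05):
    pop = rng.uniform(-2, 2, (pop_size, DIM))  # population of habitats
    for _ in range(generations):
        fit = np.array([mse(h) for h in pop])
        pop = pop[np.argsort(fit)]               # best (lowest MSE) first
        ranks = np.arange(pop_size)
        mu = 1 - ranks / (pop_size - 1)          # emigration: high for the best
        lam = 1 - mu                             # immigration: high for the worst
        new_pop = pop.copy()
        for i in range(1, pop_size):             # habitat 0 kept as elite
            for d in range(DIM):
                if rng.random() < lam[i]:
                    # Roulette-wheel pick of an emigrating habitat by mu.
                    j = rng.choice(pop_size, p=mu / mu.sum())
                    new_pop[i, d] = pop[j, d]
                if rng.random() < p_mut:
                    new_pop[i, d] = rng.uniform(-2, 2)
        pop = new_pop
    fit = np.array([mse(h) for h in pop])
    return pop[np.argmin(fit)], fit.min()

best_w, best_err = bbo()
print(f"best MSE on XOR: {best_err:.4f}")
```

Because the elite habitat is carried over each generation, the best MSE never increases, which reflects the convergence behavior the abstract compares across algorithms.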

Original language: English
Pages (from-to): 188-209
Number of pages: 22
Journal: Information Sciences
Volume: 269
DOIs
Publication status: Published - 10 Jun 2014
Externally published: Yes

Keywords

  • BBO
  • Biogeography-Based Optimization
  • Evolutionary algorithm
  • FNN
  • Learning neural network
  • Neural network