Abstract
The Multi-Layer Perceptron (MLP), one of the most widely used Neural Networks (NNs), has been applied to many practical problems. An MLP must be trained for each specific application, and the training process often suffers from entrapment in local minima, slow convergence, and sensitivity to initialization. This paper proposes using the recently developed Biogeography-Based Optimization (BBO) algorithm to train MLPs and thereby alleviate these problems. To investigate the efficiency of BBO in training MLPs, five classification datasets and six function-approximation datasets are employed. The results are compared with those of five well-known heuristic algorithms, Back-Propagation (BP), and the Extreme Learning Machine (ELM) in terms of entrapment in local minima, result accuracy, and convergence rate. The results show that training MLPs with BBO is significantly better than training them with the current heuristic learning algorithms or BP. Moreover, the results show that BBO provides very competitive results in comparison with ELM.
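The paper itself is not reproduced here, so the sketch below is only a rough illustration of the general idea (not the authors' implementation): all weights and biases of a small single-hidden-layer MLP are flattened into one real-valued vector, and a simplified BBO-style loop (rank-based migration plus random mutation and elitism) searches for the vector that minimizes the mean squared error. The architecture, population size, migration and mutation rates, and the toy XOR dataset are assumptions chosen for illustration only.

```python
import numpy as np

# Illustrative single-hidden-layer MLP; architecture and hyper-parameters
# below are assumptions, not values taken from the paper.
N_IN, N_HID, N_OUT = 2, 5, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weights + biases

def mlp_forward(params, X):
    """Decode the flat parameter vector and run a forward pass."""
    i = 0
    W1 = params[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = params[i:i + N_HID]; i += N_HID
    W2 = params[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = params[i:i + N_OUT]
    h = np.tanh(X @ W1 + b1)                         # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # sigmoid output

def mse(params, X, y):
    """Fitness of a candidate solution: MSE of the MLP it encodes."""
    return float(np.mean((mlp_forward(params, X) - y) ** 2))

def bbo_train(X, y, pop_size=30, generations=200, p_mut=0.05, seed=0):
    """Simplified BBO loop: each habitat is a full set of MLP parameters."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-1, 1, size=(pop_size, DIM))
    for _ in range(generations):
        fitness = np.array([mse(ind, X, y) for ind in pop])
        pop = pop[np.argsort(fitness)]               # best (lowest MSE) first
        # Linear emigration/immigration rates derived from rank.
        mu = 1.0 - np.arange(pop_size) / (pop_size - 1)   # emigration rate
        lam = 1.0 - mu                                     # immigration rate
        new_pop = pop.copy()
        for k in range(pop_size):
            for d in range(DIM):
                if rng.random() < lam[k]:
                    # Roulette-wheel choice of an emigrating habitat.
                    src = rng.choice(pop_size, p=mu / mu.sum())
                    new_pop[k, d] = pop[src, d]
                if rng.random() < p_mut:
                    new_pop[k, d] = rng.uniform(-1, 1)
        new_pop[0] = pop[0]                          # elitism: keep the best
        pop = new_pop
    fitness = np.array([mse(ind, X, y) for ind in pop])
    return pop[np.argmin(fitness)], float(fitness.min())

if __name__ == "__main__":
    # XOR as a tiny classification problem (illustrative only).
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    best, err = bbo_train(X, y)
    print("final MSE:", err)
    print("predictions:", mlp_forward(best, X).ravel().round(2))
```

The key design choice this sketch tries to convey is that the optimizer never sees gradients: it treats the whole parameter vector as a candidate solution and only queries the MSE, which is what lets a population-based method such as BBO sidestep the local-minima and initialization issues of gradient-based BP discussed in the abstract.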
| Original language | English |
| --- | --- |
| Pages (from-to) | 188-209 |
| Number of pages | 22 |
| Journal | Information Sciences |
| Volume | 269 |
| DOIs | |
| Publication status | Published - 10 Jun 2014 |
| Externally published | Yes |
Keywords
- BBO
- Biogeography-Based Optimization
- Evolutionary algorithm
- FNN
- Learning neural network
- Neural network