Abstract
In the literature, metaheuristics are proposed as alternatives to traditional hyperparameter-tuning techniques for Machine Learning (ML) models, such as grid search, gradient descent, randomized search, and experimental methods. Metaheuristics offer fast convergence, flexibility, robustness, the ability to escape local optima, and intelligent search behavior. The performance of ML models depends heavily on choosing optimal hyperparameters. This chapter explores the Whale Optimization Algorithm (WOA) and five selected improved variants for optimally tuning the hyperparameters of the Support Vector Machine (SVM). SVMs are margin-based ML classifiers that are popular owing to their simplicity, strong generalization, and accuracy. SVM hyperparameter tuning is analyzed on seven standard datasets: Breast Cancer Wisconsin, Iris, MNIST, Wine Recognition, Glass Identification, Car Evaluation, and Indian Liver Patient.
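The idea described above can be sketched in a few dozen lines. The following is a minimal, illustrative WOA implementation in pure Python, not the chapter's code: it searches a 2-D space standing in for SVM hyperparameters (log10 C, log10 gamma), and the quadratic `surrogate_cv_error` function is an assumed stand-in for the cross-validation error a real tuner would obtain by training and scoring an SVM at each candidate point. All names and the bound choices are illustrative assumptions.

```python
import math
import random

def woa_minimize(objective, bounds, n_whales=20, n_iter=60, seed=0):
    """Minimal Whale Optimization Algorithm sketch.

    bounds: list of (low, high) pairs, one per dimension.
    Returns (best_position, best_value).
    """
    rng = random.Random(seed)

    def clip(x):
        # Keep candidates inside the search bounds.
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    # Initialize the whale population uniformly inside the bounds.
    whales = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_whales)]
    best = min(whales, key=objective)[:]
    best_val = objective(best)

    for t in range(n_iter):
        a = 2 - 2 * t / n_iter              # 'a' decreases linearly from 2 to 0
        for i, x in enumerate(whales):
            A = 2 * a * rng.random() - a    # exploration/exploitation coefficient
            C = 2 * rng.random()
            if rng.random() < 0.5:
                if abs(A) < 1:
                    # Encircling prey: move toward the current best whale.
                    target = best
                else:
                    # Exploration: move relative to a randomly chosen whale.
                    target = whales[rng.randrange(n_whales)]
                new = [tj - A * abs(C * tj - xj) for tj, xj in zip(target, x)]
            else:
                # Bubble-net attack: logarithmic spiral around the best whale.
                l = rng.uniform(-1, 1)
                new = [abs(bj - xj) * math.exp(l) * math.cos(2 * math.pi * l) + bj
                       for bj, xj in zip(best, x)]
            whales[i] = clip(new)
            val = objective(whales[i])
            if val < best_val:
                best, best_val = whales[i][:], val
    return best, best_val

# Assumed surrogate for SVM cross-validation error over (log10 C, log10 gamma);
# a real tuner would train an SVM and return 1 - CV accuracy here.
def surrogate_cv_error(x):
    log_c, log_gamma = x
    return (log_c - 2.0) ** 2 + (log_gamma + 3.0) ** 2

pos, val = woa_minimize(surrogate_cv_error, bounds=[(-3.0, 6.0), (-6.0, 2.0)])
```

In a real run, `surrogate_cv_error` would be replaced by a function that trains an SVM with `C = 10 ** pos[0]` and `gamma = 10 ** pos[1]` and returns its validation error; the improved WOA variants discussed in the chapter alter the update rules above rather than this overall loop.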
| Original language | English |
| --- | --- |
| Title of host publication | Handbook of Whale Optimization Algorithm |
| Subtitle of host publication | Variants, Hybrids, Improvements, and Applications |
| Publisher | Elsevier |
| Pages | 495-521 |
| Number of pages | 27 |
| ISBN (Electronic) | 9780323953658 |
| ISBN (Print) | 9780323953641 |
| DOIs | |
| Publication status | Published - 1 Jan 2023 |
Keywords
- Classification
- Datasets
- Hyperparameters
- ML
- SVM
- WOA