Tuning SVMs' hyperparameters using the whale optimization algorithm

Sunday O. Oladejo, Stephen O. Ekwe, Adedotun T. Ajibare, Lateef A. Akinyemi, Seyedali Mirjalili

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

2 Citations (Scopus)

Abstract

In the literature, metaheuristics have been proposed as alternatives to traditional hyperparameter-tuning techniques for Machine Learning (ML) models, such as grid search, gradient descent, randomized search, and experimental methods. Metaheuristics offer fast convergence, flexibility, robustness, local-optima avoidance, and intelligence, and the performance of ML models depends strongly on the choice of hyperparameters. This chapter explores the Whale Optimization Algorithm (WOA) and five of its selected improved variants for optimally tuning the hyperparameters of the Support Vector Machine (SVM). SVMs are margin-based ML classifiers, popular for their simplicity, strong generalization, and accuracy. SVM hyperparameter tuning is analyzed on seven standard datasets: Breast Cancer Wisconsin, Iris, MNIST, Wine Recognition, Glass Identification, Car Evaluation, and Indian Liver Patient.
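To illustrate the approach the abstract describes, below is a minimal, dependency-free sketch of the standard WOA update rules (shrinking encirclement of the best solution, random-whale search, and the spiral bubble-net move) searching over log10(C) and log10(gamma), the two SVM hyperparameters most commonly tuned. This is not the chapter's implementation: the quadratic objective is a hypothetical stand-in for cross-validation error, and in practice `objective` would train an SVM on each candidate pair and return its validation error.

```python
import math
import random


def woa_minimize(objective, bounds, n_whales=20, n_iters=100, seed=0):
    """Basic Whale Optimization Algorithm over box-constrained parameters.

    bounds: list of (lo, hi) per dimension, e.g. ranges for log10(C), log10(gamma).
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def clip(x):
        return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]

    # Random initial population of candidate hyperparameter vectors.
    whales = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_whales)]
    best = min(whales, key=objective)
    best_f = objective(best)

    for t in range(n_iters):
        a = 2 - 2 * t / n_iters  # control parameter, decreases linearly 2 -> 0
        for i, x in enumerate(whales):
            r1, r2 = rng.random(), rng.random()
            A = 2 * a * r1 - a
            C = 2 * r2
            if rng.random() < 0.5:
                if abs(A) < 1:
                    target = best  # exploit: encircle the best whale
                else:
                    target = whales[rng.randrange(n_whales)]  # explore: random whale
                new = [target[d] - A * abs(C * target[d] - x[d]) for d in range(dim)]
            else:
                # Spiral bubble-net move around the best solution.
                l = rng.uniform(-1, 1)
                new = [abs(best[d] - x[d]) * math.exp(l) * math.cos(2 * math.pi * l)
                       + best[d] for d in range(dim)]
            whales[i] = clip(new)
            f = objective(whales[i])
            if f < best_f:
                best, best_f = list(whales[i]), f

    return best, best_f


# Hypothetical stand-in for SVM cross-validation error, with a known optimum
# at log10(C) = 1, log10(gamma) = -2; a real run would fit and score an SVM here.
surrogate_cv_error = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2

best, best_f = woa_minimize(surrogate_cv_error, bounds=[(-3, 3), (-5, 1)])
```

Here `best` holds the tuned (log10(C), log10(gamma)) pair and `best_f` the surrogate error; swapping the surrogate for an actual SVM training-and-scoring routine yields the tuning loop the chapter studies.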

Original language: English
Title of host publication: Handbook of Whale Optimization Algorithm
Subtitle of host publication: Variants, Hybrids, Improvements, and Applications
Publisher: Elsevier
Pages: 495-521
Number of pages: 27
ISBN (Electronic): 9780323953658
ISBN (Print): 9780323953641
DOIs
Publication status: Published - 1 Jan 2023

Keywords

  • Classification
  • Datasets
  • Hyperparameters
  • ML
  • SVM
  • WOA
