Binary Harris Hawks Optimizer for High-Dimensional, Low Sample Size Feature Selection

Algorithms for Intelligent Systems, 2019

Abstract

Feature selection is a preprocessing step that aims to eliminate features that may negatively influence the performance of machine learning techniques, typically because many of them are irrelevant and/or redundant. In this chapter, a binary variant of the recent Harris hawks optimizer (HHO) is proposed to boost the efficacy of wrapper-based feature selection techniques. HHO is a fast and efficient swarm-based optimizer with a dynamic structure and several simple but effective exploratory and exploitative mechanisms (Lévy flight, greedy selection, etc.); however, it was originally designed for continuous search spaces. To deal with binary feature spaces, we propose a new binary HHO. The binary HHO is validated on a challenging class of feature selection datasets that are high dimensional, containing a very large number of features, while offering only a small number of samples. Various experiments and comparisons reveal the improved stability of the binary HHO in dealing with this type of dataset.
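The abstract does not describe the binarization step in detail. As a rough, hypothetical sketch of the general idea behind wrapper-based binary metaheuristics, a transfer function (a sigmoid here, an assumed rather than confirmed choice) can map each real-valued coordinate of a hawk's position to a selection probability, and a wrapper classifier (k-NN is assumed here) scores the resulting feature subset by combining classification error with subset size. The function names and the fitness weighting are illustrative assumptions, not the chapter's exact scheme.

```python
# Hypothetical sketch: binarizing a continuous optimizer position for
# wrapper feature selection. The transfer function, classifier, and
# fitness weights below are assumptions, not the chapter's exact method.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def sigmoid_transfer(position):
    """Map each real-valued coordinate to a selection probability."""
    return 1.0 / (1.0 + np.exp(-position))

def binarize(position, rng):
    """Draw a binary feature mask from the transfer probabilities."""
    probs = sigmoid_transfer(position)
    return (rng.random(position.shape) < probs).astype(int)

def wrapper_fitness(mask, X, y, alpha=0.99):
    """Wrapper fitness: weighted classification error plus subset-size ratio."""
    if mask.sum() == 0:  # penalize empty feature subsets
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=3).mean()
    error = 1.0 - acc
    ratio = mask.sum() / mask.size
    return alpha * error + (1 - alpha) * ratio  # lower is better

# Usage: score one candidate hawk position on synthetic
# high-dimensional, low-sample-size data.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))       # few samples, many features
y = rng.integers(0, 2, size=60)
hawk = rng.normal(size=X.shape[1])   # continuous HHO position vector
mask = binarize(hawk, rng)
print(wrapper_fitness(mask, X, y))
```

In a full binary HHO, this fitness would guide the HHO update rules (exploration, besiege, and rapid-dive phases), with the binarization applied each time a hawk's continuous position is updated.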
