| Authors | Hamid Saadatfar, Hamed SabbaghGol, Mahdi Khazaiepoor |
| --- | --- |
| Journal | Knowledge-Based Systems |
| Pages | 1-19 |
| Serial number | 285 |
| Volume | 1 |
| Article type | Full Paper |
| Publication date | 2024 |
| Journal type | Electronic |
| Country of publication | Iran |
| Journal indexing | ISI, JCR, Scopus |
Abstract
Datasets often include excessive or irrelevant data that affect the performance and complexity of a machine learning model. Feature selection is one of the most effective dimensionality-reduction techniques for addressing this problem. Reducing the number of features can increase classification accuracy and decrease computational cost. Wrapper feature selection algorithms are a popular and effective category of these methods that take the learning method's feedback into account. One well-known algorithm in this area is the random subset feature selection algorithm (RSFS). This study proposes an improved version with a higher convergence speed, a lower feature selection rate, and higher classification accuracy. The improvements include considering feature interactions within the selected subsets, continuously evaluating the selected features to avoid getting stuck in local optima, and enhancing the method's evolution phase. To comprehensively evaluate the proposed algorithm, it was applied to 20 standard, well-known datasets from public repositories and compared with various recent related methods. Experimental results demonstrate that the proposed method reduced classification error using fewer features than the other methods, achieving the highest ranking in the Friedman and Wilcoxon rank-sum tests.
tags: Feature selection; Dimension reduction; RSFS algorithm; KNN classifier.
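To make the wrapper idea described in the abstract concrete, below is a minimal Python sketch of a generic random-subset wrapper selector that uses a KNN classifier as the evaluating learner, in the spirit of RSFS. It is not the paper's improved algorithm: the stand-in dataset, the 300-iteration budget, the subset size of one third of the features, and the positive-credit selection rule are all illustrative assumptions.

```python
# Minimal sketch of the generic random-subset wrapper idea (NOT the authors'
# improved RSFS): random feature subsets are scored with a KNN classifier and
# each feature accumulates credit from the subsets it appears in.
# The dataset, iteration budget, subset size and selection rule are assumptions.
import numpy as np
from sklearn.datasets import load_breast_cancer        # stand-in public dataset
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

relevance = np.zeros(n_features)   # running credit per feature
counts = np.zeros(n_features)      # how often each feature was drawn
history = []                       # accuracies of all subsets seen so far

for _ in range(300):                                   # assumed iteration budget
    subset = rng.choice(n_features, size=n_features // 3, replace=False)
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, subset], y, cv=5).mean()
    history.append(acc)
    # credit each drawn feature by how much its subset beat the running mean
    relevance[subset] += acc - np.mean(history)
    counts[subset] += 1

avg_credit = relevance / np.maximum(counts, 1)
selected = np.flatnonzero(avg_credit > 0)              # features with positive credit
if selected.size == 0:                                 # fallback so the sketch always selects something
    selected = np.argsort(avg_credit)[-5:]

full = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5).mean()
part = cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, selected], y, cv=5).mean()
print(f"selected {selected.size}/{n_features} features; "
      f"KNN accuracy all={full:.3f} vs selected={part:.3f}")
```

The sketch only illustrates the wrapper loop with learner feedback; the paper's contributions (feature-interaction handling, continuous re-evaluation of selected features, and the enhanced evolution phase) are not represented here.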