High-Level K-Nearest Neighbors (HLKNN): A Supervised Machine Learning Model for Classification Analysis


Ozturk Kiyak E., Ghasemkhani B., Birant D.

ELECTRONICS, vol.12, no.18, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 12 Issue: 18
  • Publication Date: 2023
  • DOI Number: 10.3390/electronics12183828
  • Journal Name: ELECTRONICS
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Communication Abstracts, INSPEC, Metadex, Directory of Open Access Journals, Civil Engineering Abstracts
  • Keywords: artificial intelligence, classification, k-nearest neighbors, machine learning, supervised learning
  • Dokuz Eylül University Affiliated: Yes

Abstract

The k-nearest neighbors (KNN) algorithm has been widely used for classification analysis in machine learning. However, it is sensitive to noisy samples, which degrade its classification ability and therefore its prediction accuracy. This article introduces the high-level k-nearest neighbors (HLKNN) method, a new technique that enhances the standard KNN algorithm by effectively addressing the noise problem and improving its classification performance. Instead of considering only the k neighbors of a given query instance, HLKNN also takes into account the neighbors of those neighbors. Experiments were conducted on 32 well-known benchmark datasets. The results showed that the proposed HLKNN method outperformed the standard KNN method, with average accuracy values of 81.01% and 79.76%, respectively. In addition, the experiments demonstrated the superiority of HLKNN over previous KNN variants in terms of accuracy across various datasets.
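
The full details of HLKNN are given in the article itself; the minimal sketch below only illustrates the neighbors-of-neighbors idea described in the abstract, using scikit-learn for the neighbor search. The function name hlknn_predict, the parameter k, and the aggregation by simple majority vote over the expanded neighborhood are illustrative assumptions, not the paper's exact procedure or weighting scheme.

```python
import numpy as np
from collections import Counter
from sklearn.neighbors import NearestNeighbors

def hlknn_predict(X_train, y_train, X_query, k=5):
    """Sketch of a neighbors-of-neighbors KNN classifier (assumed variant).

    For each query point, collect its k nearest training samples and,
    additionally, the k nearest training neighbors of each of those
    samples; the predicted label is the majority class over this
    expanded neighborhood.
    """
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)

    # k nearest training indices for every query instance
    _, query_nbrs = nn.kneighbors(X_query)

    # k nearest training indices for every training instance
    # (the first column is the point itself, so request k + 1 and drop it)
    _, train_nbrs = nn.kneighbors(X_train, n_neighbors=k + 1)
    train_nbrs = train_nbrs[:, 1:]

    predictions = []
    for nbrs in query_nbrs:
        # expanded neighborhood: the k neighbors plus their own neighbors
        expanded = set(nbrs)
        for idx in nbrs:
            expanded.update(train_nbrs[idx])
        # majority vote over the expanded neighborhood (assumed aggregation)
        votes = Counter(y_train[i] for i in expanded)
        predictions.append(votes.most_common(1)[0][0])
    return np.array(predictions)
```

The intuition is that a label supported by a broader, second-level neighborhood is less likely to be swayed by a single mislabeled or noisy neighbor than a vote taken over the k nearest points alone.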