Adaptive Classifier Ensemble Method Based on Spatial Perception for High-Dimensional Data Classification

Yuhong Xu, Zhiwen Yu, Wenming Cao, C. L. Philip Chen, Jia You

Research output: Journal article (Academic research, peer-reviewed)


Classifying high-dimensional data with small sample sizes is challenging in the field of pattern recognition. Traditional ensemble learning methods have several limitations: 1) sample-space-based methods are easily affected by noise and redundant features; 2) feature-space-based methods cannot extract the essential characteristics of features; 3) feature subspaces cause information loss, which leads to a decline in accuracy; 4) most selective ensemble methods only consider the diversity and performance of sub-classifiers and ignore their impact on the integrated system. To address these limitations, we propose an adaptive classifier ensemble learning method (AdaSPEL) based on spatial perception for high-dimensional data. First, we design a local-space perception method for feature transformation, which encourages both high performance and diversity of the ensemble members. Second, we design a cross-space perception method based on the distribution of samples to obtain cross-space enhanced features, providing a macro-level analysis of the characteristics of the data. Furthermore, we propose an adaptive selective ensemble method based on local and global evaluation mechanisms, which accounts for the impact of sub-classifiers on the integrated system. Experimental results on 33 high-dimensional data sets verify that our method outperforms mainstream ensemble learning methods based on feature space and sample space, as well as neural network-based algorithms.
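To make the general idea concrete, the following is a minimal sketch of a selective subspace ensemble: members are trained on random feature subspaces of high-dimensional, small-sample data, then selected by validation accuracy before voting. This is a generic illustration only, not the paper's AdaSPEL algorithm; the spatial-perception feature transforms and the local/global evaluation mechanisms are defined in the paper itself, and the threshold-by-mean-accuracy selection rule here is a crude stand-in.

```python
# Hedged sketch of a selective subspace ensemble (NOT the authors' AdaSPEL).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# High-dimensional, small-sample data: 120 samples, 500 features.
X, y = make_classification(n_samples=120, n_features=500,
                           n_informative=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3,
                                            random_state=0)

# Train each member on a random 50-feature subspace.
members = []
for _ in range(15):
    feats = rng.choice(X.shape[1], size=50, replace=False)
    clf = LogisticRegression(max_iter=1000).fit(X_tr[:, feats], y_tr)
    acc = clf.score(X_val[:, feats], y_val)
    members.append((clf, feats, acc))

# Selective step: keep only members at or above the mean validation
# accuracy (a stand-in for a more principled selection criterion).
mean_acc = np.mean([m[2] for m in members])
selected = [m for m in members if m[2] >= mean_acc]

# Majority vote of the selected members.
votes = np.stack([clf.predict(X_val[:, feats])
                  for clf, feats, _ in selected])
pred = (votes.mean(axis=0) >= 0.5).astype(int)
ensemble_acc = (pred == y_val).mean()
print(len(selected), float(ensemble_acc))
```

Feature subspaces discard information (limitation 3 in the abstract), which is exactly what the paper's cross-space enhanced features are designed to compensate for.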
Original language: English
Pages (from-to): 2847-2862
Journal: IEEE Transactions on Knowledge and Data Engineering
Issue number: 7
Publication status: Published - Jul 2021


