Minimum-maximum local structure information for feature selection

Wenjun Hu, Kup Sze Choi, Yonggen Gu, Shitong Wang

Research output: Journal article publication › Journal article › Academic research › peer-review

17 Citations (Scopus)


Feature selection methods have been extensively applied in machine learning tasks such as computer vision, pattern recognition, and data mining. These methods aim to identify a subset of the original features with high discriminating power. Among them, feature selection techniques for unsupervised tasks are particularly attractive, since the cost of obtaining data labels and/or between-class information is often high. On the other hand, because the low-dimensional manifold of the "same" class data is usually revealed by considering only the local invariance of the data structure, such approaches may not be adequate for unsupervised tasks where class information is completely absent. In this paper, a novel feature selection method, called Minimum-maximum local structure information Laplacian Score (MMLS), is proposed to simultaneously minimize the within-locality information (i.e., preserving the manifold structure of the "same" class data) and maximize the between-locality information (i.e., maximizing the information between the manifold structures of the "different" class data). The effectiveness of the proposed algorithm is demonstrated with experiments on classification and clustering.
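To make the graph-based scoring idea concrete, below is a minimal sketch of the classical Laplacian Score (He et al., 2005) on which MMLS builds; it is not the MMLS algorithm itself, which additionally maximizes between-locality information. All function and parameter names here are illustrative assumptions; a feature that better preserves the local manifold structure of the data receives a lower score.

```python
import numpy as np

def laplacian_score(X, n_neighbors=5, t=1.0):
    """Classical Laplacian Score for unsupervised feature selection.

    X : (n_samples, n_features) data matrix.
    Returns one score per feature; lower means the feature better
    preserves local (within-locality) structure.
    NOTE: illustrative sketch, not the MMLS method of the paper.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.fill_diagonal(d2, np.inf)  # exclude self from neighbor search

    # k-nearest-neighbor graph with heat-kernel edge weights.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[:n_neighbors]
        W[i, idx] = np.exp(-d2[i, idx] / t)
    W = np.maximum(W, W.T)          # symmetrize the graph

    d = W.sum(axis=1)               # degrees
    L = np.diag(d) - W              # graph Laplacian

    scores = np.empty(X.shape[1])
    for r in range(X.shape[1]):
        f = X[:, r].astype(float)
        f = f - (f @ d) / d.sum()   # remove the degree-weighted mean
        scores[r] = (f @ L @ f) / max((f * d) @ f, 1e-12)
    return scores
```

A feature that is nearly constant within each local neighborhood (small `f @ L @ f`) but varies across the data set (large degree-weighted variance) scores low and is kept; MMLS extends this criterion by also pushing apart the manifold structures of "different" class data.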
Original language: English
Pages (from-to): 527-535
Number of pages: 9
Journal: Pattern Recognition Letters
Issue number: 5
Publication status: Published - 4 Feb 2013


Keywords

  • Feature selection
  • Laplacian Eigenmap
  • Laplacian Score
  • Locality preserving
  • Manifold learning

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence

