On minimum class locality preserving variance support vector machine

Xiaoming Wang, Fu Lai Korris Chung, Shitong Wang

Research output: Journal article publication › Journal article › Academic research › peer-review

45 Citations (Scopus)

Abstract

In this paper, a so-called minimum class locality preserving variance support vector machine (MCLPV_SVM) algorithm is presented by introducing the basic idea of locality preserving projections (LPP); it can be seen as a modified version of the support vector machine (SVM) and/or the minimum class variance support vector machine (MCVSVM). In contrast to SVM and MCVSVM, MCLPV_SVM takes the intrinsic manifold structure of the data space into full consideration while inheriting the characteristics of SVM and MCVSVM. The paper discusses the linear case, the small sample size case and the nonlinear case of MCLPV_SVM. Similar to MCVSVM, the MCLPV_SVM optimization problem in the small sample size case is solved by dimensionality reduction through principal component analysis (PCA), and the problem in the nonlinear case is transformed into an equivalent linear MCLPV_SVM problem via kernel PCA (KPCA). Experimental results on real datasets indicate the effectiveness of MCLPV_SVM in comparison with SVM and MCVSVM.
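The following is a minimal, illustrative sketch (not the authors' code) of the linear case described above. It assumes that MCLPV_SVM amounts to replacing the usual squared-norm regularizer of a linear SVM with a locality-preserving scatter matrix built from an intra-class affinity graph in the spirit of LPP, and that the small sample size case can be handled by a PCA-style restriction to the scatter matrix's non-null eigen-directions; the names `locality_scatter` and `mclpv_svm_fit` are hypothetical.

```python
# Hedged sketch of a linear MCLPV_SVM-style classifier: a locality-preserving
# scatter matrix S (built from an intra-class k-NN heat-kernel graph, as in LPP)
# replaces the identity in the SVM regularizer w^T S w; whitening by S^{-1/2}
# reduces the problem to an ordinary linear SVM.  Names are illustrative only.
import numpy as np
from sklearn.svm import SVC
from sklearn.neighbors import kneighbors_graph

def locality_scatter(X, y, k=5, t=1.0):
    """Locality-preserving scatter S = X^T L X, with L the Laplacian of a
    heat-kernel affinity graph restricted to same-class neighbours."""
    n, d = X.shape
    W = np.zeros((n, n))
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        nn = min(k, len(idx) - 1)
        if nn < 1:
            continue
        A = kneighbors_graph(X[idx], nn, mode="distance").toarray()
        A = np.exp(-A**2 / t) * (A > 0)      # heat-kernel weights on k-NN edges
        A = np.maximum(A, A.T)               # symmetrize the affinity matrix
        W[np.ix_(idx, idx)] = A
    L = np.diag(W.sum(axis=1)) - W           # graph Laplacian
    return X.T @ L @ X                       # d x d scatter matrix

def mclpv_svm_fit(X, y, k=5, t=1.0, C=1.0, eps=1e-8):
    """Whiten with the pseudo-inverse square root of the scatter matrix, then
    train a standard linear SVM in the transformed space.  When the scatter
    matrix is singular (small sample size), only eigen-directions with
    non-negligible eigenvalues are kept (PCA-style)."""
    S = locality_scatter(X, y, k=k, t=t)
    evals, evecs = np.linalg.eigh(S)
    keep = evals > eps * evals.max()             # drop the (near-)null space
    T = evecs[:, keep] / np.sqrt(evals[keep])    # x -> S^{-1/2} x on kept subspace
    clf = SVC(kernel="linear", C=C).fit(X @ T, y)
    return clf, T

# usage: clf, T = mclpv_svm_fit(X_train, y_train); y_pred = clf.predict(X_test @ T)
```

In this sketch, minimizing the plain SVM norm in the whitened coordinates is equivalent to minimizing w^T S w in the original coordinates subject to the usual margin constraints, which is the reduction to a standard SVM that the abstract attributes to the PCA/KPCA treatment of the singular and nonlinear cases.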
Original language: English
Pages (from-to): 2753-2762
Number of pages: 10
Journal: Pattern Recognition
Volume: 43
Issue number: 8
DOIs
Publication status: Published - 1 Aug 2010

Keywords

  • Locality preserving projections
  • Minimum class variance support vector machine
  • Supervised learning
  • Support vector machine

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
