A Kernel Classification Framework for Metric Learning

Faqiang Wang, Wangmeng Zuo, Lei Zhang, Deyu Meng, Dapeng Zhang

Research output: Journal article › Academic research › peer-review

65 Citations (Scopus)

Abstract

Learning a distance metric from the given training samples plays a crucial role in many machine learning tasks, and various models and optimization algorithms have been proposed in the past decade. In this paper, we generalize several state-of-the-art metric learning methods, such as large margin nearest neighbor (LMNN) and information theoretic metric learning (ITML), into a kernel classification framework. First, doublets and triplets are constructed from the training samples, and a family of degree-2 polynomial kernel functions is proposed for pairs of doublets or triplets. A kernel classification framework is then established that generalizes many popular metric learning methods, including LMNN and ITML. The proposed framework also suggests new metric learning methods, which, interestingly, can be implemented efficiently using standard support vector machine (SVM) solvers. Two novel metric learning methods, namely, doublet-SVM and triplet-SVM, are then developed under the proposed framework. Experimental results show that doublet-SVM and triplet-SVM achieve classification accuracies competitive with state-of-the-art metric learning methods while requiring significantly less training time.
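To make the doublet construction concrete, the following is a minimal sketch, not the paper's exact formulation: it builds doublets from labeled samples (labeling a pair +1 when the two samples share a class and -1 otherwise) and evaluates an illustrative degree-2 polynomial kernel on doublet pairs, here taken as the squared inner product of the pairs' difference vectors. The function names and the specific kernel form are assumptions for illustration.

```python
def diff(a, b):
    # Element-wise difference of two feature vectors.
    return [ai - bi for ai, bi in zip(a, b)]

def doublet_kernel(z1, z2):
    # Illustrative degree-2 polynomial kernel on two doublets:
    # the squared inner product of their difference vectors.
    # (The paper proposes a family of such kernels; this is one
    # assumed instance, not its exact definition.)
    u = diff(*z1)
    v = diff(*z2)
    return sum(ui * vi for ui, vi in zip(u, v)) ** 2

def build_doublets(X, y):
    # A doublet pairs two training samples; it is labeled +1 when
    # the samples share a class (a similar pair) and -1 otherwise.
    doublets, labels = [], []
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            doublets.append((X[i], X[j]))
            labels.append(1 if y[i] == y[j] else -1)
    return doublets, labels

X = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
y = [0, 0, 1]
doublets, labels = build_doublets(X, y)
print(labels)                                    # [1, -1, -1]
print(doublet_kernel(doublets[0], doublets[0]))  # 1.0
```

With the doublet labels and a precomputed kernel matrix of this form, training reduces to a standard kernel SVM problem, which is what lets doublet-SVM reuse off-the-shelf SVM solvers.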
Original language: English
Article number: 6932476
Pages (from-to): 1950-1962
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 26
Issue number: 9
DOIs
Publication status: Published - 1 Sep 2015

Keywords

  • Kernel method
  • metric learning
  • nearest neighbor (NN)
  • polynomial kernel
  • support vector machine (SVM)

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
