Abstract
Learning a distance metric from training samples plays a crucial role in many machine learning tasks, and various models and optimization algorithms have been proposed over the past decade. In this paper, we generalize several state-of-the-art metric learning methods, such as large margin nearest neighbor (LMNN) and information theoretic metric learning (ITML), into a kernel classification framework. First, doublets and triplets are constructed from the training samples, and a family of degree-2 polynomial kernel functions is proposed for pairs of doublets or triplets. A kernel classification framework is then established that subsumes many popular metric learning methods, including LMNN and ITML, and that also suggests new methods which, interestingly, can be implemented efficiently with standard support vector machine (SVM) solvers. Two novel metric learning methods, namely doublet-SVM and triplet-SVM, are developed under the proposed framework. Experimental results show that doublet-SVM and triplet-SVM achieve classification accuracies competitive with state-of-the-art metric learning methods while requiring significantly less training time.
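To make the pipeline described above concrete, the following is a minimal sketch of the doublet route: difference vectors are formed from sampled pairs of training samples, the degree-2 polynomial kernel k((x_i, x_j), (x_k, x_l)) = ((x_i - x_j)^T (x_k - x_l))^2 is evaluated between doublets, and a standard SVM solver (scikit-learn's SVC with a callable kernel) is trained on them. The helper names (`build_doublets`, `doublet_kernel`), the pair-sampling scheme, the ±1 labelling of similar versus dissimilar pairs, and the way a Mahalanobis-like matrix is read off the dual coefficients are assumptions of this sketch rather than details taken from the paper; in particular, no positive semidefinite projection is applied here.

```python
import numpy as np
from sklearn.svm import SVC


def build_doublets(X, y, num_per_sample=3, seed=0):
    """Sample doublets (x_i, x_j) and summarize each by its difference vector.
    Label convention (an assumption of this sketch): +1 if the pair shares a
    class label, -1 otherwise."""
    rng = np.random.default_rng(seed)
    diffs, labels = [], []
    for i in range(len(X)):
        for j in rng.choice(len(X), size=num_per_sample, replace=False):
            if j == i:
                continue
            diffs.append(X[i] - X[j])
            labels.append(1.0 if y[i] == y[j] else -1.0)
    return np.asarray(diffs), np.asarray(labels)


def doublet_kernel(D1, D2):
    """Degree-2 polynomial kernel between doublets:
    k((x_i, x_j), (x_k, x_l)) = ((x_i - x_j)^T (x_k - x_l))^2."""
    return (D1 @ D2.T) ** 2


# Toy usage on synthetic data.
X = np.random.randn(60, 5)
y = np.repeat([0, 1, 2], 20)
D, h = build_doublets(X, y)

# A standard SVM solver handles the resulting kernel classification problem.
svm = SVC(kernel=doublet_kernel, C=1.0)
svm.fit(D, h)

# Read off a Mahalanobis-like matrix M = sum_n alpha_n * h_n * d_n d_n^T from
# the dual solution (sign conventions and the lack of a PSD projection are
# assumptions of this sketch).
sv = D[svm.support_]                   # difference vectors of the support doublets
alpha_signed = svm.dual_coef_.ravel()  # alpha_n * h_n for each support doublet
M = (sv * alpha_signed[:, None]).T @ sv
```

Because the Gram matrix of `doublet_kernel` is the elementwise square of an inner-product Gram matrix, it is positive semidefinite (Schur product theorem), so it is a valid kernel for the SVM solver.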
| Original language | English |
|---|---|
| Article number | 6932476 |
| Pages (from-to) | 1950-1962 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Neural Networks and Learning Systems |
| Volume | 26 |
| Issue number | 9 |
| DOIs | |
| Publication status | Published - 1 Sept 2015 |
Keywords
- Kernel method
- metric learning
- nearest neighbor (NN)
- polynomial kernel
- support vector machine (SVM)
ASJC Scopus subject areas
- Software
- Computer Science Applications
- Computer Networks and Communications
- Artificial Intelligence