Abstract
Distance metric learning aims to learn, from the given training data, a valid distance metric with which the similarity between data samples can be more effectively evaluated for classification. Metric learning is often formulated as a convex or nonconvex optimization problem; however, most existing methods rely on customized optimizers and become inefficient on large-scale problems. In this paper, we formulate metric learning as a kernel classification problem with a positive semi-definite constraint, and solve it by iterated training of support vector machines (SVMs). The new formulation is easy to implement and efficient to train with off-the-shelf SVM solvers. Two novel metric learning models, namely positive-semidefinite constrained metric learning (PCML) and nonnegative-coefficient constrained metric learning (NCML), are developed. Both PCML and NCML guarantee the global optimality of their solutions. Experiments on general classification, face verification, and person re-identification show that, compared with state-of-the-art approaches, our methods achieve comparable classification accuracy while being more efficient in training.
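As an illustration of the pipeline the abstract describes (and not the paper's exact PCML/NCML algorithms), the Python sketch below shows how metric learning can be cast as a classification problem over sample pairs: an off-the-shelf linear SVM is trained on flattened pairwise outer-product features, its weight vector is read off as a candidate Mahalanobis matrix, and the positive semi-definite (PSD) constraint is enforced by projection. All function names and the pair-labeling convention are assumptions made for this sketch.

```python
# Illustrative sketch only -- not the paper's exact PCML/NCML formulations.
# Idea: since (x_i - x_j)^T M (x_i - x_j) = <vec(M), vec(diff diff^T)>,
# learning M reduces to a linear classification problem over pair features,
# solvable with an off-the-shelf SVM, followed by a PSD projection.
import numpy as np
from sklearn.svm import LinearSVC

def pair_features(X, pairs):
    """Flattened outer products (x_i - x_j)(x_i - x_j)^T, one row per pair."""
    d = X.shape[1]
    F = np.empty((len(pairs), d * d))
    for k, (i, j) in enumerate(pairs):
        diff = (X[i] - X[j])[:, None]
        F[k] = (diff @ diff.T).ravel()
    return F

def project_psd(M):
    """Project a symmetric matrix onto the PSD cone by clipping eigenvalues."""
    M = (M + M.T) / 2.0
    eigvals, eigvecs = np.linalg.eigh(M)
    return (eigvecs * np.clip(eigvals, 0.0, None)) @ eigvecs.T

def learn_metric(X, pairs, pair_labels, C=1.0):
    """pair_labels: +1 for dissimilar pairs, -1 for similar pairs (assumed
    convention), so larger Mahalanobis distances separate dissimilar pairs."""
    d = X.shape[1]
    F = pair_features(X, pairs)
    svm = LinearSVC(C=C).fit(F, pair_labels)  # off-the-shelf SVM solver
    M = svm.coef_.reshape(d, d)               # SVM weights as candidate metric
    return project_psd(M)                     # enforce the PSD constraint
```

In the paper itself, this fit-then-project step is embedded in an alternating-minimization loop (iterated SVM training), which is what yields the global-optimality guarantee; the single pass above only illustrates how the pairwise features, the SVM solver, and the PSD constraint fit together.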
| Original language | English |
| --- | --- |
| Article number | 7973168 |
| Pages (from-to) | 4937-4950 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Image Processing |
| Volume | 26 |
| Issue number | 10 |
| DOIs | |
| Publication status | Published - 1 Oct 2017 |
Keywords
- alternating minimization
- kernel method
- Lagrange duality
- metric learning
- support vector machine
ASJC Scopus subject areas
- Software
- Computer Graphics and Computer-Aided Design