Abstract
Symmetric positive definite (SPD) matrices have been widely used in image and vision problems. Recently, there has been growing interest in the sparse representation (SR) of SPD matrices, motivated by the great success of SR for vector data. Although the space of SPD matrices is well known to form a Lie group that is also a Riemannian manifold, existing work fails to take full advantage of its geometric structure. This paper tackles this problem by proposing a kernel-based method for SR and dictionary learning (DL) of SPD matrices. We show that the space of SPD matrices, equipped with the operations of logarithmic multiplication and scalar logarithmic multiplication defined in the Log-Euclidean framework, is a complete inner product space. We can thus develop a broad family of kernels that satisfy Mercer's condition. These kernels characterize the geodesic distance and can be computed efficiently. We also account for the geometric structure in the DL process by updating atom matrices in the Riemannian space rather than in the Euclidean space. The proposed method is evaluated on various vision problems and shows notable performance gains over state-of-the-art methods.
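The abstract's central idea is that mapping SPD matrices through the matrix logarithm places them in an inner product space, so a kernel based on the Log-Euclidean (geodesic) distance satisfies Mercer's condition. The sketch below is not the authors' code; it is a minimal illustration of one such kernel, a Gaussian kernel on the Frobenius distance between matrix logarithms, with the bandwidth `sigma` chosen purely for demonstration.

```python
# Minimal sketch of a Log-Euclidean Gaussian kernel between SPD matrices.
# The matrix logarithm maps SPD matrices into the space of symmetric
# matrices, where the Frobenius norm gives the Log-Euclidean distance.
import numpy as np


def spd_logm(X):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(X)          # eigenvalues are positive for SPD X
    return (V * np.log(w)) @ V.T      # V diag(log w) V^T


def log_euclidean_gaussian_kernel(X, Y, sigma=1.0):
    """k(X, Y) = exp(-||log(X) - log(Y)||_F^2 / (2 * sigma^2))."""
    d = np.linalg.norm(spd_logm(X) - spd_logm(Y), ord="fro")
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))


# Example: build two random SPD matrices (A A^T + eps I is SPD).
rng = np.random.default_rng(0)
A, B = rng.standard_normal((2, 5, 5))
X = A @ A.T + 1e-3 * np.eye(5)
Y = B @ B.T + 1e-3 * np.eye(5)
print(log_euclidean_gaussian_kernel(X, Y, sigma=2.0))
```

Because the distance is computed on matrix logarithms, the resulting Gram matrix is positive semidefinite, which is what lets such kernels plug directly into kernelized SR and DL formulations.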
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings - 2013 IEEE International Conference on Computer Vision, ICCV 2013 |
| Publisher | IEEE |
| Pages | 1601-1608 |
| Number of pages | 8 |
| ISBN (Print) | 9781479928392 |
| DOIs | |
| Publication status | Published - 1 Jan 2013 |
| Event | 2013 14th IEEE International Conference on Computer Vision, ICCV 2013 - Sydney, NSW, Australia; 1 Dec 2013 → 8 Dec 2013 |
Conference
| Conference | 2013 14th IEEE International Conference on Computer Vision, ICCV 2013 |
| --- | --- |
| Country/Territory | Australia |
| City | Sydney, NSW |
| Period | 1/12/13 → 8/12/13 |
Keywords
- Dictionary Learning
- Log-Euclidean Kernels
- Space of Symmetric Positive Definite (SPD) Matrices
- Sparse Representation
ASJC Scopus subject areas
- Software
- Computer Vision and Pattern Recognition