TY - GEN
T1 - Support vector guided dictionary learning
AU - Cai, Sijia
AU - Zuo, Wangmeng
AU - Zhang, Lei
AU - Feng, Xiangchu
AU - Wang, Ping
PY - 2014/1/1
Y1 - 2014/1/1
N2 - Discriminative dictionary learning aims to learn a dictionary from training samples to enhance the discriminative capability of their coding vectors. Several discrimination terms have been proposed by assessing the prediction loss (e.g., logistic regression) or class separation criterion (e.g., Fisher discrimination criterion) on the coding vectors. In this paper, we provide new insight into discriminative dictionary learning. Specifically, we formulate the discrimination term as the weighted summation of the squared distances between all pairs of coding vectors. The discrimination term in the state-of-the-art Fisher discrimination dictionary learning (FDDL) method can be explained as a special case of our model, where the weights are simply determined by the numbers of samples of each class. We then propose a parameterization method to adaptively determine the weight of each coding vector pair, which leads to a support vector guided dictionary learning (SVGDL) model. Compared with FDDL, SVGDL can adaptively assign different weights to different pairs of coding vectors. More importantly, SVGDL automatically selects only a few critical pairs to assign non-zero weights, resulting in better generalization ability for pattern recognition tasks. The experimental results on a series of benchmark databases show that SVGDL outperforms many state-of-the-art discriminative dictionary learning methods.
KW - Dictionary learning
KW - Fisher discrimination
KW - Sparse representation
KW - Support vector machine
UR - http://www.scopus.com/inward/record.url?scp=84906492141&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-10593-2_41
DO - 10.1007/978-3-319-10593-2_41
M3 - Conference article published in proceedings or book
SN - 9783319105925
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 624
EP - 639
BT - Computer Vision, ECCV 2014 - 13th European Conference, Proceedings
PB - Springer Verlag
T2 - 13th European Conference on Computer Vision, ECCV 2014
Y2 - 6 September 2014 through 12 September 2014
ER -