Abstract
Over the past few decades, a large number of algorithms have been developed for dimensionality reduction. Despite their different motivations, these algorithms can be interpreted within a common framework known as graph embedding. To explore the significant features of data, several sparse regression algorithms based on graph embedding have been proposed. However, these algorithms consist of two separate steps: 1) embedding learning and 2) sparse regression, so their performance is largely determined by the effectiveness of the constructed graph. In this paper, we present a framework that combines the objective functions of graph embedding and sparse regression, so that embedding learning and sparse regression are jointly implemented and optimized, instead of simply feeding the graph spectral embedding into a subsequent sparse regression. Within the proposed framework, supervised, semi-supervised, and unsupervised learning algorithms can be unified. Furthermore, we analyze two situations of the optimization problem for the proposed framework. By adopting an L2,1-norm regularizer, the framework can perform feature selection and subspace learning simultaneously. Experiments on seven standard databases demonstrate that the joint graph embedding and sparse regression method significantly improves recognition performance and consistently outperforms the two-step sparse regression method.
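As a rough illustration of the kind of joint objective the abstract describes, a minimal sketch is given below. The specific form, the variable names (X, Y, W, L), and the trade-off parameters (μ, λ) are editorial assumptions for illustration, not the paper's exact formulation.

$$
\min_{\mathbf{Y},\,\mathbf{W}} \;
\operatorname{tr}\!\left(\mathbf{Y}^{\top}\mathbf{L}\mathbf{Y}\right)
+ \mu\left(\left\|\mathbf{X}^{\top}\mathbf{W}-\mathbf{Y}\right\|_{F}^{2}
+ \lambda\left\|\mathbf{W}\right\|_{2,1}\right)
\quad \text{s.t.}\;\; \mathbf{Y}^{\top}\mathbf{Y}=\mathbf{I},
$$

where X is the data matrix, L is the Laplacian of the constructed graph, Y is the learned embedding, and W is the regression (projection) matrix. The first term is the graph embedding objective, the second ties the regression output to the embedding so that the two are optimized jointly rather than in separate steps, and the L2,1-norm, $\|\mathbf{W}\|_{2,1}=\sum_{i}\|\mathbf{w}^{i}\|_{2}$ over the rows of W, induces row sparsity, which is what enables simultaneous feature selection and subspace learning.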
| Original language | English |
| --- | --- |
| Article number | 7045492 |
| Pages (from-to) | 1341-1355 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Image Processing |
| Volume | 24 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Apr 2015 |
Keywords
- feature selection
- Graph embedding
- L2,1-norm
- sparse regression
- subspace learning
ASJC Scopus subject areas
- Software
- Computer Graphics and Computer-Aided Design