Regularized Label Relaxation Linear Regression

Xiaozhao Fang, Yong Xu, Xuelong Li, Zhihui Lai, Wai Keung Wong, Bingwu Fang

Research output: Journal article (peer-reviewed academic research)

27 Citations (Scopus)

Abstract

Linear regression (LR) and some of its variants have been widely used for classification problems. Most of these methods assume that during the learning phase, the training samples can be exactly transformed into a strict binary label matrix, which has too little freedom to fit the labels adequately. To address this problem, in this paper, we propose a novel regularized label relaxation LR method, which has the following notable characteristics. First, the proposed method relaxes the strict binary label matrix into a slack variable matrix by introducing a nonnegative label relaxation matrix into LR, which provides more freedom to fit the labels and simultaneously enlarges the margins between different classes as much as possible. Second, the proposed method constructs a class compactness graph based on manifold learning and uses it as the regularization term to avoid overfitting. The class compactness graph ensures that samples sharing the same labels remain close after they are transformed. Two different algorithms, based respectively on the ℓ2-norm and ℓ2,1-norm loss functions, are devised. Both algorithms have compact closed-form solutions in each iteration, so they are easily implemented. Extensive experiments show that these two algorithms outperform state-of-the-art algorithms in terms of classification accuracy and running time.
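The ℓ2-norm variant described above can be sketched as an alternating scheme: with the relaxation matrix fixed, the projection has a closed-form ridge-style solution; with the projection fixed, the nonnegative relaxation matrix is updated by projection onto the nonnegative orthant. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the function name `label_relaxation_lr`, the simple same-label adjacency used as a stand-in for the paper's class compactness graph, and all parameter values are hypothetical.

```python
import numpy as np

def label_relaxation_lr(X, y, n_classes, lam=0.1, n_iter=20):
    """Hedged sketch of l2-norm label relaxation LR.

    X: (n, d) sample matrix; y: (n,) integer labels in [0, n_classes).
    Alternates a closed-form solve for the projection W with a
    nonnegativity-projected update of the relaxation matrix M.
    """
    n, d = X.shape
    # Strict binary label matrix T and sign matrix B: +1 for the true
    # class, -1 otherwise, as in label-relaxation formulations.
    T = np.zeros((n, n_classes))
    T[np.arange(n), y] = 1.0
    B = 2.0 * T - 1.0

    # Class compactness graph (simplified stand-in): connect samples
    # that share a label, then take the graph Laplacian L = D - S.
    S = (y[:, None] == y[None, :]).astype(float)
    L = np.diag(S.sum(axis=1)) - S

    M = np.zeros_like(T)  # nonnegative label relaxation matrix
    # Cached system matrix for the closed-form W update; the small
    # diagonal term keeps the solve well conditioned.
    A = X.T @ X + lam * (X.T @ L @ X) + 1e-6 * np.eye(d)
    for _ in range(n_iter):
        R = T + B * M                        # relaxed (slack) targets
        W = np.linalg.solve(A, X.T @ R)      # closed-form W update
        M = np.maximum(B * (X @ W - T), 0.0) # projected M update
    return W
```

At test time a sample `x` is classified by the largest entry of `x @ W`; the relaxation only widens the training targets, so prediction stays a plain linear projection.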
Original language: English
Pages (from-to): 1006-1018
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 29
Issue number: 4
DOIs
Publication status: Published - 1 Apr 2018

Keywords

  • class compactness graph
  • computer vision
  • label relaxation
  • linear regression (LR)
  • manifold learning

ASJC Scopus subject areas

  • Software
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence
