Weighted Schatten p-Norm Minimization for Image Denoising and Background Subtraction

Yuan Xie, Shuhang Gu, Yan Liu, Wangmeng Zuo, Wensheng Zhang, Lei Zhang

Research output: Journal article (Academic research, peer-reviewed)

164 Citations (Scopus)

Abstract

Low-rank matrix approximation (LRMA), which aims to recover an underlying low-rank matrix from its degraded observation, has a wide range of applications in computer vision. The latest LRMA methods resort to nuclear norm minimization (NNM) as a convex relaxation of the nonconvex rank minimization. However, NNM tends to over-shrink the rank components and treats the different rank components equally, limiting its flexibility in practical applications. We propose a more flexible model, namely, weighted Schatten p-norm minimization (WSNM), which generalizes NNM to Schatten p-norm minimization with weights assigned to different singular values. The proposed WSNM not only gives a better approximation to the original low-rank assumption, but also accounts for the importance of different rank components. We analyze the solution of WSNM and prove that, under a certain weight permutation, WSNM can be equivalently transformed into independent nonconvex lp-norm subproblems, whose global optima can be efficiently found by a generalized iterated shrinkage algorithm. We apply WSNM to typical low-level vision problems, e.g., image denoising and background subtraction. Extensive experimental results show, both qualitatively and quantitatively, that the proposed WSNM removes noise and models complex, dynamic scenes more effectively than state-of-the-art methods.
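The decomposition described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: after an SVD, each singular value is thresholded independently by solving a scalar lp-norm subproblem with the generalized soft-thresholding (fixed-point) iteration; the function names `gst` and `wsnm`, the weight values, and the iteration count are assumptions for illustration.

```python
import numpy as np

def gst(y, lam, p, iters=10):
    """Generalized soft-thresholding (sketch) for the scalar subproblem
    min_x 0.5*(x - y)^2 + lam*|x|^p with 0 < p < 1."""
    # Threshold below which the global optimum is exactly zero.
    tau = (2.0 * lam * (1.0 - p)) ** (1.0 / (2.0 - p)) \
          + lam * p * (2.0 * lam * (1.0 - p)) ** ((p - 1.0) / (2.0 - p))
    if abs(y) <= tau:
        return 0.0
    # Fixed-point iteration; for |y| > tau it converges to the
    # nonzero global optimum of the scalar problem.
    x = abs(y)
    for _ in range(iters):
        x = abs(y) - lam * p * x ** (p - 1.0)
    return np.sign(y) * x

def wsnm(Y, weights, p, iters=10):
    """Proximal step for the weighted Schatten p-norm (sketch):
    min_X 0.5*||X - Y||_F^2 + sum_i w_i * sigma_i(X)^p,
    solved per singular value (valid when weights are non-descending,
    matching the weight permutation discussed in the abstract)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_new = np.array([gst(si, wi, p, iters) for si, wi in zip(s, weights)])
    return U @ np.diag(s_new) @ Vt
```

With non-descending weights, large singular values (important rank components) are shrunk only slightly while small, noise-dominated ones are suppressed to zero, which is the flexibility over NNM that the abstract emphasizes.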
Original language: English
Article number: 7539605
Pages (from-to): 4842-4857
Number of pages: 16
Journal: IEEE Transactions on Image Processing
Volume: 25
Issue number: 10
DOIs
Publication status: Published - 1 Oct 2016

Keywords

  • Low rank
  • low-level vision
  • weighted Schatten p-norm

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
