Best nonnegative rank-one approximations of tensors

Shenglong Hu, Defeng Sun, Kim Chuan Toh

Research output: Journal article, peer-reviewed

2 Citations (Scopus)


In this paper, we study the polynomial optimization problem of a multiform over the intersection of the multisphere and the nonnegative orthants. This class of problems is NP-hard in general and includes the problem of finding the best nonnegative rank-one approximation of a given tensor. A Positivstellensatz is given for this class of polynomial optimization problems, based on which a globally convergent hierarchy of doubly nonnegative (DNN) relaxations is proposed. A (zeroth-order) DNN relaxation method is applied to solve these problems, resulting in linear matrix optimization problems under both positive semidefinite and nonnegative conic constraints. A worst-case approximation bound is given for this relaxation method. The recent solver SDPNAL+ is adopted to solve this class of matrix optimization problems. The DNN relaxations are typically tight, and hence the best nonnegative rank-one approximation of a tensor can often be obtained in practice. Extensive numerical experiments show that this approach is quite promising.
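To make the underlying problem concrete: for a third-order tensor T, the best nonnegative rank-one approximation seeks nonnegative unit vectors x, y, z and a scalar lam >= 0 minimizing the Frobenius distance between T and lam * (x ⊗ y ⊗ z), which is equivalent to maximizing the multiform T(x, y, z) over the multisphere intersected with the nonnegative orthants. The sketch below is a simple projected higher-order power iteration, a local heuristic for this problem; it is NOT the DNN relaxation method of the paper and carries no global-optimality guarantee. All names and the update rule here are illustrative assumptions.

```python
import numpy as np

def nonneg_rank1_approx(T, iters=200, seed=0):
    """Projected higher-order power iteration (illustrative heuristic only).

    Seeks nonnegative unit vectors x, y, z and lam >= 0 so that
    lam * (x outer y outer z) approximates the 3rd-order tensor T.
    A local method; the paper's DNN relaxation hierarchy is what
    provides global bounds and certificates.
    """
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    # Random nonnegative starting vectors, normalized onto the unit sphere.
    x = np.abs(rng.standard_normal(n1)); x /= np.linalg.norm(x)
    y = np.abs(rng.standard_normal(n2)); y /= np.linalg.norm(y)
    z = np.abs(rng.standard_normal(n3)); z /= np.linalg.norm(z)
    for _ in range(iters):
        # Contract T against two of the vectors, project the result onto
        # the nonnegative orthant, then renormalize to unit length.
        x = np.maximum(np.einsum('ijk,j,k->i', T, y, z), 0.0)
        x /= max(np.linalg.norm(x), 1e-15)
        y = np.maximum(np.einsum('ijk,i,k->j', T, x, z), 0.0)
        y /= max(np.linalg.norm(y), 1e-15)
        z = np.maximum(np.einsum('ijk,i,j->k', T, x, y), 0.0)
        z /= max(np.linalg.norm(z), 1e-15)
    # Given unit x, y, z, the optimal scale is the multiform value T(x, y, z).
    lam = np.einsum('ijk,i,j,k->', T, x, y, z)
    return lam, x, y, z
```

On a tensor that is exactly nonnegative rank-one, e.g. `T = 2 * np.einsum('i,j,k->ijk', a, b, c)` with nonnegative unit vectors a, b, c, the iteration recovers lam = 2 and the factors a, b, c; on general tensors it only returns a locally improved candidate, which is where the paper's convergent DNN hierarchy becomes relevant.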

Original language: English
Pages (from-to): 1527-1554
Number of pages: 28
Journal: SIAM Journal on Matrix Analysis and Applications
Issue number: 4
Publication status: Published - 3 Dec 2019


Keywords

  • Doubly nonnegative relaxation method
  • Doubly nonnegative semidefinite program
  • Multiforms
  • Nonnegative rank-1 approximation
  • Polynomial
  • Tensor

ASJC Scopus subject areas

  • Analysis


