A Fast Non-Negative Latent Factor Model Based on Generalized Momentum Method

Xin Luo, Zhigang Liu, Shuai Li, Mingsheng Shang, Zidong Wang

Research output: Journal article publication › Journal article › Academic research › peer-review

19 Citations (Scopus)

Abstract

Non-negative latent factor (NLF) models can efficiently acquire useful knowledge from high-dimensional and sparse (HiDS) matrices filled with non-negative data. Single latent factor-dependent, non-negative and multiplicative update (SLF-NMU) is an efficient algorithm for building an NLF model on an HiDS matrix, yet it suffers from slow convergence. A momentum method is frequently adopted to accelerate a learning algorithm, but it is incompatible with algorithms that, like SLF-NMU, adopt gradients only implicitly. To build a fast NLF (FNLF) model, we propose a generalized momentum method compatible with SLF-NMU. With it, we further propose a single latent factor-dependent, non-negative, multiplicative and momentum-incorporated update algorithm, thereby achieving an FNLF model. Empirical studies on six HiDS matrices from industrial applications indicate that an FNLF model outperforms an NLF model in terms of both convergence rate and prediction accuracy for missing data. Hence, compared with an NLF model, an FNLF model is more practical in industrial applications.

Original language: English
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
DOIs
Publication status: Accepted/In press - 1 Jan 2018

Keywords

  • Acceleration
  • Analytical models
  • Big data
  • Computational modeling
  • Convergence
  • Data models
  • high-dimensional and sparse (HiDS) matrix
  • latent factor (LF) analysis
  • missing data estimation
  • non-negative LF (NLF) model
  • recommender system
  • Recommender systems
  • Sparse matrices

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering
