Joint learning of multiple regressors for single image super-resolution

Kai Zhang, Baoquan Wang, Wangmeng Zuo, Hongzhi Zhang, Lei Zhang

Research output: Journal article › Academic research › peer-review

45 Citations (Scopus)

Abstract

Using a global regression model for single image super-resolution (SISR) generally fails to produce visually pleasing results. Recently developed local learning methods provide a remedy by partitioning the feature space into a number of clusters and learning a simple local model for each cluster. However, in these methods the space partition is conducted separately from local model learning, so a large number of local models is needed to achieve satisfactory performance. To address this problem, we propose a mixture of experts (MoE) method to jointly learn the feature space partition and the local regression models. Our MoE consists of two components: gating network learning and local regressor learning. An expectation-maximization (EM) algorithm is adopted to train the MoE on a large set of LR/HR patch pairs. Experimental results demonstrate that the proposed method uses far fewer local models and less time to achieve results comparable or superior to state-of-the-art SISR methods, providing a highly practical solution for real applications.
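The abstract describes the approach only at a high level. As a rough illustration of the underlying idea, jointly fitting a soft partition of the LR feature space and one linear regressor per region with EM, the sketch below trains a toy mixture of linear regressors on LR/HR patch-feature pairs in NumPy. All names and parameters (em_moe_sisr, n_experts, reg, the Gaussian-style gating) are illustrative assumptions and not the authors' implementation.

```python
import numpy as np

def em_moe_sisr(X, Y, n_experts=16, n_iters=10, reg=1e-3, seed=0):
    """Toy EM training of a mixture of linear regressors.

    X : (N, d_lr)  LR patch features
    Y : (N, d_hr)  corresponding HR patch targets
    Returns gate centres, mixing weights, and per-expert regression matrices.
    """
    rng = np.random.default_rng(seed)
    N, d_lr = X.shape
    Xb = np.hstack([X, np.ones((N, 1))])                  # append bias term
    # initialisation: random cluster centres, uniform mixing weights
    centers = X[rng.choice(N, n_experts, replace=False)]
    pi = np.full(n_experts, 1.0 / n_experts)
    W = [np.zeros((d_lr + 1, Y.shape[1])) for _ in range(n_experts)]

    for _ in range(n_iters):
        # E-step: soft assignment (responsibility) of each patch to every expert
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)   # (N, K)
        logits = np.log(pi + 1e-12) - 0.5 * d2
        logits -= logits.max(axis=1, keepdims=True)
        R = np.exp(logits)
        R /= R.sum(axis=1, keepdims=True)

        # M-step: update gate and experts using the responsibility weights
        Nk = R.sum(axis=0) + 1e-12
        pi = Nk / N
        centers = (R.T @ X) / Nk[:, None]
        for k in range(n_experts):
            Xw = Xb * R[:, k:k + 1]                        # responsibility-weighted design
            A = Xw.T @ Xb + reg * np.eye(d_lr + 1)         # ridge-regularised normal equations
            W[k] = np.linalg.solve(A, Xw.T @ Y)
    return centers, pi, W
```

At test time, an LR patch feature would be routed through the same gating rule and the HR patch predicted by the most responsible expert (or a responsibility-weighted combination); the actual gating network and feature design in the paper differ from this simplified sketch.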

Original language: English
Article number: 7339441
Pages (from-to): 102-106
Number of pages: 5
Journal: IEEE Signal Processing Letters
Volume: 23
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2016

Keywords

  • Image super-resolution
  • Joint learning
  • Linear regression
  • Local learning
  • Mixture of experts

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
  • Applied Mathematics
