Abstract
A single global regression model for single image super-resolution (SISR) generally fails to produce visually pleasing output. Recently developed local learning methods offer a remedy by partitioning the feature space into a number of clusters and learning a simple local model for each cluster. In these methods, however, the space partition is performed separately from local model learning, so a large number of local models is needed to achieve satisfactory performance. To address this problem, we propose a mixture of experts (MoE) method that jointly learns the feature space partition and the local regression models. Our MoE consists of two components: gating network learning and local regressor learning. An expectation-maximization (EM) algorithm is adopted to train the MoE on a large set of low-resolution/high-resolution (LR/HR) patch pairs. Experimental results demonstrate that the proposed method uses far fewer local models and less time to achieve results comparable or superior to those of state-of-the-art SISR methods, providing a highly practical solution for real applications.
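The abstract describes the approach without formulas, so the following is a minimal sketch of how a mixture of linear experts for LR→HR patch regression could be trained with EM. Everything here is an assumption for illustration, not the paper's implementation: the experts are ridge-regularized linear regressors, and the gate is modeled as an isotropic Gaussian mixture over LR features (which gives closed-form M-step updates), whereas the paper learns a gating network jointly; the names `fit_moe` and `predict` and all parameters are hypothetical.

```python
# Illustrative mixture-of-experts regressor for LR -> HR patch mapping,
# trained with EM. Linear experts + Gaussian-mixture gate are assumptions
# made here for a closed-form sketch; the paper's gating network differs.
import numpy as np

def fit_moe(X, Y, K=8, n_iter=20, reg=1e-3, seed=0):
    """X: (N, d) LR patch features; Y: (N, p) HR patch targets."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    p = Y.shape[1]
    # Initialize responsibilities randomly (rows sum to 1).
    R = rng.random((N, K))
    R /= R.sum(axis=1, keepdims=True)
    Xb = np.hstack([X, np.ones((N, 1))])          # append bias column
    W = np.zeros((K, d + 1, p))                   # expert weights
    mu = np.zeros((K, d)); var = np.ones(K); pi = np.full(K, 1.0 / K)
    sigma2 = np.ones(K)                           # expert noise variances
    for _ in range(n_iter):
        # ---- M-step: weighted ridge regression per expert ----
        for k in range(K):
            r = R[:, k]
            A = Xb.T @ (Xb * r[:, None]) + reg * np.eye(d + 1)
            W[k] = np.linalg.solve(A, Xb.T @ (Y * r[:, None]))
            resid = Y - Xb @ W[k]
            sigma2[k] = (r @ (resid ** 2).sum(axis=1)) / (p * r.sum()) + 1e-8
            # Gate parameters: isotropic Gaussian per expert over LR features.
            mu[k] = (r @ X) / r.sum()
            var[k] = (r @ ((X - mu[k]) ** 2).sum(axis=1)) / (d * r.sum()) + 1e-8
            pi[k] = r.mean()
        # ---- E-step: responsibilities = gate prior x expert likelihood ----
        logR = np.zeros((N, K))
        for k in range(K):
            resid = Y - Xb @ W[k]
            log_lik = -0.5 * (resid ** 2).sum(axis=1) / sigma2[k] \
                      - 0.5 * p * np.log(2 * np.pi * sigma2[k])
            log_gate = -0.5 * ((X - mu[k]) ** 2).sum(axis=1) / var[k] \
                       - 0.5 * d * np.log(2 * np.pi * var[k]) + np.log(pi[k])
            logR[:, k] = log_gate + log_lik
        logR -= logR.max(axis=1, keepdims=True)   # stabilize before exp
        R = np.exp(logR)
        R /= R.sum(axis=1, keepdims=True)
    return W, mu, var, pi

def predict(x, W, mu, var, pi):
    """Soft-gated HR prediction for one LR feature vector x of shape (d,)."""
    xb = np.append(x, 1.0)
    log_gate = np.array([
        -0.5 * ((x - mu[k]) ** 2).sum() / var[k]
        - 0.5 * len(x) * np.log(2 * np.pi * var[k]) + np.log(pi[k])
        for k in range(len(pi))])
    g = np.exp(log_gate - log_gate.max()); g /= g.sum()
    return sum(g[k] * (xb @ W[k]) for k in range(len(pi)))
```

At test time this sketch averages expert outputs under the gate posterior; a hard arg-max gate would instead route each patch to a single local regressor, which matches the local-learning view the abstract contrasts against.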
| Original language | English |
|---|---|
| Article number | 7339441 |
| Pages (from-to) | 102–106 |
| Number of pages | 5 |
| Journal | IEEE Signal Processing Letters |
| Volume | 23 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 1 Jan 2016 |
Keywords
- Image super-resolution
- Joint learning
- Linear regression
- Local learning
- Mixture of experts
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering
- Applied Mathematics