TMP: Temporal Motion Propagation for Online Video Super-Resolution

Zhengqiang Zhang, Ruihuang Li, Shi Guo, Yang Cao, Lei Zhang

Research output: Journal article › Academic research › peer-review

3 Citations (Scopus)

Abstract

Online video super-resolution (online-VSR) relies heavily on an effective alignment module to aggregate temporal information, yet the strict latency requirement makes accurate and efficient alignment very challenging. Although much progress has been made, most existing online-VSR methods estimate the motion field of each frame separately to perform alignment, which is computationally redundant and ignores the fact that the motion fields of adjacent frames are correlated. In this work, we propose an efficient Temporal Motion Propagation (TMP) method, which leverages the continuity of motion fields to achieve fast pixel-level alignment across consecutive frames. Specifically, we first propagate the offsets from previous frames to the current frame and then refine them within a local neighborhood, which significantly reduces the matching space and speeds up offset estimation. Furthermore, to enhance the robustness of alignment, we apply spatial weighting to the warped features, assigning higher importance to positions with more accurate offsets. Experiments on benchmark datasets demonstrate that the proposed TMP method achieves leading online-VSR accuracy as well as inference speed. The source code of TMP can be found at https://github.com/xtudbxk/TMP.
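The propagate-then-refine idea in the abstract can be sketched as follows. This is a simplified, hypothetical illustration, not the authors' implementation: the function name, the per-pixel copy-then-local-search scheme, and the `exp(-cost)` confidence weighting are my own assumptions, under the convention that an offset maps a current-frame pixel to its matching position in the previous frame.

```python
import numpy as np

def propagate_and_refine(prev_offsets, prev_feat, cur_feat, radius=1):
    """Hypothetical sketch: reuse each pixel's offset from the previous
    frame as the initial guess, then refine it inside a (2r+1)x(2r+1)
    neighborhood by minimizing an L1 feature-matching cost. Also returns
    a per-pixel confidence weight (higher = better match), mimicking the
    spatial weighting of warped features described in the abstract."""
    H, W = cur_feat.shape[:2]
    refined = np.zeros_like(prev_offsets)
    weights = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            dy0, dx0 = prev_offsets[y, x]          # propagated initial offset
            best_cost, best = np.inf, (dy0, dx0)
            for ddy in range(-radius, radius + 1):  # small refinement search
                for ddx in range(-radius, radius + 1):
                    ty, tx = y + dy0 + ddy, x + dx0 + ddx
                    if 0 <= ty < H and 0 <= tx < W:
                        cost = np.abs(cur_feat[y, x] - prev_feat[ty, tx]).sum()
                        if cost < best_cost:
                            best_cost, best = cost, (dy0 + ddy, dx0 + ddx)
            refined[y, x] = best
            weights[y, x] = np.exp(-best_cost)      # precise offsets -> weight near 1
    return refined, weights
```

Because the search covers only a small neighborhood around the propagated offset rather than the full frame, the matching space shrinks from O(HW) to O((2r+1)^2) candidates per pixel, which is the source of the claimed speedup.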

Original language: English
Pages (from-to): 5014-5028
Number of pages: 15
Journal: IEEE Transactions on Image Processing
Volume: 33
DOIs
Publication status: Published - 2024

Keywords

  • deep neural networks
  • motion compensation
  • video super-resolution

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
