TY - GEN
T1 - Video deflickering using multi-frame optimization
AU - Li, Chao
AU - Chen, Zhihua
AU - Sheng, Bin
AU - Li, Ping
AU - He, Gaoqi
PY - 2018/9
Y1 - 2018/9
N2 - In this paper, we propose an approach for removing flickering artifacts from a video, where the flickering video is obtained by applying image-based processing methods to an originally non-flickering video. Traditional video deflickering methods typically reconstruct flickering frames from non-flickering frames, which fails to preserve spatial consistency, and they are usually designed to address a specific flickering artifact under specific conditions. In contrast, we propose a general multiple-frame video deflickering approach that takes both temporal and spatial coherence into account. Instead of reconstructing a flickering frame only from its preceding frame, we warp multiple corresponding frames to reconstruct the flickering frame, so that warping inaccuracy in the reconstruction process is reduced. By taking advantage of video fidelity, temporal coherence, and spatial coherence, we formulate the video deflickering objective as a least-squares energy. A non-flickering output video is obtained by solving the constructed energy formulation with least angle regression. Results of visual quality, objective measurement, and a user study demonstrate the effectiveness of our proposed multiple-frame video deflickering approach.
AB - In this paper, we propose an approach for removing flickering artifacts from a video, where the flickering video is obtained by applying image-based processing methods to an originally non-flickering video. Traditional video deflickering methods typically reconstruct flickering frames from non-flickering frames, which fails to preserve spatial consistency, and they are usually designed to address a specific flickering artifact under specific conditions. In contrast, we propose a general multiple-frame video deflickering approach that takes both temporal and spatial coherence into account. Instead of reconstructing a flickering frame only from its preceding frame, we warp multiple corresponding frames to reconstruct the flickering frame, so that warping inaccuracy in the reconstruction process is reduced. By taking advantage of video fidelity, temporal coherence, and spatial coherence, we formulate the video deflickering objective as a least-squares energy. A non-flickering output video is obtained by solving the constructed energy formulation with least angle regression. Results of visual quality, objective measurement, and a user study demonstrate the effectiveness of our proposed multiple-frame video deflickering approach.
KW - deflickering
KW - multiple frames
KW - spatial coherence
KW - temporal coherence
KW - video processing
UR - http://www.scopus.com/inward/record.url?scp=85057121585&partnerID=8YFLogxK
U2 - 10.1109/BigMM.2018.8499461
DO - 10.1109/BigMM.2018.8499461
M3 - Conference article published in proceeding or book
AN - SCOPUS:85057121585
T3 - 2018 IEEE 4th International Conference on Multimedia Big Data, BigMM 2018
SP - 1
EP - 7
BT - 2018 IEEE 4th International Conference on Multimedia Big Data, BigMM 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 4th IEEE International Conference on Multimedia Big Data, BigMM 2018
Y2 - 13 September 2018 through 16 September 2018
ER -