Adaptive search range by depth variant decaying weights for HEVC inter texture coding

Tsz Kwan Lee, Yui Lam Chan, Wan Chi Siu

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

The emerging High Efficiency Video Coding (HEVC) standard outperforms H.264 with about a 50% bitrate reduction at almost the same perceptual quality. However, it incurs higher coding complexity because it adopts a recursive block-partitioning mechanism in motion estimation (ME) with a fixed search range. To reduce the computational burden of HEVC, this paper proposes an adaptive search range algorithm that uses depth-map information. Weights for the neighboring blocks are derived from the depth-intensity variations among them, and the weighted sum of the neighboring blocks' motions is formulated to provide a suitable search range for each block. Simulation results demonstrate that the proposed adaptive search range is compatible with not only full search (FS) but also the fast Test Zone Search (TZS) in HEVC, and that the proposed algorithm significantly reduces coding time on average with negligible rate-distortion degradation.
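The abstract's idea, deciding each block's ME search range from a depth-weighted sum of neighboring blocks' motions, can be sketched as follows. This is only an illustrative interpretation: the function name, the exponential decay weighting, and all parameters (`decay`, `sr_min`, `sr_max`) are assumptions, not the authors' published formulation.

```python
import math

def adaptive_search_range(neighbor_mvs, neighbor_depths, cur_depth,
                          sr_min=8, sr_max=64, decay=0.1):
    """Hypothetical sketch: predict a search range for the current block.

    neighbor_mvs:    list of (mvx, mvy) motion vectors of neighboring blocks
    neighbor_depths: depth-map intensities of those neighboring blocks
    cur_depth:       depth-map intensity of the current block
    decay:           assumed rate at which weights decay with depth difference
    """
    if not neighbor_mvs:
        return sr_max  # no prediction available; fall back to the full range

    # Weights decay with depth-intensity difference: neighbors at a similar
    # depth (likely the same object) contribute more to the prediction.
    weights = [math.exp(-decay * abs(d - cur_depth)) for d in neighbor_depths]
    total = sum(weights)

    # Weighted sum of neighbor motion magnitudes predicts the local motion.
    predicted = sum(w * max(abs(mx), abs(my))
                    for w, (mx, my) in zip(weights, neighbor_mvs)) / total

    # Clamp the predicted motion (with a safety margin) to a valid window.
    return int(min(sr_max, max(sr_min, round(2 * predicted))))
```

Blocks surrounded by slow-moving neighbors at the same depth thus get a small search range, while fast motion or depth discontinuities push the range toward the fixed-search-range upper bound.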
Original language: English
Title of host publication: 2017 IEEE International Conference on Multimedia and Expo, ICME 2017
Publisher: IEEE Computer Society
Pages: 1249-1254
Number of pages: 6
ISBN (Electronic): 9781509060672
DOIs
Publication status: Published - 28 Aug 2017
Event: 2017 IEEE International Conference on Multimedia and Expo, ICME 2017 - Hong Kong, Hong Kong
Duration: 10 Jul 2017 - 14 Jul 2017

Conference

Conference: 2017 IEEE International Conference on Multimedia and Expo, ICME 2017
Country: Hong Kong
City: Hong Kong
Period: 10/07/17 - 14/07/17

Keywords

  • Adaptive search range
  • Complexity reduction
  • Depth intensity
  • High-efficiency video coding
  • Motion estimation

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
