Priority search technique for MPEG-4 motion estimation of arbitrarily shaped video object

K. C. Hui, Yui Lam Chan, W. C. Siu

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

3 Citations (Scopus)

Abstract

One of the main differences between MPEG-4 video and previously standardized video coding schemes is the support of arbitrarily shaped video objects, for which most existing fast motion estimation algorithms are not suitable. Conventional fast motion estimation algorithms work well for opaque macroblocks, but not for boundary macroblocks, whose error surfaces contain a large number of local minima. In this paper, we propose a fast search algorithm that incorporates the binary alpha-plane to predict the motion vectors of boundary macroblocks accurately. In addition, these accurate motion vectors are used to develop a novel priority search algorithm, an efficient search strategy for the remaining opaque macroblocks. Experimental results show that, compared with conventional methods, our approach requires low computational complexity and provides a significant improvement in the accuracy of motion-compensated video object planes.
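
As a rough illustration of the idea described in the abstract, the sketch below computes an alpha-masked sum of absolute differences (SAD) for a boundary macroblock, i.e. the matching error is accumulated only over pixels that the binary alpha-plane marks as belonging to the video object. The function names, parameters, and the exhaustive search loop are illustrative assumptions for this sketch, not the paper's actual fast search algorithm.

```python
import numpy as np

def masked_sad(cur_mb: np.ndarray, ref_block: np.ndarray, alpha_mb: np.ndarray) -> int:
    """SAD accumulated over object pixels only.

    cur_mb    : 16x16 luminance block of the current VOP
    ref_block : 16x16 candidate block from the reference VOP
    alpha_mb  : 16x16 binary alpha-plane (non-zero = object pixel)
    """
    mask = alpha_mb != 0
    return int(np.abs(cur_mb[mask].astype(np.int32)
                      - ref_block[mask].astype(np.int32)).sum())

def search_boundary_mb(cur_mb, alpha_mb, ref_frame, x0, y0, search_range=16):
    """Brute-force search over a +/-search_range window using the masked SAD.

    A fast algorithm such as the one in the paper would restrict the set of
    candidate vectors; only the error measure is illustrated here.
    """
    best_mv, best_cost = (0, 0), None
    h, w = ref_frame.shape
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + 16 > h or x + 16 > w:
                continue
            cost = masked_sad(cur_mb, ref_frame[y:y + 16, x:x + 16], alpha_mb)
            if best_cost is None or cost < best_cost:
                best_cost, best_mv = cost, (dx, dy)
    return best_mv, best_cost
```

The motion vectors found for boundary macroblocks in this way could then serve as predictors that prioritize candidate vectors for the neighbouring opaque macroblocks, which is the role of the priority search strategy described in the abstract.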
Original language: English
Title of host publication: IEEE International Conference on Image Processing
Pages: 644-647
Number of pages: 4
Publication status: Published - 1 Jan 2001
Event: IEEE International Conference on Image Processing (ICIP) - Thessaloniki, Greece
Duration: 7 Oct 2001 – 10 Oct 2001

Conference

Conference: IEEE International Conference on Image Processing (ICIP)
Country/Territory: Greece
City: Thessaloniki
Period: 7/10/01 – 10/10/01

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Electrical and Electronic Engineering
