Nonlinear dimensionality reduction with hybrid distance for trajectory representation of dynamic texture

Yang Liu, Yan Liu, Chun Chung Chan

Research output: Journal article › Academic research › peer-review

7 Citations (Scopus)

Abstract

Dynamic textures play an important role in video content analysis. Current work on dynamic textures mainly focuses on overall texture and motion analysis for segmentation or classification, based on statistical features and structural models. This paper proposes a novel framework for studying dynamic textures by exploring the motion trajectory with unsupervised learning. A nonlinear dimensionality reduction algorithm, called hybrid distance isometric embedding (HDIE), is proposed to generate a low-dimensional motion trajectory from the high-dimensional feature space of the raw video data. First, we partition the high-dimensional data points into a set of clusters and construct intra-cluster graphs based on the individual characteristics of each cluster, forming the basic layer of HDIE. Second, we construct an inter-cluster graph by analyzing the interrelations among these isolated clusters, forming the top layer of HDIE. Finally, we combine the layers into a single graph and map all data points into one low-dimensional feature space, aiming to preserve the pairwise distances of the high-dimensional data points. Experiments on a standard dynamic texture database show that the proposed framework with the novel algorithm represents the motion characteristics of dynamic textures well.
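The abstract outlines a two-layer, Isomap-style pipeline but does not include an implementation. The following is a minimal sketch of that general idea, under assumptions the abstract does not confirm: k-means for the partition step, kNN graphs for the intra-cluster (basic) layer, closest-pair links for the inter-cluster (top) layer, and classical MDS on graph-geodesic distances for the final embedding. The function name hdie_sketch and all parameters are illustrative, not the authors' code or their actual hybrid distance construction.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import shortest_path
from sklearn.cluster import KMeans
from sklearn.neighbors import kneighbors_graph


def hdie_sketch(X, n_clusters=5, n_neighbors=8, n_components=2):
    """Cluster-wise graph construction followed by an Isomap-style embedding.

    Rough approximation of the two-layer idea in the abstract: intra-cluster
    kNN graphs (basic layer) joined by shortest inter-cluster links (top
    layer), then classical MDS on the resulting geodesic distances.
    """
    n = X.shape[0]
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)

    # Basic layer: a kNN graph inside each cluster.
    W = lil_matrix((n, n))
    for c in range(n_clusters):
        idx = np.flatnonzero(labels == c)
        k = min(n_neighbors, len(idx) - 1)
        if k < 1:
            continue
        G = kneighbors_graph(X[idx], k, mode="distance")
        for i, j in zip(*G.nonzero()):
            W[idx[i], idx[j]] = G[i, j]

    # Top layer: connect each pair of clusters by their closest point pair.
    for a in range(n_clusters):
        for b in range(a + 1, n_clusters):
            ia = np.flatnonzero(labels == a)
            ib = np.flatnonzero(labels == b)
            D = np.linalg.norm(X[ia][:, None] - X[ib][None, :], axis=2)
            i, j = np.unravel_index(D.argmin(), D.shape)
            W[ia[i], ib[j]] = D[i, j]

    # Geodesic distances over the combined graph, then classical MDS.
    D = shortest_path(W.tocsr(), method="D", directed=False)
    D[np.isinf(D)] = D[np.isfinite(D)].max()  # guard against disconnected parts
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * H @ (D ** 2) @ H              # double-centered squared distances
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))
```

With the default parameters this maps a point cloud to 2-D coordinates, e.g. `Y = hdie_sketch(np.random.rand(200, 50))`; replacing the closest-pair inter-cluster links with the paper's hybrid distance would be the substantive difference from plain Isomap.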
Original language: English
Pages (from-to): 2375-2395
Number of pages: 21
Journal: Signal Processing
Volume: 90
Issue number: 8
DOIs
Publication status: Published - 1 Aug 2010

Keywords

  • Dynamic texture
  • Hybrid distance isometric embedding
  • Nonlinear dimensionality reduction
  • Video trajectory

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
