Robust and real-time texture analysis system using a distributed workstation cluster

Jia You, H. A. Cohen, W. P. Zhu, E. Pissaloux

Research output: Journal article publication › Conference article › Academic research › peer-review

1 Citation (Scopus)

Abstract

This paper presents a parallel approach to the development of a real-time system for the recognition of textured objects. To identify and localize objects in textured images invariant to translation, rotation, scale change and occlusion, we propose a new method that combines dynamic texture feature extraction with hierarchical image matching. Building on our previous work, we extend the concept of interesting points and develop a dynamic detection procedure on the texture energy image, which combines Laws' texture energy concept with our mask tuning scheme. The search for the best fit between two objects in terms of the Hausdorff distance is guided through an interesting-point pyramid from coarse to fine levels. In addition, unlike current approaches that mostly rely on specialized multiprocessor architectures for fast processing, we use a distributed workstation cluster to support parallelism, which provides a different approach to real-time computing and is applicable to many classes of tasks.
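To make two of the ingredients named in the abstract concrete, the sketch below shows (1) a Laws-style texture energy map built by convolving an image with an outer-product mask and locally averaging the absolute response, and (2) a symmetric Hausdorff distance between two point sets. This is not the authors' implementation: the mask pair, window size, and the percentile rule used to pick "interesting points" are illustrative assumptions only, and the paper's mask tuning scheme, pyramid search, and cluster parallelism are not reproduced here.

```python
# Minimal sketch, assuming NumPy/SciPy; mask choice (L5xE5), window=15 and the
# 99th-percentile "interesting point" rule are illustrative, not from the paper.
import numpy as np
from scipy.ndimage import convolve, uniform_filter

# Laws' 1-D vectors; 2-D masks are outer products of pairs of these.
L5 = np.array([ 1.,  4., 6.,  4.,  1.])   # level
E5 = np.array([-1., -2., 0.,  2.,  1.])   # edge

def texture_energy(image, row_vec, col_vec, window=15):
    """Convolve with a Laws mask and average the absolute response locally."""
    mask = np.outer(row_vec, col_vec)
    response = convolve(image.astype(float), mask, mode='reflect')
    return uniform_filter(np.abs(response), size=window, mode='reflect')

def hausdorff(points_a, points_b):
    """Symmetric Hausdorff distance between two (N, 2) arrays of points."""
    diff = points_a[:, None, :] - points_b[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))        # pairwise Euclidean distances
    return max(d.min(axis=1).max(), d.min(axis=0).max())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((128, 128))
    energy = texture_energy(img, L5, E5)         # one texture-energy plane
    # Hypothetical "interesting points": strongest energy responses.
    pts_model = np.argwhere(energy > np.percentile(energy, 99))
    pts_scene = pts_model + rng.integers(-2, 3, pts_model.shape)
    print("Hausdorff distance:", hausdorff(pts_model, pts_scene))
```

In the paper's scheme this matching is applied hierarchically, from a coarse level of the interesting-point pyramid down to the finest level, rather than on a single flat point set as in this toy example.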
Original language: English
Pages (from-to): 2207-2210
Number of pages: 4
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 4
Publication status: Published - 1 Jan 1996
Externally published: Yes
Event: Proceedings of the 1996 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP. Part 1 (of 6) - Atlanta, GA, United States
Duration: 7 May 1996 - 10 May 1996

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Signal Processing
  • Acoustics and Ultrasonics
