A light field sparse representation structure and its fast coding technique

Jie Chen, Alexander Matyasko, Lap Pui Chau

Research output: Chapter in book / Conference proceeding · Conference article published in proceeding or book · Academic research · peer-review

2 Citations (Scopus)

Abstract

The dimensionality of light field data is typically too large for efficient implementation of sparse representation algorithms, such as dictionary training and sparse coding. We propose a framework for creating a light field dictionary via perspective shearing. Such a dictionary has a specially organized structure over different central-view patterns and perspective disparities. Based on this dictionary structure, a two-stage sparse coding algorithm is proposed to speed up reconstruction by incorporating an interim Winner-Take-All (WTA) hash coding stage into the Orthogonal Matching Pursuit (OMP) algorithm; this stage speeds up the sparse coding process by almost a factor of three while maintaining reconstruction quality. The proposed scheme achieves high light field reconstruction quality for compressed light field sensing.
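The two-stage idea described above (a WTA-hash screening pass that prunes the dictionary before OMP runs) might be sketched roughly as follows. This is a minimal illustration of the general technique, not the paper's actual implementation: the function names, the code-agreement screening rule, and all parameter values (`K`, `n_candidates`, `sparsity`) are assumptions for the sake of the example.

```python
import numpy as np

def wta_hash(X, perms, K):
    """Winner-Take-All hash: for each permutation, permute the rows of X and
    record which of the first K permuted entries is largest (an ordinal code).
    X: (d, n) columns to hash; perms: (H, d) row permutations."""
    codes = np.empty((perms.shape[0], X.shape[1]), dtype=np.int64)
    for h, p in enumerate(perms):
        codes[h] = np.argmax(X[p[:K], :], axis=0)
    return codes  # (H, n) integer codes, one per hash per column

def two_stage_omp(D, y, perms, K=4, n_candidates=16, sparsity=3):
    """Stage 1: keep only the atoms whose WTA codes agree most with y's codes.
    Stage 2: run ordinary OMP on the surviving sub-dictionary (illustrative
    parameter choices, not the paper's)."""
    d_codes = wta_hash(D, perms, K)
    y_codes = wta_hash(y[:, None], perms, K)
    matches = (d_codes == y_codes).sum(axis=0)   # per-atom code agreement
    cand = np.argsort(matches)[-n_candidates:]   # best-matching atoms survive
    # --- Stage 2: OMP restricted to the candidate sub-dictionary ---
    Dc = D[:, cand]
    residual = y.copy()
    support = []
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(Dc.T @ residual)))  # most correlated atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Dc[:, support], y, rcond=None)
        residual = y - Dc[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[cand[support]] = coef                      # map back to full dictionary
    return x
```

The screening stage replaces a full `D.T @ residual` correlation over all atoms with cheap integer-code comparisons, which is where the reported near-threefold speed-up would come from; the sub-dictionary OMP then proceeds as usual.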

Original language: English
Title of host publication: 2014 19th International Conference on Digital Signal Processing, DSP 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 214-218
Number of pages: 5
ISBN (Electronic): 9781479946129
DOIs
Publication status: Published - Aug 2014
Externally published: Yes
Event: 2014 19th International Conference on Digital Signal Processing, DSP 2014 - Hong Kong, Hong Kong
Duration: 20 Aug 2014 – 23 Aug 2014

Publication series

Name: International Conference on Digital Signal Processing, DSP
Volume: 2014-January

Conference

Conference: 2014 19th International Conference on Digital Signal Processing, DSP 2014
Country/Territory: Hong Kong
City: Hong Kong
Period: 20/08/14 – 23/08/14

Keywords

  • Compressed sensing
  • Light field
  • Perspective shearing
  • Two-stage coding
  • WTA hashing

ASJC Scopus subject areas

  • Signal Processing
