Abstract
This paper presents an interactive retexturing approach that preserves the underlying texture distortion of the original image or video in the retextured result. The system offers real-time interactive feedback for defining target objects, selecting and resizing textures, and tuning the overall lighting, exploiting modern GPU parallelism. Existing retexturing and synthesis methods handle texture distortion by manipulating inter-pixel distances, and the underlying texture distortion of the original image is often destroyed by limitations such as improper distortion introduced by manual mesh stretching or unavoidable texture splitting during synthesis. Their long processing times, caused by expensive filtering, are also impractical for interactive use. We propose to exploit SIFT corner features to naturally discover the underlying texture distortion, and apply gradient-based depth recovery and wrinkle energy optimization to carry out the distortion process. Interactive retexturing driven by user input is enabled through a real-time bilateral grid and feature-guided texture distortion optimization implemented with CUDA parallelism, while video retexturing is accomplished by keyframe-based texture transfer using real-time TV-L1 optical flow combined with patch-based block motion techniques. Our interactive retexturing with feature-guided gradient optimization produces realistic results while preserving the salient texture distortion around corner features. In the experiments, our method consistently demonstrates high-quality image and video retexturing with real-time interactive feedback.
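The abstract describes a pipeline of SIFT feature detection inside a user-selected region, feature-guided distortion of texture coordinates, and shading-preserving compositing. The following is a minimal Python/OpenCV sketch of that general idea only; it is not the authors' CUDA implementation, and the function `retexture_region`, its parameters, and the Gaussian perturbation heuristic around keypoints are illustrative assumptions.

```python
import cv2
import numpy as np

def retexture_region(image, mask, texture, tex_scale=1.0):
    """Composite `texture` over the masked region of `image`, locally
    perturbing texture coordinates near SIFT keypoints as a crude
    stand-in for feature-guided texture distortion."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    keypoints = sift.detect(gray, mask)  # detect only inside the target region

    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Base texture coordinates: simple tiling scaled by the user-chosen size.
    u, v = xs * tex_scale, ys * tex_scale

    # Shift coordinates around each keypoint according to its scale and angle,
    # falling off with a Gaussian weight (an assumed, simplified heuristic).
    for kp in keypoints:
        cx, cy = kp.pt
        sigma = max(kp.size, 1.0)
        theta = np.deg2rad(kp.angle)
        weight = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))
        u += weight * sigma * np.cos(theta)
        v += weight * sigma * np.sin(theta)

    th, tw = texture.shape[:2]
    warped = cv2.remap(texture, u % tw, v % th, cv2.INTER_LINEAR)

    # Preserve the original shading by modulating the new texture
    # with the normalized luminance of the input image.
    shading = (gray.astype(np.float32) / max(gray.mean(), 1.0))[..., None]
    out = image.copy()
    region = mask > 0
    out[region] = np.clip(warped[region] * shading[region], 0, 255).astype(np.uint8)
    return out
```

In the paper, the corresponding steps (bilateral-grid lighting, wrinkle energy optimization, and the per-pixel distortion solve) run in parallel on the GPU via CUDA, which is what makes the feedback real-time; the sketch above only conveys the data flow on the CPU.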
Original language | English |
---|---|
Pages (from-to) | 1048-1059 |
Number of pages | 12 |
Journal | Computers and Graphics (Pergamon) |
Volume | 36 |
Issue number | 8 |
DOIs | |
Publication status | Published - Dec 2012 |
Externally published | Yes |
Keywords
- Bilateral grid
- GPU
- Real-time processing
- Retexturing
ASJC Scopus subject areas
- General Engineering
- Human-Computer Interaction
- Computer Graphics and Computer-Aided Design