Effective Prior Regularized Sparse Learning

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

Neural radiance fields (NeRF) can synthesize novel views from a set of input images, which has attracted a great deal of interest in recent years. Typical methods require tens of images for view synthesis, which limits the potential applications of NeRF. In this paper, we propose a novel framework for view synthesis in a sparse setting that tactically imposes a regularization using prior information extracted from a pretrained network. We design a network model that trains a prior field and a color field simultaneously, and the network integrates this prior knowledge for better novel view synthesis. Experiments on two benchmark datasets demonstrate the effectiveness and robustness of our method, and show that our framework can be adapted to other existing methods to synthesize higher-quality outputs in a sparse setting.
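The regularized objective outlined in the abstract could be sketched as follows, under the assumption (not stated in the abstract) that the prior term enters as a simple weighted penalty added to the photometric reconstruction loss; the function and argument names are illustrative, not the paper's exact formulation:

```python
import numpy as np

def regularized_loss(rendered_rgb, target_rgb, prior_pred, prior_target, lam=0.1):
    """Hypothetical combined objective for sparse-view training:
    photometric color loss plus a prior-regularization term weighted
    by lam. The weighted-sum form is an illustrative assumption."""
    # MSE between rendered pixel colors and ground-truth pixels (color field)
    color_loss = np.mean((rendered_rgb - target_rgb) ** 2)
    # Penalty pulling the prior field toward the pretrained network's output
    prior_loss = np.mean((prior_pred - prior_target) ** 2)
    return color_loss + lam * prior_loss
```

With `lam = 0` this reduces to the standard NeRF photometric loss; increasing `lam` trades reconstruction fidelity against agreement with the pretrained prior.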

Original language: English
Title of host publication: Computer Vision – ECCV 2024 Workshops, Proceedings
Editors: Alessio Del Bue, Cristian Canton, Jordi Pont-Tuset, Tatiana Tommasi
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 246–260
Number of pages: 15
ISBN (Print): 9783031918551
DOIs
Publication status: Published - 2025
Event: Workshops held in conjunction with the 18th European Conference on Computer Vision, ECCV 2024 - Milan, Italy
Duration: 29 Sept 2024 – 4 Oct 2024

Publication series

Name: Lecture Notes in Computer Science
Volume: 15632 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Workshops held in conjunction with the 18th European Conference on Computer Vision, ECCV 2024
Country/Territory: Italy
City: Milan
Period: 29/09/24 – 4/10/24

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
