TY - JOUR
T1 - Enhanced 3D mapping with an RGB-D sensor via integration of depth measurements and image sequences
AU - Wu, Bo
AU - Ge, Xuming
AU - Xie, Linfu
AU - Chen, Wu
PY - 2019/9
Y1 - 2019/9
N2 - State-of-the-art visual simultaneous localization and mapping (SLAM) techniques greatly facilitate three-dimensional (3D) mapping and modeling with the use of low-cost red-green-blue-depth (RGB-D) sensors. However, the effective range of such sensors is limited due to the working range of the infrared (IR) camera, which provides depth information, and thus the practicability of such sensors in 3D mapping and modeling is limited. To address this limitation, we present a novel solution for enhanced 3D mapping using a low-cost RGB-D sensor. We carry out state-of-the-art visual SLAM to obtain 3D point clouds within the mapping range of the RGB-D sensor and implement an improved structure-from-motion (SfM) on the collected RGB image sequences with additional constraints from the depth information to produce image-based 3D point clouds. We then develop a feature-based scale-adaptive registration to merge the gained point clouds to further generate enhanced and extended 3D mapping results. We use two challenging test sites to examine the proposed method. At these two sites, the coverage of both generated 3D models increases by more than 50% with the proposed solution. Moreover, the proposed solution achieves a geometric accuracy of about 1% in a measurement range of about 20 m. These positive experimental results not only demonstrate the feasibility and practicality of the proposed solution but also its potential.
AB - State-of-the-art visual simultaneous localization and mapping (SLAM) techniques greatly facilitate three-dimensional (3D) mapping and modeling with the use of low-cost red-green-blue-depth (RGB-D) sensors. However, the effective range of such sensors is limited due to the working range of the infrared (IR) camera, which provides depth information, and thus the practicability of such sensors in 3D mapping and modeling is limited. To address this limitation, we present a novel solution for enhanced 3D mapping using a low-cost RGB-D sensor. We carry out state-of-the-art visual SLAM to obtain 3D point clouds within the mapping range of the RGB-D sensor and implement an improved structure-from-motion (SfM) on the collected RGB image sequences with additional constraints from the depth information to produce image-based 3D point clouds. We then develop a feature-based scale-adaptive registration to merge the gained point clouds to further generate enhanced and extended 3D mapping results. We use two challenging test sites to examine the proposed method. At these two sites, the coverage of both generated 3D models increases by more than 50% with the proposed solution. Moreover, the proposed solution achieves a geometric accuracy of about 1% in a measurement range of about 20 m. These positive experimental results not only demonstrate the feasibility and practicality of the proposed solution but also its potential.
UR - http://www.scopus.com/inward/record.url?scp=85071858655&partnerID=8YFLogxK
U2 - 10.14358/PERS.85.9.633
DO - 10.14358/PERS.85.9.633
M3 - Journal article
AN - SCOPUS:85071858655
SN - 0099-1112
VL - 85
SP - 633
EP - 642
JO - Photogrammetric Engineering and Remote Sensing
JF - Photogrammetric Engineering and Remote Sensing
IS - 9
ER -