TY - JOUR
T1 - PolyU-BPCoMa: A dataset and benchmark towards mobile colorized mapping using a backpack multisensorial system
AU - Shi, Wenzhong
AU - Chen, Pengxin
AU - Wang, Muyang
AU - Bao, Sheng
AU - Xiang, Haodong
AU - Yu, Yue
AU - Yang, Daping
Publisher Copyright:
© 2022 The Author(s)
PY - 2022/8
Y1 - 2022/8
N2 - Constructing colorized point clouds from mobile laser scanning and images is a fundamental task in surveying and mapping. It is also an essential prerequisite for building digital twins for smart cities. However, existing public datasets are either relatively small in scale or lack accurate geometrical and color ground truth. This paper documents a multisensorial dataset named PolyU-BPCoMA which is distinctively positioned towards mobile colorized mapping. The dataset incorporates 3D LiDAR, spherical imaging, GNSS and IMU resources on a backpack platform. Color checker boards are pasted in each surveyed area as targets, and ground truth data are collected by an advanced terrestrial laser scanner (TLS). 3D geometrical and color information can be recovered from the colorized point clouds produced by the backpack system and the TLS, respectively. Accordingly, the dataset provides an opportunity to benchmark mapping and colorization accuracy simultaneously for a mobile multisensorial system. The dataset is approximately 800 GB in size, covering both indoor and outdoor environments. The dataset and development kits are available at https://github.com/chenpengxin/PolyU-BPCoMa.git
KW - Backpack multisensorial system
KW - Mobile colorized mapping
UR - http://www.scopus.com/inward/record.url?scp=85135953902&partnerID=8YFLogxK
U2 - 10.1016/j.jag.2022.102962
DO - 10.1016/j.jag.2022.102962
M3 - Journal article
SN - 1569-8432
VL - 112
JO - International Journal of Applied Earth Observation and Geoinformation
JF - International Journal of Applied Earth Observation and Geoinformation
M1 - 102962
ER -