TY - JOUR
T1 - Probabilistic End-to-End Vehicle Navigation in Complex Dynamic Environments with Multimodal Sensor Fusion
AU - Cai, Peide
AU - Wang, Sukai
AU - Sun, Yuxiang
AU - Liu, Ming
N1 - Funding Information:
Manuscript received February 24, 2020; accepted April 21, 2020. Date of publication May 11, 2020; date of current version May 22, 2020. This letter was recommended for publication by Associate Editor H. Myung and Editor Y. Choi upon evaluation of the reviewers’ comments. This work was supported in part by the National Natural Science Foundation of China, under Grant U1713211, in part by the Shenzhen Science, Technology and Innovation Commission (SZSTI) under Grant JCYJ20160428154842603, and in part by the Research Grant Council of Hong Kong SAR Government, China, under Project Nos. 11210017 and 21202816, awarded to Prof. Ming Liu. (Corresponding author: Ming Liu.) The authors are with The Hong Kong University of Science and Technology, Hong Kong (e-mail: [email protected]; [email protected]; [email protected]; [email protected]).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/7
Y1 - 2020/7
AB - All-day and all-weather navigation is a critical capability for autonomous driving, which requires proper reaction to varied environmental conditions and complex agent behaviors. Recently, with the rise of deep learning, end-to-end control for autonomous vehicles has been well studied. However, most works are solely based on visual information, which can be degraded by challenging illumination conditions such as dim light or total darkness. In addition, they usually generate and apply deterministic control commands without considering the uncertainties in the future. In this letter, based on imitation learning, we propose a probabilistic driving model with multi-perception capability utilizing the information from the camera, lidar and radar. We further evaluate its driving performance online on our new driving benchmark, which includes various environmental conditions (e.g., urban and rural areas, traffic densities, weather and times of the day) and dynamic obstacles (e.g., vehicles, pedestrians, motorcyclists and bicyclists). The results suggest that our proposed model outperforms baselines and achieves excellent generalization performance in unseen environments with heavy traffic and extreme weather.
KW - Automation technologies for smart cities
KW - autonomous vehicle navigation
KW - motion planning and control
KW - multi-modal perception
KW - sensorimotor learning
UR - http://www.scopus.com/inward/record.url?scp=85085662811&partnerID=8YFLogxK
DO - 10.1109/LRA.2020.2994027
M3 - Journal article
AN - SCOPUS:85085662811
SN - 2377-3766
VL - 5
SP - 4218
EP - 4224
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 3
M1 - 9091334
ER -