Vision-Based Trajectory Planning via Imitation Learning for Autonomous Vehicles

Peide Cai, Yuxiang Sun, Yuying Chen, Ming Liu

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

37 Citations (Scopus)

Abstract

Reliable, human-like trajectory planning in real-world dynamic urban environments is a critical capability for autonomous driving. To this end, we develop a vision- and imitation-learning-based planner to generate collision-free trajectories several seconds into the future. Our network consists of three sub-networks that conduct three basic driving tasks: keeping straight, turning left, and turning right. During planning, high-level commands are received as prior information to select a specific sub-network. We create our dataset from the RobotCar dataset, and the experimental results suggest that our planner reliably generates trajectories across various driving tasks, such as turning at different intersections, lane-keeping on curved roads, and changing lanes for collision avoidance.
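The command-conditional architecture described above can be sketched as a simple dispatch: a high-level command selects one of three sub-networks, each mapping visual features to a short trajectory of waypoints. This is a minimal illustrative sketch only; the function names and the toy "sub-networks" below are assumptions, not the authors' actual model code.

```python
from typing import Callable, Dict, List, Tuple

# A waypoint is (lateral x, longitudinal y) in the vehicle frame, in metres.
Waypoint = Tuple[float, float]
SubNetwork = Callable[[List[float]], List[Waypoint]]


def _straight_net(features: List[float]) -> List[Waypoint]:
    # Placeholder for the learned "keep straight" sub-network:
    # emits waypoints straight ahead, one metre apart.
    return [(0.0, float(i)) for i in range(1, 6)]


def _left_net(features: List[float]) -> List[Waypoint]:
    # Placeholder for the learned "turn left" sub-network:
    # waypoints curve toward negative x.
    return [(-0.2 * i * i, float(i)) for i in range(1, 6)]


def _right_net(features: List[float]) -> List[Waypoint]:
    # Placeholder for the learned "turn right" sub-network:
    # waypoints curve toward positive x.
    return [(0.2 * i * i, float(i)) for i in range(1, 6)]


# The high-level command acts as prior information that picks a sub-network.
SUB_NETWORKS: Dict[str, SubNetwork] = {
    "straight": _straight_net,
    "left": _left_net,
    "right": _right_net,
}


def plan_trajectory(features: List[float], command: str) -> List[Waypoint]:
    """Route the visual features to the sub-network selected by the command."""
    if command not in SUB_NETWORKS:
        raise ValueError(f"unknown high-level command: {command!r}")
    return SUB_NETWORKS[command](features)
```

In the paper's setting each sub-network would be a learned vision model trained by imitation on the corresponding manoeuvre; routing by command rather than training one monolithic network avoids averaging over conflicting behaviours at intersections.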

Original language: English
Title of host publication: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019
Publisher: IEEE
Pages: 2736-2742
Number of pages: 7
ISBN (Electronic): 9781538670248
DOIs
Publication status: Published - Oct 2019
Event: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019 - Auckland, New Zealand
Duration: 27 Oct 2019 - 30 Oct 2019

Publication series

Name: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019

Conference

Conference: 2019 IEEE Intelligent Transportation Systems Conference, ITSC 2019
Country/Territory: New Zealand
City: Auckland
Period: 27/10/19 - 30/10/19

ASJC Scopus subject areas

  • Artificial Intelligence
  • Management Science and Operations Research
  • Instrumentation
  • Transportation
