Abstract
Motor vehicles typically exhibit a "speed-varying range" (SVR) characteristic. For battery-powered electric vehicles (BEVs), range diminishes at higher speeds. This characteristic strongly affects BEV operations in demanding commercial applications such as express delivery, given BEVs' limited range and long recharge times. Motivated by this, this article examines a new electric vehicle routing problem (VRP) that explicitly models BEVs' SVR and jointly plans BEV routes, speeds, and charging under stochastic traffic conditions. A deep reinforcement learning (DRL) approach that exploits the interdependence among these three decision aspects is then developed to generate real-time policies. Experiments on hypothetical and real-world instances show that the proposed approach efficiently finds high-quality policies that effectively accommodate BEVs' SVR.
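The SVR characteristic can be conveyed with a simple constant-speed energy model: per-kilometre consumption rises with speed because aerodynamic losses grow roughly with the square of speed, so driving range shrinks as the vehicle travels faster. The sketch below is only a hypothetical illustration under assumed coefficients and functional form; it is not the energy model used in the article.

```python
# Minimal, hypothetical sketch of the speed-varying range (SVR) effect.
# All coefficients and the functional form are illustrative assumptions,
# not the BEV energy model developed in the article.

def energy_per_km(speed_kmh: float) -> float:
    """Approximate traction + auxiliary energy use (kWh/km) at a constant speed."""
    p_aux = 1.0          # auxiliary power draw in kW (HVAC, electronics) -- assumed
    c_roll = 0.10        # rolling-resistance term, kWh per km -- assumed
    c_aero = 1.2e-5      # aerodynamic term, kWh per km per (km/h)^2 -- assumed
    # Auxiliary energy per km grows as speed drops (more time spent per km);
    # aerodynamic losses per km grow roughly with the square of speed.
    return p_aux / speed_kmh + c_roll + c_aero * speed_kmh ** 2

def range_km(battery_kwh: float, speed_kmh: float) -> float:
    """Driving range at a constant speed for a given usable battery capacity."""
    return battery_kwh / energy_per_km(speed_kmh)

if __name__ == "__main__":
    # With a 60 kWh usable battery, range drops steadily as cruise speed rises.
    for v in (40, 60, 80, 100, 120):
        print(f"{v:>3} km/h -> {range_km(60.0, v):6.1f} km")
```

Under these assumed parameters the estimated range falls from roughly 420 km at 40 km/h to about 210 km at 120 km/h, which is the trade-off that couples the routing, speed, and charging decisions considered in the article.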
| Original language | English |
|---|---|
| Pages (from-to) | 7066-7082 |
| Number of pages | 17 |
| Journal | IEEE Transactions on Transportation Electrification |
| Volume | 11 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Apr 2025 |
Keywords
- Deep reinforcement learning (DRL)
- delivery planning
- electric vehicle (EV)
- speed-varying range (SVR)
- uncertain traffic condition
ASJC Scopus subject areas
- Automotive Engineering
- Transportation
- Energy Engineering and Power Technology
- Electrical and Electronic Engineering