Research review on behavior strategies of electric vehicles considering charging demands
Abstract: To improve the usability and operational efficiency of electric vehicles, this review surveys related research from three perspectives: charging behavior strategies (charging station recommendation and charging path planning), behavior strategies for passenger service (ride-sharing and car rental scenarios), and behavior strategies under vehicle-to-grid (V2G) interaction. The principles and applications of artificial intelligence techniques are summarized, and future research directions are discussed. The results show that research on charging station recommendation concentrates on two optimization objectives, time cost and charging fee, and typically applies heuristic or reinforcement learning algorithms to find the optimal station. Charging path planning must construct path energy constraints and energy recovery mechanisms tailored to electric vehicles, and generally optimizes routes for time, energy, and other objectives using Pareto-optimal methods or reinforcement learning. In ride-sharing scenarios, behavior strategy research mainly exploits the temporal and spatial distributions of orders and coordinates order dispatching, charging, and repositioning to maximize fleet profit; in car rental scenarios, charging and repositioning are used to keep enough available electric vehicles at service stations to meet user demand. Research on behavior strategies in V2G scenarios concentrates on three optimization objectives, namely charging/discharging cost effectiveness, power grid stability, and energy utilization efficiency, and typically optimizes charging/discharging behavior with mathematical programming or reinforcement learning. Future research should examine how the introduction of autonomous driving changes charging behavior, improve the interpretability and scalability of models, and, at the system level, further account for battery degradation and integrated scheduling.
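As a concrete illustration of the Pareto-based multi-objective route evaluation mentioned above, the following minimal Python sketch filters candidate routes scored by travel time and energy use down to their non-dominated front. The route data and field names are invented for illustration and do not come from any cited paper.

```python
"""Minimal sketch: Pareto filtering of candidate routes scored by (time, energy)."""

def pareto_front(candidates):
    """Return the non-dominated candidates.

    A candidate dominates another if it is no worse in both objectives
    and strictly better in at least one.
    """
    front = []
    for c in candidates:
        dominated = any(
            o["time"] <= c["time"] and o["energy"] <= c["energy"]
            and (o["time"] < c["time"] or o["energy"] < c["energy"])
            for o in candidates
        )
        if not dominated:
            front.append(c)
    return front


if __name__ == "__main__":
    routes = [
        {"id": "A", "time": 35.0, "energy": 6.2},   # minutes, kWh
        {"id": "B", "time": 42.0, "energy": 5.1},
        {"id": "C", "time": 50.0, "energy": 5.8},   # dominated by B
    ]
    print([r["id"] for r in pareto_front(routes)])  # ['A', 'B']
```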
Table 1. Comparison of literature on charging station recommendation

Reference | Optimization objective | Solution method | Features
[5] | Minimize time cost and added travel distance | Deep Q-learning | Distinguishes fast and slow charging; accounts for the extra travel distance caused by charging
[6] | Minimize time cost | Particle swarm optimization | Considers load balancing among charging stations
[7] | Minimize time cost | Simulated annealing | Predicts charging demand to obtain more accurate queuing times
[8] | Minimize time cost, charging fee, and remaining battery energy | Genetic algorithm | Considers fast, slow, and partial charging options
[9] | Minimize time cost | Lagrangian relaxation | Scales well, handling fleets of up to 1 000 vehicles
[10] | Minimize time cost | Multi-machine scheduling algorithm | Guarantees an upper bound on waiting time
[11] | Minimize the number of charging events | Column generation | Proposes a hierarchical decision architecture to improve real-time performance
[17] | Minimize travel cost and electricity cost | Dynamic programming | Targets private cars that leave home, pass through destinations, and finally return home
[18] | Maximize individual driver profit | Game theory | Uses a negotiation mechanism to resolve drivers' competition for charging resources
[12] | Minimize time cost, charging fee, and charging failure rate | Actor-Critic (AC) algorithm | Adopts a centralized critic with attention-based fusion
[13] | Minimize time cost while satisfying user preference | AC algorithm | Balances the time-optimal station against the user-preferred station
[14] | Minimize time cost, charging fee, and charging failure rate | AC algorithm | Uses graph representation learning to fuse the relations between charging stations and EVs and among charging stations
[15] | Minimize time cost and charging fee | Rainbow algorithm | Captures EV-charging station interactions through graph convolution
[16] | Minimize charging fee | AC algorithm | Uses an electricity price function to relate price to load
[19] | Minimize charging fee | Game theory | Coordinates the charging actions of multiple fleets
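Most entries in Table 1 score candidate stations by some combination of time cost (driving, queuing, charging) and charging fee. The sketch below is an illustrative greedy baseline in that spirit, not a reimplementation of any cited method; the station data, the fee weight, and the linear cost form are assumptions.

```python
"""Illustrative sketch: rank charging stations by time cost plus weighted fee."""

from dataclasses import dataclass


@dataclass
class Station:
    name: str
    drive_min: float      # estimated driving time to the station
    queue_min: float      # expected waiting time in the queue
    power_kw: float       # charging power
    price_per_kwh: float  # charging fee


def recommend(stations, energy_needed_kwh, fee_weight=0.5):
    """Pick the station minimizing time cost + weighted charging fee (assumed form)."""
    def cost(s):
        charge_min = 60.0 * energy_needed_kwh / s.power_kw
        time_cost = s.drive_min + s.queue_min + charge_min
        fee = energy_needed_kwh * s.price_per_kwh
        return time_cost + fee_weight * fee
    return min(stations, key=cost)


if __name__ == "__main__":
    candidates = [
        Station("CS-1", drive_min=8, queue_min=15, power_kw=60, price_per_kwh=1.6),
        Station("CS-2", drive_min=14, queue_min=2, power_kw=120, price_per_kwh=1.9),
    ]
    # The faster charger wins despite its higher fee under these toy numbers.
    print(recommend(candidates, energy_needed_kwh=30).name)  # CS-2
```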
Table 2. Comparison of literature on charging path planning

Reference | Optimization objective | Solution method | Features
[21] | Minimize travel time under energy constraints | Modified Bellman-Ford algorithm | Introduces virtual nodes to represent charging actions
[22] | Minimize travel time under energy constraints | Label propagation algorithm | Computes the minimum charge amount to reduce charging time cost
[23], [24] | Minimize travel time under energy constraints | Label propagation algorithm | Proposes bidirectional label propagation to speed up the search
[25] | Minimize travel time | Q-learning | Assigns a negative reward to battery-depletion states to avoid running out of energy
[26] | Minimize travel time under energy constraints | Label propagation algorithm | Adjusts charging durations at stations to shorten travel time
[27] | Minimize travel distance, travel time, and charging cost | Adaptive Dijkstra algorithm | Uses a time-of-day dynamic road network model and a "time-flow" link impedance model
[28] | Minimize charging waiting time and maximize charging station utilization | Modified Dijkstra algorithm | Considers the impact of charging by many EVs on the road network, users, and charging stations
[29] | Minimize charging waiting time, fee, and voltage deviation | Modified A* algorithm | Considers grid load and the interests of charging pile operators
[30] | Minimize travel distance | Policy gradient algorithm | Targets delivering goods to multiple customers and finally returning to the depot
[31] | Minimize the probability of battery depletion | Safe reinforcement learning | Obtains the optimal policy by maximizing expected return subject to safe battery-energy constraints
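The path-planning methods in Table 2 share a core subproblem: finding a fastest route whose energy profile never depletes the battery, with regenerative braking modeled as negative edge energy that cannot push the state of charge above capacity. The sketch below illustrates that idea with a simple label-propagation search over (time, state-of-charge) labels; the graph format and dominance rule are assumptions, not a specific cited algorithm.

```python
"""Sketch: fastest energy-feasible path with recuperation and label dominance."""

from collections import defaultdict, deque


def feasible_fastest_path(edges, source, target, capacity, start_soc):
    """edges: list of (u, v, travel_time, energy_used); energy_used < 0 means recuperation."""
    graph = defaultdict(list)
    for u, v, t, e in edges:
        graph[u].append((v, t, e))

    labels = defaultdict(list)            # node -> non-dominated (time, soc) labels
    labels[source] = [(0.0, start_soc)]
    queue = deque([source])

    while queue:
        u = queue.popleft()
        for v, t, e in graph[u]:
            for time_u, soc_u in list(labels[u]):
                soc_v = min(soc_u - e, capacity)      # clamp recuperated energy
                if soc_v < 0:                          # battery would be depleted
                    continue
                new = (time_u + t, soc_v)
                # discard if dominated by an existing label at v
                if any(t2 <= new[0] and s2 >= new[1] for t2, s2 in labels[v]):
                    continue
                # otherwise keep it and drop labels it dominates
                labels[v] = [(t2, s2) for t2, s2 in labels[v]
                             if not (new[0] <= t2 and new[1] >= s2)] + [new]
                queue.append(v)

    return min((t for t, _ in labels[target]), default=None)


if __name__ == "__main__":
    # (u, v, minutes, kWh); the downhill edge B->C recovers 1 kWh
    net = [("A", "B", 10, 5), ("B", "C", 8, -1), ("A", "C", 12, 9)]
    # The direct edge A->C is infeasible with 8 kWh on board, so the detour wins.
    print(feasible_fastest_path(net, "A", "C", capacity=40, start_soc=8))  # 18
```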
Table 3. Comparison of literature on coordinated order dispatching and charging strategies

Reference | Optimization objective | Solution method | Features
[33] | Minimize passenger waiting time, trip energy consumption, and vehicle operating cost | V-value estimation algorithm | Matches vehicles to actions with the KM algorithm
[34] | Optimal fleet size | Approximate dynamic programming | Exploits the monotonicity of V values to accelerate convergence
[35] | Minimize cost and maximize revenue | Teacher-student reinforcement learning | Provides faster decisions and better interpretability
[36] | Maximize fleet revenue | Deep Q-learning | Makes a decision whenever a new order arrives
[37] | Maximize fleet revenue | Asynchronous learning algorithm | Uses queuing theory to compute passenger waiting time
[38] | Maximize fleet revenue | Deep Q-learning | Includes repositioning in the action space
[40] | Maximize the number of served requests and minimize cost | Receding horizon control | Charges proactively to prepare for peak-hour passenger flow
[41] | Maximize long-term fleet revenue | Robust optimization | Can compute total revenue in the worst case
[42] | Minimize the number of orders lost to charging and maximize the probability of receiving orders after charging | Improved Pareto-optimal algorithm | Considers the spatial distribution of orders to keep vehicles in service as continuously as possible
[43] | Minimize the service supply-demand gap and the number of vehicles with depleted energy | Heuristic based on time to battery depletion | Makes macro-level charging plans first, then derives short-horizon policies from them
[44] | Maximize the number of accepted orders and minimize waiting time | Ant colony algorithm | Uses pheromones to relate order matching to travel time, charging time, and other factors
[45] | Minimize user waiting time | Ant colony algorithm | Considers order cancellations and newly arriving orders
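Several dispatch-charging studies in Table 3 (e.g., [33]) reduce each decision step to a bipartite matching between idle vehicles and candidate actions: serve a ride order or go charge. The sketch below shows that pattern with SciPy's Hungarian-algorithm solver; the utility terms and weights are illustrative assumptions, not the reward design of any cited paper.

```python
"""Sketch: match idle EVs to orders or charging slots by maximizing utility."""

import numpy as np
from scipy.optimize import linear_sum_assignment


def match_vehicles(vehicles, actions):
    """vehicles: dicts with 'soc' in [0, 1]; actions: ride orders or charging slots."""
    utility = np.zeros((len(vehicles), len(actions)))
    for i, veh in enumerate(vehicles):
        for j, act in enumerate(actions):
            if act["type"] == "order":
                # order revenue minus a penalty that grows as the battery empties
                utility[i, j] = act["revenue"] - act["energy_kwh"] * (1.0 - veh["soc"]) * 10
            else:
                # charging is more attractive for low-soc vehicles (assumed weight)
                utility[i, j] = (1.0 - veh["soc"]) * 8
    rows, cols = linear_sum_assignment(-utility)   # negate to maximize total utility
    return list(zip(rows, cols))


if __name__ == "__main__":
    evs = [{"soc": 0.9}, {"soc": 0.2}]
    acts = [{"type": "order", "revenue": 25, "energy_kwh": 6}, {"type": "charge"}]
    for i, j in match_vehicles(evs, acts):
        print(f"vehicle {i} -> {acts[j]['type']}")   # high-soc EV serves, low-soc EV charges
```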
Table 4. Comparison of literature on coordinated repositioning and charging strategies

Reference | Optimization objective | Solution method | Features
[46] | Minimize the zonal supply-demand gap and repositioning distance, and maximize remaining energy | Linear programming | Solved by variable elimination and the unimodularity of the constraint matrix
[47] | Maximize repositioning gain and fleet energy state, and minimize time cost | Dynamic programming | Treats the waiting time of unserved orders as the repositioning gain, steering vehicles toward high-waiting-time zones
[48] | Minimize repositioning and charging costs and waiting time | Greedy algorithm | Includes battery requirements in repositioning targets and allows charging en route
[49] | Maximize fleet revenue | V-value estimation algorithm | Limits the number of repositioned vehicles to prevent oscillation
[50] | Maximize service quality | Deep Q-learning | Considers traffic state uncertainty
[51] | Minimize repositioning and charging costs | Heuristic based on a probability function | Selects repositioning zones via a probability function of zonal demand, distance, and the number of vehicles already repositioned
[52] | Minimize repositioning and charging costs | Robust optimization | Considers uncertainty in zonal passenger demand and vehicle supply
[53] | Minimize repositioning cost and maximize fairness | Reinforcement learning with transition kernels | Describes state-transition uncertainty in the Markov decision process through transition kernels
[54] | Maximize fleet revenue and fairness | Game theory | Measures fleet fairness by the variance of fleet profits
[55] | Maximize individual driver profit | Game theory | Uses mean-field vectors to capture the states of surrounding agents
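The repositioning-charging strategies in Table 4 generally move idle vehicles from oversupplied zones toward zones with unmet demand while diverting low-battery vehicles to charge. A deliberately simple greedy sketch of that idea follows; the zone data, the state-of-charge threshold, and the "most undersupplied zone first" rule are assumptions rather than any cited method.

```python
"""Sketch: greedy repositioning of idle EVs with a charging diversion rule."""


def plan_repositioning(zones, idle_vehicles, soc_threshold=0.3):
    """zones: {zone_id: {'demand': int, 'supply': int}};
    idle_vehicles: list of {'id', 'zone', 'soc'}."""
    moves = []
    deficit = {z: v["demand"] - v["supply"]
               for z, v in zones.items() if v["demand"] > v["supply"]}
    surplus = {z: v["supply"] - v["demand"]
               for z, v in zones.items() if v["supply"] > v["demand"]}
    for veh in sorted(idle_vehicles, key=lambda v: -v["soc"]):
        if veh["soc"] < soc_threshold:
            moves.append((veh["id"], "charge"))            # low battery: go charge
            continue
        if surplus.get(veh["zone"], 0) <= 0 or not deficit:
            continue                                        # its own zone still needs it
        target = max(deficit, key=deficit.get)              # most undersupplied zone first
        moves.append((veh["id"], f"relocate to {target}"))
        surplus[veh["zone"]] -= 1
        deficit[target] -= 1
        if deficit[target] == 0:
            del deficit[target]
    return moves


if __name__ == "__main__":
    zones = {"Z1": {"demand": 1, "supply": 3}, "Z2": {"demand": 4, "supply": 1}}
    evs = [{"id": "EV1", "zone": "Z1", "soc": 0.8},
           {"id": "EV2", "zone": "Z1", "soc": 0.2},
           {"id": "EV3", "zone": "Z1", "soc": 0.6}]
    print(plan_repositioning(zones, evs))
```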
Table 5. Comparison of literature on shared electric vehicles

Reference | Optimization objective | Solution method | Features
[56] | Minimize the maximum response time and total operating time | Optimization via Lagrangian analysis and KKT conditions | Guarantees service quality and system supply-demand balance
[57] | Maximize rental revenue | Dynamic deadline algorithm | Responds to user demand faster during peak hours
[58], [59] | Maximize satisfied rental demand | PPO algorithm | Encourages users to perform repositioning themselves
[60] | Maximize rental revenue | Mixed-integer quadratic programming | Considers fluctuations in electricity prices and rental order prices
[61] | Maximize rental revenue | Genetic algorithm | Considers road congestion and the effect of pricing on demand
[62] | Minimize dispatching and inventory costs | Simulated annealing | Considers the inventory cost of vehicle backlogs
[63] | Maximize rental revenue | Pruning and relaxation algorithm | Targets vehicle relays on long trips
[64] | Maximize the number of served passengers | Linear programming | Targets rental under battery-swapping mode
[65] | Minimize service-vehicle and rental-vehicle costs | Hybrid genetic search | Jointly plans the routes of service vehicles and rental vehicles
[66] | Minimize time cost | AC algorithm | Generates route policies with a sequence-to-sequence model
[67] | Minimize the energy consumption of mobile charging vehicles | Adaptive large neighborhood search | Targets charging by mobile charging vehicles
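For the car-rental scenarios in Table 5, the recurring requirement is that each service station holds enough sufficiently charged vehicles before the next demand peak. The sketch below expresses that availability check as a simple charging plan; the rentable state-of-charge threshold, charging rate, and demand forecast are assumed values, not parameters from a cited study.

```python
"""Sketch: decide which station vehicles to plug in to cover the next demand peak."""


def vehicles_to_charge(station_vehicles, forecast_pickups, min_rentable_soc=0.8,
                       charge_rate_per_h=0.5, hours_to_peak=2.0):
    """Return ids that should start charging now so the station can cover the peak."""
    ready = [v for v in station_vehicles if v["soc"] >= min_rentable_soc]
    shortfall = max(0, forecast_pickups - len(ready))
    # among not-yet-ready vehicles, prefer those that can finish charging in time
    chargeable = sorted(
        (v for v in station_vehicles if v["soc"] < min_rentable_soc),
        key=lambda v: v["soc"], reverse=True)
    plan = []
    for v in chargeable:
        if len(plan) == shortfall:
            break
        if v["soc"] + charge_rate_per_h * hours_to_peak >= min_rentable_soc:
            plan.append(v["id"])
    return plan


if __name__ == "__main__":
    fleet = [{"id": "R1", "soc": 0.9}, {"id": "R2", "soc": 0.5}, {"id": "R3", "soc": 0.1}]
    print(vehicles_to_charge(fleet, forecast_pickups=2))  # ['R2']
```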
Table 6. Comparison of literature on behavior strategies under V2G

Reference | Optimization objective | Solution method | Features
[68] | Minimize fleet charging cost, operating cost, and charging station investment cost | Benders and scenario decomposition | Considers the overall cost from charging station investment to daily charging operations
[69] | Minimize charging cost and carbon emissions | Benders decomposition | Uses a membership-degree model to obtain a compromise solution
[70] | Minimize electricity cost and pollution index | Improved particle swarm optimization | Considers charging, generation, and demand-response costs
[71] | Maximize charging/discharging cost effectiveness | Deep Q-learning | Considers charged/discharged energy and battery aging
[72] | Minimize EV charging cost | CPO algorithm | Greatly reduces the violation rate of charging schedules
[73] | Maximize EV charging benefit | Q-learning | Considers trip schedules and time-varying electricity prices
[74] | Maximize EV cost effectiveness, and minimize power loss and load variation | Game theory | Considers a dynamic real-time pricing model, battery degradation cost, and grid load stability
[75] | Minimize EV charging cost | RDDPG algorithm | Considers equity among charging stations
[76] | Maximize EV revenue and the utilization efficiency of the grid and solar energy | Branch and bound | Targets charging stations with renewable resources
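Many of the V2G strategies in Table 6 can be cast as an optimization of hourly charge/discharge power against a time-of-use tariff under battery constraints. The sketch below formulates a simplified linear program with scipy.optimize.linprog, assuming hourly steps, unit charging/discharging efficiency, and no battery-degradation cost; it is a toy formulation, not a reconstruction of any cited model.

```python
"""Sketch: cost-minimal V2G charge/discharge schedule under a time-of-use tariff."""

import numpy as np
from scipy.optimize import linprog


def v2g_schedule(prices, soc0, capacity, target_soc, p_max=7.0):
    """Hourly charge/discharge powers (kW) minimizing electricity cost."""
    T = len(prices)
    prices = np.asarray(prices, dtype=float)
    # decision variables: x = [charge_1..charge_T, discharge_1..discharge_T]
    c = np.concatenate([prices, -prices])       # pay when charging, earn when discharging
    lower = np.tril(np.ones((T, T)))            # cumulative-sum operator
    A_ub = np.block([[lower, -lower],           # state of charge never exceeds capacity
                     [-lower, lower]])          # state of charge never drops below zero
    b_ub = np.concatenate([np.full(T, capacity - soc0), np.full(T, soc0)])
    # end-of-horizon requirement: final state of charge >= target_soc
    A_ub = np.vstack([A_ub, np.concatenate([-np.ones(T), np.ones(T)])])
    b_ub = np.append(b_ub, soc0 - target_soc)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, p_max)] * 2 * T)
    charge, discharge = res.x[:T], res.x[T:]
    return charge - discharge                   # net grid power per hour


if __name__ == "__main__":
    tou = [0.3, 0.3, 1.2, 1.2, 0.4, 0.4]        # toy time-of-use tariff, currency/kWh
    # Charging concentrates in the cheap hours while still reaching the target soc.
    print(np.round(v2g_schedule(tou, soc0=20, capacity=60, target_soc=50), 1))
```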
[1] ZHAO Xuan, LI Mei-ying, YU Qiang, et al. State estimation of power lithium batteries for electric vehicles: a review[J]. China Journal of Highway and Transport, 2023, 36(6): 254-283. (in Chinese)
[2] GUO Jian-feng, ZHANG Xue-mei, CAO Qi, et al. Electric vehicles contribute to China's energy security and carbon peaking and carbon neutrality[J]. Bulletin of Chinese Academy of Sciences, 2024, 39(2): 397-407. (in Chinese)
[3] National Energy Administration. 2022 China electric vehicle user charging behavior white paper[R]. Beijing: National Energy Administration, 2023. (in Chinese)
[4] China Academy of Urban Planning and Design. China major cities charging infrastructure monitoring report 2022[R]. Beijing: China Academy of Urban Planning and Design, 2022. (in Chinese)
[5] ZHANG Cong, LIU Yuan-an, WU Fan, et al. Effective charging planning based on deep reinforcement learning for electric vehicles[J]. IEEE Transactions on Intelligent Transportation Systems, 2021, 22(1): 542-554. doi: 10.1109/TITS.2020.3002271
[6] AN Yi-sheng, GAO Yu-xin, WU Nai-qi, et al. Optimal scheduling of electric vehicle charging operations considering real-time traffic condition and travel distance[J]. Expert Systems with Applications, 2023, 213: 118941. doi: 10.1016/j.eswa.2022.118941
[7] WANG Guang, CHEN Yue-fei, WANG Shuai, et al. For E-Taxi: data-driven fleet-oriented charging resource allocation in large-scale electric taxi networks[J]. ACM Transactions on Sensor Networks, 2023, 19(3): 63.
[8] LIU Wei-li, GONG Yue-jiao, CHEN Wei-neng, et al. Coordinated charging scheduling of electric vehicles: a mixed-variable differential evolution approach[J]. IEEE Transactions on Intelligent Transportation Systems, 2020, 21(12): 5094-5109. doi: 10.1109/TITS.2019.2948596
[9] MA Tai-yu, XIE Si-min. Optimal fast charging station locations for electric ridesharing with vehicle-charging station assignment[J]. Transportation Research Part D: Transport and Environment, 2021, 90: 102682. doi: 10.1016/j.trd.2020.102682
[10] DONG Zheng, LIU Cong, LI Yan-hua, et al. REC: predictable charging scheduling for electric taxi fleets[C]//IEEE. 2017 IEEE Real-Time Systems Symposium. New York: IEEE, 2017: 287-296.
[11] JAMSHIDI H, CORREIA G H A, VAN ESSEN J T, et al. Dynamic planning for simultaneous recharging and relocation of shared electric taxies: a sequential MILP approach[J]. Transportation Research Part C: Emerging Technologies, 2021, 125: 102933. doi: 10.1016/j.trc.2020.102933
[12] ZHANG Wei-jia, LIU Hao, WANG Fan, et al. Intelligent electric vehicle charging recommendation based on multi-agent reinforcement learning[C]//ACM. Proceedings of the World Wide Web Conference 2021. New York: ACM, 2021: 1856-1867.
[13] LI Cheng-yin, DONG Zheng, FISHER N, et al. Coupling user preference with external rewards to enable driver-centered and resource-aware EV charging recommendation[C]//Springer. Machine Learning and Knowledge Discovery in Databases 2022. Berlin: Springer, 2022: 3-19.
[14] ZHANG Wei-jia, LIU Hao, XIONG Hui, et al. RLCharge: imitative multi-agent spatiotemporal reinforcement learning for electric vehicle charging station recommendation[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(6): 6290-6304. doi: 10.1109/TKDE.2022.3178819
[15] XING Qiang, XU Yan, CHEN Zhong, et al. A graph reinforcement learning-based decision-making platform for real-time charging navigation of urban electric vehicles[J]. IEEE Transactions on Industrial Informatics, 2023, 19(3): 3284-3295. doi: 10.1109/TII.2022.3210264
[16] CAO Yong-sheng, WANG Hao, LI De-min, et al. Smart online charging algorithm for electric vehicles via customized actor-critic learning[J]. IEEE Internet of Things Journal, 2022, 9(1): 684-694. doi: 10.1109/JIOT.2021.3084923
[17] YI Zong-gen, SHIRK M. Data-driven optimal charging decision making for connected and automated electric vehicles: a personal usage scenario[J]. Transportation Research Part C: Emerging Technologies, 2018, 86: 37-58. doi: 10.1016/j.trc.2017.10.014
[18] GAO J, WONG T, WANG C. Social welfare maximizing fleet charging scheduling through voting-based negotiation[J]. Transportation Research Part C: Emerging Technologies, 2021, 130: 103304. doi: 10.1016/j.trc.2021.103304
[19] ZHU Ming, LIU Xiao-yang, WANG Xiao-dong. Joint transportation and charging scheduling in public vehicle systems—a game theoretic approach[J]. IEEE Transactions on Intelligent Transportation Systems, 2018, 19(8): 2407-2419. doi: 10.1109/TITS.2018.2817484
[20] WANG Z F, JOCHEM P, FICHTNER W. A scenario-based stochastic optimization model for charging scheduling of electric vehicles under uncertainties of vehicle availability and charging demand[J]. Journal of Cleaner Production, 2020, 254: 119886. doi: 10.1016/j.jclepro.2019.119886
[21] MORLOCK F, ROLLE B, BAUER M, et al. Time optimal routing of electric vehicles under consideration of available charging infrastructure and a detailed consumption model[J]. IEEE Transactions on Intelligent Transportation Systems, 2020, 21(12): 5123-5135. doi: 10.1109/TITS.2019.2949053
[22] BAUM M, DIBBELT J, GEMSA A, et al. Shortest feasible paths with charging stops for battery electric vehicles[J]. Transportation Science, 2019, 53(6): 1627-1655. doi: 10.1287/trsc.2018.0889
[23] SCHOENBERG S, DRESSLER F. Planning ahead for EV: total travel time optimization for electric vehicles[C]//IEEE. 2019 IEEE Intelligent Transportation Systems Conference. New York: IEEE, 2019: 3068-3075.
[24] SCHOENBERG S, DRESSLER F. Reducing waiting times at charging stations with adaptive electric vehicle route planning[J]. IEEE Transactions on Intelligent Vehicles, 2023, 8(1): 95-107. doi: 10.1109/TIV.2022.3140894
[25] DOROKHOVA M, BALLIF C, WYRSCH N. Routing of electric vehicles with intermediary charging stations: a reinforcement learning approach[J]. Frontiers in Big Data, 2021, 4: 586481. doi: 10.3389/fdata.2021.586481
[26] FROGER A, MENDOZA J E, JABALI O, et al. Improved formulations and algorithmic components for the electric vehicle routing problem with nonlinear charging functions[J]. Computers and Operations Research, 2019, 104: 256-294. doi: 10.1016/j.cor.2018.12.013
[27] XING Qiang, CHEN Zhong, LENG Zhao-ying, et al. Route planning and charging navigation strategy for electric vehicles based on real-time traffic information[J]. Proceedings of the CSEE, 2020, 40(2): 534-549. (in Chinese)
[28] ZHANG Shu-wei, FENG Gui-xuan, FAN Yue-zhen, et al. Large-scale electric vehicle charging path planning based on information interaction[J]. Journal of Tsinghua University (Science and Technology), 2018, 58(3): 279-285. (in Chinese)
[29] LIU Dong-qi, XIE Jin-huan, WANG Yao-nan. Electric vehicle pre-charging path planning with multi-agent participation in the internet of vehicles[J]. Control Theory and Applications, 2024, 41(8): 1438-1450. (in Chinese)
[30] LIN Bo, GHADDAR B, NATHWANI J. Deep reinforcement learning for the electric vehicle routing problem with time windows[J]. IEEE Transactions on Intelligent Transportation Systems, 2022, 23(8): 11528-11538. doi: 10.1109/TITS.2021.3105232
[31] BASSO R, KULCSÁR B, SÁNCHEZ-DÍAZ I, et al. Dynamic stochastic electric vehicle routing with safe reinforcement learning[J]. Transportation Research Part E: Logistics and Transportation Review, 2022, 157: 102496. doi: 10.1016/j.tre.2021.102496
[32] BASSO R, KULCSÁR B, EGARDT B, et al. Energy consumption estimation integrated into the electric vehicle routing problem[J]. Transportation Research Part D: Transport and Environment, 2019, 69: 141-167. doi: 10.1016/j.trd.2019.01.006
[33] SHI Jie, GAO Yuan-qi, WANG Wei, et al. Operating electric vehicle fleet for ride-hailing services with reinforcement learning[J]. IEEE Transactions on Intelligent Transportation Systems, 2020, 21(11): 4822-4834. doi: 10.1109/TITS.2019.2947408
[34] AL-KANJ L, NASCIMENTO J, POWELL W B. Approximate dynamic programming for planning a ride-hailing system using autonomous fleets of electric vehicles[J]. European Journal of Operational Research, 2020, 284(3): 1088-1106. doi: 10.1016/j.ejor.2020.01.033
[35] TANG Xin-di, LI Meng, LIN Xi, et al. Online operations of automated electric taxi fleets: an advisor-student reinforcement learning framework[J]. Transportation Research Part C: Emerging Technologies, 2020, 121: 102844. doi: 10.1016/j.trc.2020.102844
[36] KULLMAN N D, COUSINEAU M, GOODSON J C, et al. Dynamic ride-hailing with electric vehicles[J]. Transportation Science, 2022, 56(3): 775-794. doi: 10.1287/trsc.2021.1042
[37] YU Guo-dong, LIU Ai-jun, ZHANG Jiang-hua, et al. Optimal operations planning of electric autonomous vehicles via asynchronous learning in ride-hailing systems[J]. Omega, 2021, 103: 102448. doi: 10.1016/j.omega.2021.102448
[38] WANG Ning, GUO Jia-hui. Modeling and optimization of multiaction dynamic dispatching problem for shared autonomous electric vehicles[J]. Journal of Advanced Transportation, 2021, 2021: 1368286.
[39] TURAN B, PEDARSANI R, ALIZADEH M. Dynamic pricing and fleet management for electric autonomous mobility on demand systems[J]. Transportation Research Part C: Emerging Technologies, 2020, 121: 102829. doi: 10.1016/j.trc.2020.102829
[40] YUAN Yu-kun, ZHANG De-sheng, MIAO Fei, et al. p2Charging: proactive partial charging for electric taxi systems[C]//IEEE. 2019 IEEE 39th International Conference on Distributed Computing Systems. New York: IEEE, 2019: 688-699.
[41] FAN Gui-yun, JIN Hai-ming, ZHAO Yi-ran, et al. Joint order dispatch and charging for electric self-driving taxi systems[C]//IEEE. IEEE INFOCOM 2022—IEEE Conference on Computer Communications. New York: IEEE, 2022: 1619-1628.
[42] YAN Li, SHEN Hai-ying, KANG Liu-wang, et al. CD-guide: a dispatching and charging approach for electric taxicabs[J]. IEEE Internet of Things Journal, 2022, 9(23): 23302-23319. doi: 10.1109/JIOT.2022.3195785
[43] ZALESAK M, SAMARANAYAKE S. Real time operation of high-capacity electric vehicle ridesharing fleets[J]. Transportation Research Part C: Emerging Technologies, 2021, 133: 103413. doi: 10.1016/j.trc.2021.103413
[44] LIANG Di, ZHAN Zhi-Hui, ZHANG Yan-chun, et al. An efficient ant colony system approach for new energy vehicle dispatch problem[J]. IEEE Transactions on Intelligent Transportation Systems, 2020, 21(11): 4784-4797. doi: 10.1109/TITS.2019.2946711
[45] SHI Lin, ZHAN Zhi-Hui, LIANG Di, et al. Memory-based ant colony system approach for multi-source data associated dynamic electric vehicle dispatch optimization[J]. IEEE Transactions on Intelligent Transportation Systems, 2022, 23(10): 17491-17505. doi: 10.1109/TITS.2022.3150471
[46] DEAN M D, GURUMURTHY K M, DE SOUZA F, et al. Synergies between repositioning and charging strategies for shared autonomous electric vehicle fleets[J]. Transportation Research Part D: Transport and Environment, 2022, 108: 103314. doi: 10.1016/j.trd.2022.103314
[47] YI Z G, SMART J. A framework for integrated dispatching and charging management of an autonomous electric vehicle ride-hailing fleet[J]. Transportation Research Part D: Transport and Environment, 2021, 95: 102822. doi: 10.1016/j.trd.2021.102822
[48] PANTELIDIS T P, LI L, MA T Y, et al. A node-charge graph-based online carshare rebalancing policy with capacitated electric charging[J]. Transportation Science, 2022, 56(3): 654-676. doi: 10.1287/trsc.2021.1058
[49] LIANG Yan-chang, DING Zhao-hao, DING Tao, et al. Mobility-aware charging scheduling for shared on-demand electric vehicle fleet using deep reinforcement learning[J]. IEEE Transactions on Smart Grid, 2021, 12(2): 1380-1393. doi: 10.1109/TSG.2020.3025082
[50] SILVA P, HAN Y J, KIM Y C, et al. Ride-hailing service aware electric taxi fleet management using reinforcement learning[C]//IEEE. 2022 Thirteenth International Conference on Ubiquitous and Future Networks. New York: IEEE, 2022: 427-432.
[51] KIM S, LEE U, LEE I, et al. Idle vehicle relocation strategy through deep learning for shared autonomous electric vehicle system optimization[J]. Journal of Cleaner Production, 2022, 333: 130055. doi: 10.1016/j.jclepro.2021.130055
[52] HE S H, PEPIN L, WANG G, et al. Data-driven distributionally robust electric vehicle balancing for mobility-on-demand systems under demand and supply uncertainties[C]//IEEE. 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems. New York: IEEE, 2020: 2165-2172.
[53] HE Si-hong, WANG Yue, HAN Shuo, et al. A robust and constrained multi-agent reinforcement learning framework for electric vehicle AMoD systems[C]//IEEE. 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems. New York: IEEE, 2023: 5637-5644.
[54] WANG Guang, ZHONG Shu-xin, WANG Shuai, et al. Data-driven fairness-aware vehicle displacement for large-scale electric taxi fleets[C]//IEEE. 2021 IEEE 37th International Conference on Data Engineering. New York: IEEE, 2021: 1200-1211.
[55] WANG En-shu, DING Rong, YANG Zhao-xing, et al. Joint charging and relocation recommendation for E-taxi drivers via multi-agent mean field hierarchical reinforcement learning[J]. IEEE Transactions on Mobile Computing, 2022, 21(4): 1274-1290. doi: 10.1109/TMC.2020.3022173
[56] GUO Ge, SUN Tian-yu. Selective multi-grade charging scheduling and rebalancing for one-way car-sharing systems[J]. IEEE Transactions on Intelligent Transportation Systems, 2023, 24(4): 4391-4402. doi: 10.1109/TITS.2022.3229383
[57] WANG Guang, QIN Zhou, WANG Shuai, et al. Towards accessible shared autonomous electric mobility with dynamic deadlines[J]. IEEE Transactions on Mobile Computing, 2024, 23(1): 925-94. doi: 10.1109/TMC.2022.3213125
[58] LUO Man, ZHANG Wen-zhe, SONG Tian-you, et al. Rebalancing expanding EV sharing systems with deep reinforcement learning[C]//ACM. 29th International Joint Conference on Artificial Intelligence. New York: ACM, 2020: 1338-1344.
[59] LUO Man, DU Bo-wen, ZHANG Wen-zhe, et al. Fleet rebalancing for expanding shared e-mobility systems: a multi-agent deep reinforcement learning approach[J]. IEEE Transactions on Intelligent Transportation Systems, 2023, 24(4): 3868-3881. doi: 10.1109/TITS.2022.3233422
[60] XIE Rui, WEI Wei, WU Qiu-wei, et al. Optimal service pricing and charging scheduling of an electric vehicle sharing system[J]. IEEE Transactions on Vehicular Technology, 2020, 69(1): 78-89. doi: 10.1109/TVT.2019.2950402
[61] MA Shu-yu, HU Lu, WU Jia-yuan, et al. Fleet size and parking capacity optimization of electric carsharing system[J]. Journal of Transportation Engineering and Information, 2022, 20(3): 31-42. (in Chinese)
[62] GAO Jun-jie, CUI Xiao-min, ZHAO Peng, et al. Scheduling method for one-way electric car-sharing based on demand forecasting[J]. Journal of Dalian University of Technology, 2019, 59(6): 648-655. (in Chinese)
[63] ZHANG Dong, LIU Yang, HE Shuang-chi. Vehicle assignment and relays for one-way electric car-sharing systems[J]. Transportation Research Part B: Methodological, 2019, 120: 125-146. doi: 10.1016/j.trb.2018.12.004
[64] RIGAS E S, RAMCHURN S D, BASSILIADES N. Algorithms for electric vehicle scheduling in mobility-on-demand schemes[C]//IEEE. 2015 IEEE 18th International Conference on Intelligent Transportation Systems. New York: IEEE, 2015: 1339-1344.
[65] FOLKESTAD C A, HANSEN N, FAGERHOLT K, et al. Optimal charging and repositioning of electric vehicles in a free-floating carsharing system[J]. Computers and Operations Research, 2020, 113: 104771. doi: 10.1016/j.cor.2019.104771
[66] BOGYRBAYEVA A, JANG S, SHAH A, et al. A reinforcement learning approach for rebalancing electric vehicle sharing systems[J]. IEEE Transactions on Intelligent Transportation Systems, 2022, 23(7): 8704-8714. doi: 10.1109/TITS.2021.3085217
[67] CUI Shao-hua, MA Xiao-lei, ZHANG Ming-heng, et al. The parallel mobile charging service for free-floating shared electric vehicle clusters[J]. Transportation Research Part E: Logistics and Transportation Review, 2022, 160: 102652. doi: 10.1016/j.tre.2022.102652
[68] ZHANG Yi-ling, LU Meng-shi, SHEN Si-qian. On the values of vehicle-to-grid electricity selling in electric vehicle sharing[J]. Manufacturing and Service Operations Management, 2021, 23(2): 488-507.
[69] ZAKARIAZADEH A, JADID S, SIANO P. Multi-objective scheduling of electric vehicles in smart distribution system[J]. Energy Conversion and Management, 2014, 79: 43-53. doi: 10.1016/j.enconman.2013.11.042
[70] YIN W J, MAVALURU D, AHMED M, et al. Application of new multi-objective optimization algorithm for EV scheduling in smart grid through the uncertainties[J]. Journal of Ambient Intelligence and Humanized Computing, 2020, 11(5): 2071-2103. doi: 10.1007/s12652-019-01233-1
[71] WAN Zhi-qiang, LI He-peng, HE Hai-bo, et al. Model-free real-time EV charging scheduling based on deep reinforcement learning[J]. IEEE Transactions on Smart Grid, 2019, 10(5): 5246-5257. doi: 10.1109/TSG.2018.2879572
[72] LI He-peng, WAN Zhi-qiang, HE Hai-bo. Constrained EV charging scheduling based on safe deep reinforcement learning[J]. IEEE Transactions on Smart Grid, 2020, 11(3): 2427-2439. doi: 10.1109/TSG.2019.2955437
[73] DANG Qi-yun, WU Di, BOULET B. A Q-learning based charging scheduling scheme for electric vehicles[C]//IEEE. 2019 IEEE Transportation Electrification Conference and Expo. New York: IEEE, 2019: 8790603.
[74] LATIFI M, RASTEGARNIA A, KHALILI A, et al. Agent-based decentralized optimal charging strategy for plug-in electric vehicles[J]. IEEE Transactions on Industrial Electronics, 2019, 66(5): 3668-3680. doi: 10.1109/TIE.2018.2853609
[75] LI Hang, LI Guo-jie, LIE T T, et al. Constrained large-scale real-time EV scheduling based on recurrent deep reinforcement learning[J]. International Journal of Electrical Power and Energy Systems, 2023, 144: 108603. doi: 10.1016/j.ijepes.2022.108603
[76] YUAN Yu-kun, ZHAO Yue, LIN Shan. SAC: solar-aware E-taxi fleet charging coordination under dynamic passenger mobility[C]//IEEE. Proceedings of the IEEE Conference on Decision and Control. New York: IEEE, 2021: 2071-2078.
[77] KOUFAKIS A M, RIGAS E S, BASSILIADES N, et al. Offline and online electric vehicle charging scheduling with V2V energy transfer[J]. IEEE Transactions on Intelligent Transportation Systems, 2020, 21(5): 2128-2138. doi: 10.1109/TITS.2019.2914087
[78] AYAD A, EL-TAWEEL N A, FARAG H E Z. Optimal design of battery swapping-based electrified public bus transit systems[J]. IEEE Transactions on Transportation Electrification, 2021, 7(4): 2390-2401. doi: 10.1109/TTE.2021.3083106
[79] KONER R, LI H, HILDEBRANDT M, et al. Graphhopper: multi-hop scene graph reasoning for visual question answering[C]//Springer. 20th International Semantic Web Conference. Berlin: Springer, 2021: 111-127.
[80] WANG Lu-ting, CHEN Bo. Model-based analysis of V2G impact on battery degradation[C]//SAE. 2017 SAE World Congress Experience. Warrendale: SAE, 2017: 1699.
[81] YAN Li, SHEN Hai-ying, LI Zhuo-zhao, et al. Employing opportunistic charging for electric taxicabs to reduce idle time[J]. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2018, 2(1): 47.
[82] KUSARI A, LI Pei, YANG Han-zhi, et al. Enhancing sumo simulator for simulation based testing and validation of autonomous vehicles[C]//IEEE. 2022 IEEE Intelligent Vehicles Symposium. New York: IEEE, 2022: 829-835.
[83] MA T Y, RASULKHANI S, CHOW J Y J, et al. A dynamic ridesharing dispatch and idle vehicle repositioning strategy with integrated transit transfers[J]. Transportation Research Part E: Logistics and Transportation Review, 2019, 128: 417-442. doi: 10.1016/j.tre.2019.07.002
