The Effect of the Initial Battery State of Charge on the Performance of a Soft Actor-Critic Energy Management System
2025-24-0104
To be published on 09/07/2025
- Content
- Reinforcement Learning (RL) approaches have gained significant popularity for solving complex optimization problems, such as the design of Energy Management Systems (EMS) for electrified powertrains, thanks to their adaptability and model-free nature. This study therefore investigates, through numerical simulations, the performance of an EMS driven by a Soft Actor-Critic (SAC) RL agent targeting the reduction of CO2 emissions in a Plug-in Hybrid Electric Vehicle (pHEV). The SAC agent stands out for its stochastic policy, which accelerates the training process, boosts controller performance, and effectively handles the inherent uncertainties of vehicle dynamics. To improve generalization, the proposed controller was trained and validated across a wide set of driving scenarios. The robustness of the SAC policy was further demonstrated in charge-depleting mode across various initial and final battery energy levels. Performance was then benchmarked against Dynamic Programming (DP), which delivers the global optimum for this control problem. Simulation results reveal that the SAC agent achieves near-optimal results relative to the DP benchmark across most mission profiles. These results underscore the SAC agent's multi-objective optimization capability, effectively balancing fuel efficiency against battery energy constraints.
- Citation
- Tresca, L., Pulvirenti, L., and Rolando, L., "The Effect of the Initial Battery State of Charge on the Performance of a Soft Actor-Critic Energy Management System," SAE Technical Paper 2025-24-0104, 2025.