Battery Thermal Management Systems (BTMS) play a critical role in ensuring the longevity, safety, and efficient operation of lithium-ion battery packs. These systems dissipate the heat generated by the cells during vehicle operation, maintaining a uniform temperature distribution across the battery modules, preventing overheating, and mitigating the risk of thermal runaway.
However, one of the primary challenges in BTMS design lies in achieving effective thermal contact between the battery cells and the cooling plate. Non-uniform or excessive application of Thermal Interface Materials (TIMs) increases interfacial thermal resistance and produces significant temperature variations across the battery modules. These variations can trigger power limitations via the Battery Management System (BMS) and cause inefficient cooling, ultimately degrading battery performance and lifespan.
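The dependence of interfacial resistance on TIM thickness follows from one-dimensional conduction, R = t / (k·A). The sketch below illustrates this relationship; the thicknesses, conductivity, contact area, and heat load are illustrative assumptions, not measurements from this study.

```python
# Sketch: conduction resistance of a TIM layer, R = t / (k * A).
# All numerical values below are illustrative assumptions.

def tim_resistance(thickness_m: float, conductivity_w_mk: float, area_m2: float) -> float:
    """One-dimensional conduction resistance (K/W) of a TIM layer."""
    return thickness_m / (conductivity_w_mk * area_m2)

# An unevenly applied thick spot vs. an optimized thin layer
# (assumed k = 3 W/m-K gap filler over a 20 cm x 10 cm cell footprint).
area = 0.20 * 0.10
r_thick = tim_resistance(1.0e-3, 3.0, area)   # 1.0 mm local layer
r_thin = tim_resistance(0.3e-3, 3.0, area)    # 0.3 mm optimized layer

# At an assumed 50 W of heat per cell, the interface temperature drop scales with R:
q = 50.0
print(f"dT thick: {q * r_thick:.2f} K, dT thin: {q * r_thin:.2f} K")
```

Because the temperature drop across the interface is proportional to local thickness, cell-to-cell thickness variation maps directly into cell-to-cell temperature spread.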
In this paper, real-world testing was conducted on a battery pack with uneven TIM application and unoptimized distribution patterns, which resulted in significant temperature variations across the pack. In contrast, applying a uniformly optimized TIM thickness reduced these temperature differences by up to 70%, demonstrating the critical impact of consistent interface design on thermal performance.
To validate and further explain these findings, a combined conduction-convection heat transfer model was developed in ANSYS Fluent to simulate the thermal behavior of the battery pack under different TIM thicknesses and the unoptimized distribution patterns. The results confirmed that uneven TIM distribution contributes significantly to thermal non-uniformity within the battery pack, whereas optimizing the thickness improves overall thermal performance. Additionally, the optimized application significantly reduced the mass of thermal paste (TIM) used, yielding cost savings and more efficient material utilization.
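The heat path the CFD model resolves can be approximated as a series thermal network: cell surface, TIM conduction, cold plate wall, and coolant-side convection. The lumped sketch below shows how local TIM thickness shifts a cell's steady-state temperature in such a network; the heat load, convection coefficient, and geometry are assumptions for illustration, not parameters from the ANSYS Fluent setup.

```python
# Lumped sketch of the series conduction-convection path:
# cell -> TIM (conduction) -> cold plate -> coolant (convection).
# All numerical values are illustrative assumptions.

def steady_cell_temp(q_w: float, t_tim: float, k_tim: float,
                     area: float, h_conv: float, t_coolant: float) -> float:
    """Steady-state cell surface temperature (C) from a series resistance network."""
    r_cond = t_tim / (k_tim * area)   # TIM conduction resistance (K/W)
    r_conv = 1.0 / (h_conv * area)    # coolant-side convection resistance (K/W)
    return t_coolant + q_w * (r_cond + r_conv)

# Two cells in the same pack with different local TIM thicknesses
# (assumed 50 W per cell, h = 500 W/m^2-K, 25 C coolant inlet):
area, h, t_in = 0.02, 500.0, 25.0
t_uneven = steady_cell_temp(50.0, 1.2e-3, 3.0, area, h, t_in)
t_optimized = steady_cell_temp(50.0, 0.3e-3, 3.0, area, h, t_in)
print(f"cell-to-cell spread: {t_uneven - t_optimized:.2f} K")
```

Even in this crude lumped form, the conduction term alone differentiates the two cells, which is consistent with the simulation's finding that TIM thickness variation drives pack-level thermal non-uniformity.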