Browse Topic: Batteries
Linear time-invariant (LTI) reduced-order models (ROMs) have been widely used in battery thermal management simulations due to their low hardware requirements, high computational efficiency, and good accuracy. However, the inherent assumption of LTI behavior limits their applicability in scenarios with varying coolant flow rates, where this assumption is no longer valid. To address this limitation, a novel ROM is developed by decomposing the entire battery thermal system into two subsystems. All solid components are modeled as a traditional LTI ROM, while the coolant channel is represented using Newton’s cooling law. The two subsystems are then coupled through the exchange of heat transfer rate and temperature at the fluid–solid interface between the coolant and the cold plate. Model fidelity is further enhanced by introducing a spatially distributed heat flux during the generation of the LTI ROM for solid components. Validation is performed against CFD simulations at both module and
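The fluid–solid coupling described in this abstract can be sketched with Newton's cooling law at the interface plus a coolant energy balance along the channel; the symbols below (heat transfer coefficient $h$, wetted area $A$, channel perimeter $P_w$, mass flow $\dot{m}$) are generic illustrations, not the paper's notation:

```latex
q = h\,A\,\bigl(T_{\mathrm{wall}} - T_{\mathrm{coolant}}\bigr),
\qquad
\dot{m}\,c_p\,\frac{dT_{\mathrm{coolant}}}{dx}
  = h\,P_w\,\bigl(T_{\mathrm{wall}}(x) - T_{\mathrm{coolant}}(x)\bigr)
```

In such a scheme the solid-side LTI ROM supplies $T_{\mathrm{wall}}$ while the channel model integrates the coolant energy balance and returns the heat transfer rate $q$; because $h$ and $\dot{m}$ live entirely in the channel subsystem, the coolant flow rate can vary without invalidating the LTI assumption on the solid side.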
Reducing the high-voltage system of a battery electric vehicle (BEV) to a household level of 120–240 volts is considered in the paper as an effective means of addressing electrical safety, enabling maintenance and minor repairs of an electric vehicle in household conditions, and supporting distributed power supply for BEVs within walking distance of the driver. The analysis of the low-voltage electric drive is performed under the assumption that the battery has a nominal voltage of 200 volts. The issues of transforming a high-voltage machine (400 volts) into a low-voltage one (200 volts) by switching the stator phase sections from serial to parallel connection, without changing the overall dimensions and energy characteristics, are considered. It is shown that a two-motor unit built from 50-kilowatt induction machines can provide 100 kilowatts in long-term operation and up to 200 kilowatts in peak modes. The paper considers the issues of implementing a low-voltage inverter and modern trends in distributed power supply for BEVs based on low
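The series-to-parallel rewiring argument can be checked with a small idealized calculation: reconnecting two winding sections per phase from series to parallel halves the phase voltage and doubles the current capability, leaving the power rating unchanged. This is a lossless sketch for illustration only; the function name and values are hypothetical, not from the paper.

```python
def rewire_series_to_parallel(v_series, i_series, sections=2):
    """Idealized winding reconfiguration: with `sections` identical phase
    sections switched from series to parallel, phase voltage divides by
    `sections` and current capability multiplies by it, so the apparent
    power rating is preserved (losses and saturation ignored)."""
    v_parallel = v_series / sections
    i_parallel = i_series * sections
    return v_parallel, i_parallel

# Hypothetical machine: 400 V series connection rewired for a 200 V battery.
v, i = rewire_series_to_parallel(400.0, 125.0)  # -> (200.0, 250.0)
assert v * i == 400.0 * 125.0  # power rating unchanged
```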
In the design of Rechargeable Energy Storage System (RESS) structures, including battery trays, module side plates, and end plates, multiple competing factors are at play: mechanical requirements necessitate the use of electrically conductive materials (steel and aluminum); proximity between the battery module structure and the battery cells necessitates electrical isolation coatings; and module and pack designs retain cells via Structural Adhesive Material (SAM). With this design approach, organic coatings are inherently placed in a new and perilous position: in a sense, the coating becomes a supplement to the adhesive. As Computer-Aided Engineering (CAE) virtual analysis tools become more sophisticated, there is increasing reliance on these tools to predict the occurrence of structural failures in various load cases. Factors in test method, paint pretreatment, and topcoat affecting adhesion of organic coatings in structural adhesive joints are
As the utilization of lithium-ion batteries in electric vehicles expands, monitoring the usable cell capacity (UCC) is essential for ensuring accurate state-of-health (SOH) estimation. Battery performance degradation is influenced by temperature and mechanical constraints. Capacity tests in laboratory settings are typically conducted at low C-rates to approximate equilibrium conditions, whereas in real vehicle applications charging currents are often much higher. This discrepancy in rates frequently results in deviations between laboratory characterization and on-board Battery Management System (BMS) capacity estimation. To investigate how the C-rate of the diagnostic Reference Performance Test (RPT) modulates aging effects under temperature and mechanical loading, we conducted long-term cycling tests on lithium iron phosphate/graphite pouch cells at 25°C and 45°C under different constraint conditions. The cycling protocol is a tiered multi-rate protocol: cells were aged in Block 1 under 1C, and UCC
Predicting battery self-discharge across wide temperature ranges and extended durations remains a significant challenge due to the scarcity of physical test data, which is typically limited to a few temperature points and short observation windows. This limitation complicates generalization and increases the risk of inaccurate extrapolation. To address this, the paper introduces a machine learning–based framework designed to predict self-discharge behavior under diverse thermal conditions and long time horizons. Multiple modeling strategies are examined, including feedforward neural networks, long short-term memory (LSTM) architectures, synthetic data generation, and physics-informed integration of governing equations. Particular emphasis is placed on hybrid and physics-regularized models that embed first-principles relationships to guide extrapolation beyond the observed data domain. This approach mitigates the inherent instability and potential errors associated with purely data
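The physics-regularized idea in this abstract can be sketched as a composite loss: a data-fit term plus a penalty on violations of an assumed governing equation. The sketch below uses a first-order self-discharge ODE with an Arrhenius rate constant as the embedded physics; all parameter values and function names are illustrative assumptions, not the paper's model.

```python
import numpy as np

def arrhenius_rate(T_kelvin, k_ref=1e-3, Ea=5e4, T_ref=298.15, R=8.314):
    """Self-discharge rate constant with assumed Arrhenius temperature
    dependence (illustrative parameters, not from the paper)."""
    return k_ref * np.exp(-Ea / R * (1.0 / T_kelvin - 1.0 / T_ref))

def physics_residual(t, soc_pred, T_kelvin):
    """Residual of the assumed governing ODE d(SOC)/dt = -k(T) * SOC,
    approximated with finite differences via np.gradient."""
    dsoc_dt = np.gradient(soc_pred, t)
    return dsoc_dt + arrhenius_rate(T_kelvin) * soc_pred

def physics_regularized_loss(t, soc_pred, soc_meas, T_kelvin, lam=0.1):
    """Data-fit MSE plus a weighted penalty on physics violations; a model
    minimizing this is steered toward ODE-consistent extrapolation."""
    data_loss = np.mean((soc_pred - soc_meas) ** 2)
    phys_loss = np.mean(physics_residual(t, soc_pred, T_kelvin) ** 2)
    return data_loss + lam * phys_loss
```

A prediction that exactly follows the exponential decay SOC(t) = exp(-k t) incurs near-zero physics penalty, whereas a purely data-fit model can drift arbitrarily outside the observed temperature range.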
Battery thermal management is crucial for ensuring the safety, efficiency, and longevity of lithium-ion battery packs, particularly in electric vehicles (EVs). The primary purpose of a lithium-ion battery in an electric vehicle is to store and provide electrical energy for vehicle propulsion while maintaining safety under different operating conditions. This work presents a thermal correlation between 1D CFD simulation and experimental test data under passive environmental heat exchange conditions, without active coolant flow, for a battery pack comprising four modules. An environmental exchange test was conducted on a battery pack at 50% state of charge (SOC), stabilized at 25°C, to assess passive heat dissipation, thermal soak behavior, temperature distribution, and potential thermal runaway risks. Using ambient temperature and flow inputs, the simulation predictions correlate with test results within 1.5°C, which confirms the reliability of the modeling
Electric vehicles (EVs) face unique safety challenges under pole side impact conditions, largely due to the presence of floor-mounted battery packs. Existing regulatory test procedures, such as FMVSS 214, primarily address occupant injury using full-height cylindrical obstacles. These procedures were originally developed for internal combustion vehicles (ICVs). However, real-world roadside crashes frequently involve obstacles of varying heights, such as guardrails, curbs, and median bases. While these obstacles pose limited risk to the passenger compartment, they can intrude into the battery pack and trigger thermal runaway. This study investigates the influence of obstacle height on EV pole side impacts. Finite element simulations of a commercially available sedan were conducted against rigid obstacles of different heights. Results reveal a non-monotonic trend of battery intrusion governed by the interplay between rollover dynamics and structural stiffness. Theoretical analyses were
Battery modules consist of battery cells electrically joined at the terminals by conductive busbars. Laser welding is the most consistent and controllable process for creating these connections on a large scale, owing to its control over power, laser width, speed, wobble, and overlap, and weld quality is critical to battery pack performance. Tuning these parameters for an application typically requires weld trials to reach the desired weld width, penetration, and strength without overheating the battery cell and weakening the dielectric insulators around the terminals. Poorly welded cells in a module can result in increased electrical resistance, causing greater Joule heating and accelerated cell aging, and poorly welded modules can lead to uneven aging and unpredictable performance. To better understand the laser welding process, a modelling approach was developed to predict weld properties to reduce production time, costs, and potential cell damage. The 3D finite element model was calibrated
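The resistance-to-heating link this abstract relies on is just steady-state Joule heating, P = I²R: a modest rise in joint resistance scales the heat dissipated at the weld proportionally. The numbers below are hypothetical for illustration, not measured values from the paper.

```python
def joule_heating_watts(current_amps, joint_resistance_ohms):
    """Steady-state Joule heating P = I^2 * R at a welded busbar joint."""
    return current_amps ** 2 * joint_resistance_ohms

# Hypothetical comparison: 200 A pack current through a sound 50 micro-ohm
# weld versus a degraded 150 micro-ohm weld.
p_good = joule_heating_watts(200.0, 50e-6)   # 2.0 W
p_poor = joule_heating_watts(200.0, 150e-6)  # 6.0 W
```

Because heating scales with the square of current but only linearly with resistance, a poorly welded joint's extra dissipation grows quickly at high charge/discharge rates, which is why uneven weld quality translates into uneven cell aging.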