A Comparison of EGR Correction Factor Models Based on SI Engine Data
ISSN: 1946-3936, e-ISSN: 1946-3944
Published March 27, 2019 by SAE International in the United States
Citation: Smith, J., Ruprecht, D., Roberts, P., Kountouriotis, A. et al., "A Comparison of EGR Correction Factor Models Based on SI Engine Data," SAE Int. J. Engines 12(2):203-217, 2019, https://doi.org/10.4271/03-12-02-0015.
This article compares the accuracy of different exhaust gas recirculation (EGR) correction factor models under engine conditions. The effect of EGR on the laminar burning velocity of a EURO VI E10 specification gasoline (10% ethanol content by volume) was back-calculated from engine pressure trace data using the Leeds University Spark Ignition Engine Data Analysis (LUSIEDA) reverse thermodynamic code. The engine pressure data range from 5% to 25% EGR (by mass), with the running conditions, such as spark advance and pressure at intake valve closure, adjusted to maintain a constant engine load of 0.79 MPa gross mean effective pressure (GMEP). Based on the experimental data, a correlation is proposed for how the laminar burning velocity decreases with increasing EGR mass fraction. This correlation, together with existing models, was then implemented in the quasi-dimensional Leeds University Spark Ignition Engine (LUSIE) predictive engine code, and the resulting predictions were compared against measurements. The new correlation agrees well with the experimental data over the 5%-25% diluent range, providing the best fit for both engine loads investigated, whereas existing models tend to overpredict the reduction in burning velocity due to EGR.
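The abstract does not give the fitted coefficients of the proposed correlation, but EGR correction factors of this kind are commonly expressed as a multiplicative reduction of the undiluted laminar burning velocity. The sketch below illustrates that general structure only; the function name and the coefficients `a` and `b` are hypothetical placeholders, not the paper's fitted values.

```python
def egr_corrected_burning_velocity(u_l0, x_egr, a=2.0, b=0.8):
    """Apply a generic power-law EGR dilution correction factor.

    Illustrative sketch only: a common functional form multiplies the
    undiluted laminar burning velocity by (1 - a * x_egr**b). The
    coefficients a and b here are placeholders, not the correlation
    fitted in the paper.

    u_l0  : undiluted laminar burning velocity (m/s)
    x_egr : EGR mass fraction, 0 <= x_egr < 1
    """
    if not 0.0 <= x_egr < 1.0:
        raise ValueError("EGR mass fraction must lie in [0, 1)")
    factor = 1.0 - a * x_egr ** b
    # Clamp at zero: a correction factor cannot make the velocity negative.
    return u_l0 * max(factor, 0.0)


# Example usage with placeholder values: an undiluted burning velocity of
# 0.4 m/s evaluated across the 5%-25% EGR range studied in the paper.
for x in (0.05, 0.15, 0.25):
    print(x, egr_corrected_burning_velocity(0.4, x))
```

As the loop shows, a power-law exponent below one makes the first few percent of EGR cost disproportionately more burning velocity than later increments, which is the qualitative behaviour such correction factor models are built to capture.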