A Machine Learning-Genetic Algorithm (ML-GA) Approach for Rapid Optimization Using High-Performance Computing

Journal Article
2018-01-0190
ISSN: 1946-391X, e-ISSN: 1946-3928
Published April 03, 2018 by SAE International in the United States
Citation: Moiz, A., Pal, P., Probst, D., Pei, Y. et al., "A Machine Learning-Genetic Algorithm (ML-GA) Approach for Rapid Optimization Using High-Performance Computing," SAE Int. J. Commer. Veh. 11(5):291-306, 2018, https://doi.org/10.4271/2018-01-0190.
Language: English

Abstract:

A Machine Learning-Genetic Algorithm (ML-GA) approach was developed to virtually discover optimum designs using training data generated from multi-dimensional simulations. Machine learning (ML) presents a pathway to transform the complex physical processes that occur in a combustion engine into compact informational processes. In the present work, a total of over 2000 sector-mesh computational fluid dynamics (CFD) simulations of a heavy-duty engine were performed. These were run concurrently on a supercomputer to reduce overall turnaround time. The engine being optimized was run on low-octane (RON70) gasoline fuel under partially premixed compression ignition (PPCI) mode. A total of nine input parameters were varied, and the CFD simulation cases were generated by randomly sampling points from this nine-dimensional input space. These input parameters included the fuel injection strategy, injector design, and various in-cylinder flow and thermodynamic conditions at intake valve closure (IVC). The outputs (targets) of interest from these simulations included five metrics related to engine performance and emissions. The over 2000 samples generated from CFD were then used to train an ML model that could predict these five targets based on the nine input features. A robust super learner approach was employed to build the ML model, where results from a collection of different ML algorithms were pooled together. Thereafter, a stochastic global optimization genetic algorithm (GA) was used, with the ML model as the objective function, to optimize the input parameters based on a merit function to minimize fuel consumption while satisfying CO and NOx emissions constraints. The optimized configuration from ML-GA was found to be very close to that obtained from a sequentially performed CFD-GA approach, where a CFD simulation served as the objective function. In addition, the overall turnaround time was (at least) 75% lower with the ML-GA approach, as the training data were generated from concurrent CFD simulations, and employing the ML model as the objective function significantly accelerated the GA optimization. This study demonstrates the potential of ML-GA and high-performance computing (HPC) to reduce the number of CFD simulations required for optimization problems without loss of accuracy, thereby providing significant cost savings compared to traditional approaches.
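
The workflow described above, namely sampling a nine-dimensional design space, training a pooled ("super learner") surrogate on the CFD results, and then driving a GA against that surrogate with a constrained merit function, can be sketched in a few dozen lines. The snippet below is an illustrative outline only: the stacked scikit-learn ensemble, the random placeholder data, the merit-function limits and penalty weight, and the hand-rolled GA operators are assumptions made for demonstration, not the authors' implementation or data.

```python
import numpy as np
from sklearn.ensemble import (GradientBoostingRegressor, RandomForestRegressor,
                              StackingRegressor)
from sklearn.linear_model import RidgeCV
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)

# 1. Training data: nine normalized input features and five targets.
#    Random placeholders stand in for the CFD-generated samples.
n_samples, n_inputs, n_targets = 2000, 9, 5
X = rng.uniform(0.0, 1.0, size=(n_samples, n_inputs))
Y = rng.normal(size=(n_samples, n_targets))

# 2. Pooled ("super learner"-style) surrogate: several base regressors
#    combined by a stacking meta-learner, fitted once per target.
base = StackingRegressor(
    estimators=[("gbr", GradientBoostingRegressor()),
                ("rf", RandomForestRegressor(n_estimators=100))],
    final_estimator=RidgeCV(),
)
surrogate = MultiOutputRegressor(base).fit(X, Y)

# 3. Merit function: reward low fuel consumption, penalize violations of
#    CO and NOx limits. Limits and penalty weight are illustrative only.
def merit(x, co_limit=1.0, nox_limit=1.0):
    fuel, co, nox, _, _ = surrogate.predict(x.reshape(1, -1))[0]
    penalty = 100.0 * (max(co - co_limit, 0.0) + max(nox - nox_limit, 0.0))
    return -fuel - penalty  # higher merit is better

# 4. Simple generational GA over the nine design variables (unit box).
def run_ga(pop_size=64, n_gen=50, mut_sigma=0.05):
    pop = rng.uniform(0.0, 1.0, size=(pop_size, n_inputs))
    for _ in range(n_gen):
        fitness = np.array([merit(ind) for ind in pop])
        # Binary tournament selection.
        i, j = rng.integers(pop_size, size=(2, pop_size))
        parents = pop[np.where(fitness[i] >= fitness[j], i, j)]
        # Uniform crossover between consecutive parents.
        mask = rng.random(parents.shape) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation, clipped back to the design bounds.
        pop = np.clip(children + rng.normal(0.0, mut_sigma, children.shape), 0.0, 1.0)
    return pop[np.argmax([merit(ind) for ind in pop])]

best_design = run_ga()
print("Candidate optimum (normalized inputs):", best_design)
```

In the actual study, the training inputs and targets would come from the concurrent CFD runs rather than random placeholders, and the surrogate and GA settings would be chosen with cross-validation and a dedicated optimization library rather than the minimal operators sketched here.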