Recent and forthcoming fuel consumption reduction requirements and exhaust emissions regulations are forcing the development of innovative and particularly complex intake-engine-exhaust layouts. In the case of Spark Ignition (SI) engines, the need to further reduce fuel consumption has led to the adoption of direct injection systems, displacement downsizing, and challenging intake-exhaust configurations, such as multi-stage turbocharging or turbo-assist solutions. Furthermore, the most recent turbo-GDI engines may be equipped with other fuel-reduction oriented technologies, such as Variable Valve Timing (VVT) systems, devices for actively controlling the tumble/swirl in-cylinder flow components, and Exhaust Gas Recirculation (EGR) systems. Such a degree of flexibility has one main drawback: the exponentially increasing effort required for optimal engine control calibration. Even though highly efficient, statistically-based experimental designs have recently been introduced as standard protocols for test-cell calibration activity, the time and instrumentation required to obtain a fully-validated test-cell calibration dataset have been increasing steeply over the last few years.
The methodology proposed in this paper builds on computing technologies, and on a deeper understanding of the physical phenomena involved, that have become accessible only very recently. While the availability of dimensional models fast enough to be run within an iterative loop (aimed at optimizing pre-designed cost or goal functions) enables virtual-engine based calibration techniques, the challenge is to identify the best way to take advantage of them.
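To make the idea of such an iterative loop concrete, the following minimal sketch wraps a fast engine model inside an optimizer that searches the calibration parameters minimizing a pre-designed goal function. It is purely illustrative and not the toolchain used in this work: the surrogate model, parameter names, limit values, and penalty weights are all assumptions.

```python
# Minimal sketch of a virtual-engine calibration loop (illustrative only):
# the "engine model" below is a quadratic stand-in surrogate; in practice it
# would be a call to the reduced, still-dimensional engine simulation.
import numpy as np
from scipy.optimize import minimize

def fast_engine_model(spark_adv, vvt_intake, vvt_exhaust):
    """Stand-in surrogate returning (BSFC [g/kWh], turbine inlet T [K])."""
    bsfc = 240.0 + 0.05 * (spark_adv - 18.0) ** 2 + 0.02 * (vvt_intake - 25.0) ** 2
    t_turb = 920.0 + 2.0 * max(0.0, 20.0 - spark_adv) + 0.5 * vvt_exhaust
    return bsfc, t_turb

def cost(x):
    spark_adv, vvt_intake, vvt_exhaust = x
    bsfc, t_turb = fast_engine_model(spark_adv, vvt_intake, vvt_exhaust)
    # Pre-designed goal function: fuel consumption plus a penalty enforcing
    # a component-protection limit on turbine inlet temperature.
    penalty = 1.0e3 * max(0.0, t_turb - 950.0)
    return bsfc + penalty

x0 = np.array([10.0, 0.0, 10.0])                  # initial calibration guess
bounds = [(0.0, 40.0), (0.0, 50.0), (0.0, 50.0)]  # actuator ranges
result = minimize(cost, x0, method="Nelder-Mead", bounds=bounds)
print("optimal calibration set:", result.x)
```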
One necessary step is the reduction of fully 1-D engine models to simpler (and faster-to-solve) but still-dimensional representations of the engine thermo-fluid dynamics. One outcome of this work is the demonstration that fully-automatic geometry simplification (intended to reduce the computational effort) does not by itself guarantee model consistency; the main reason is the assignment of inadequate boundary parameters (such as imposed wall temperatures) to the elements that result from merging several components in an effort to reduce model complexity.
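As an illustration of how merging elements can produce an inadequate boundary parameter, the short sketch below contrasts a naive average wall temperature with a heat-transfer-area-weighted one for a merged duct. The data structure, geometry, and temperature values are hypothetical and only serve to show the effect, not the actual model data.

```python
# Illustrative sketch of how merging duct elements can yield an inadequate
# imposed wall temperature; the data structure and values are hypothetical.
import math
from dataclasses import dataclass

@dataclass
class Duct:
    length_m: float     # element length [m]
    diameter_m: float   # element diameter [m]
    wall_temp_K: float  # imposed wall temperature [K]

    @property
    def wall_area_m2(self) -> float:
        return math.pi * self.diameter_m * self.length_m

# A short, hot exhaust runner merged with a longer, cooler downpipe section.
elements = [Duct(0.10, 0.035, 1050.0), Duct(0.60, 0.050, 750.0)]

# Naive merge: plain average of the wall temperatures, geometry ignored.
naive_T = sum(e.wall_temp_K for e in elements) / len(elements)

# Heat-transfer-area-weighted merge: each element weighted by its wall surface.
total_area = sum(e.wall_area_m2 for e in elements)
weighted_T = sum(e.wall_temp_K * e.wall_area_m2 for e in elements) / total_area

print(f"naive merged wall T: {naive_T:.0f} K, area-weighted: {weighted_T:.0f} K")
```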
The second, and most important, phase is the definition of the calibration scheme. As always happens with model-based design, the goals of the overall activity must be closely matched to the accuracy of the simulation tool. The present project demonstrates the possibility of using simulation tools in a new environment, somewhere in-between desktop, design-oriented simulation (1-D and 3-D models) and real-time, model-based control (0-D). Model reliability, and therefore the consistency of the geometry reduction, has been carefully checked to avoid significant accuracy losses, especially for the variables used for virtual calibration. The paper also introduces and accounts for the limits of the model, and discusses the definition of cost functions and constraints (related to emissions limitation, fuel consumption reduction, and component-protection criteria).
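As a purely illustrative example of how such a goal function and its constraints could be encoded, a compact sketch is given below; the limit values and output names are hypothetical placeholders, not those adopted in the project.

```python
# Hypothetical encoding of the calibration goal function and constraints;
# all limit values are illustrative placeholders.
T_EXH_MAX_K  = 1223.0   # component protection: turbine inlet temperature limit
LAMBDA_MIN   = 0.85     # enrichment limit (emissions / catalyst protection)
KNOCK_MARGIN = 1.5      # required margin to knock-limited spark advance [deg CA]

def goal(outputs: dict) -> float:
    """Goal function: minimize brake-specific fuel consumption [g/kWh]."""
    return outputs["bsfc"]

def feasible(outputs: dict) -> bool:
    """True only if every protection/emission constraint is satisfied."""
    return (outputs["t_turbine_in"] <= T_EXH_MAX_K
            and outputs["lambda"] >= LAMBDA_MIN
            and outputs["knock_margin"] >= KNOCK_MARGIN)
```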
Finally, the paper shows the application of the overall virtual-engine based calibration methodology to a Gasoline Direct Injection (GDI) turbocharged engine equipped with tumble flaps and a VVT system. Simulation results, and the corresponding look-up-table calibrations, are compared with experimentally measured ones obtained with similar sets of calibration parameters, demonstrating the potential of the proposed methodology as an intermediate step between engine development and calibration-related test-cell (and on-board) activities.
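To indicate how the look-up-table side of such a procedure could be organized, the minimal sketch below fills an intake-VVT table by repeating a per-operating-point virtual optimization over a speed/load grid. The breakpoints, the optimize_point() routine, and its dummy response are illustrative assumptions, not the tables or grids used in the project.

```python
# Sketch of filling a calibration look-up table from the virtual engine:
# one optimization per speed/load breakpoint (all names are placeholders).
import numpy as np

speed_bp = np.arange(1000, 6001, 500)   # engine speed breakpoints [rpm]
load_bp  = np.linspace(0.1, 1.0, 10)    # normalized load breakpoints [-]

vvt_intake_lut = np.zeros((len(speed_bp), len(load_bp)))

def optimize_point(speed, load):
    """Placeholder for the per-operating-point virtual calibration (e.g. the
    constrained loop sketched earlier); returns an intake-VVT angle [deg CA]."""
    return 25.0 * load + 0.002 * (speed - 1000)   # dummy response

for i, speed in enumerate(speed_bp):
    for j, load in enumerate(load_bp):
        vvt_intake_lut[i, j] = optimize_point(speed, load)
```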