Compression ignition engines are widely used in the cargo and passenger transport sectors due to their high energy efficiency and their ability to operate on renewable fuels. As emission standards become increasingly stringent, new technologies are being studied and developed to reduce the emissions generated by internal combustion engines; for compression ignition (diesel) engines in particular, techniques to reduce NOx and soot have been investigated. One such technique is Ducted Fuel Injection (DFI), in which the fuel spray passes through a small cylindrical duct installed downstream of the orifice of the injector nozzle, improving air/fuel mixing, making the mixture more homogeneous, and enabling more complete combustion. This work studies the application of DFI at different compression ratios. The tests were carried out on a single-cylinder thermodynamic engine with a compression ratio of 16.0:1 in its baseline (free-spray) configuration; when the ducts are installed in the combustion chamber, the engine operates at a compression ratio of 16.5:1. This study is therefore needed to characterize the thermodynamic behavior and emissions of the engine when the compression ratio changes. CO, HC, and NOx emissions were measured with Fourier-transform infrared (FTIR) spectroscopy, and soot was measured by Laser-Induced Incandescence (LII). Given the difference in compression ratio between the DFI and free-spray configurations, soot levels increase considerably with increasing load in free-spray mode, while with DFI they remain nearly unchanged.
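The shift from 16.0:1 to 16.5:1 when the ducts are installed reflects the volume the ducts occupy in the clearance space at top dead center. A minimal sketch of this relationship, using the standard definition CR = (Vd + Vc)/Vc and a purely hypothetical displaced volume (the actual engine displacement is not given in this abstract), is:

```python
def clearance_volume(displaced_cm3: float, compression_ratio: float) -> float:
    """Clearance volume Vc from CR = (Vd + Vc) / Vc, i.e. Vc = Vd / (CR - 1)."""
    return displaced_cm3 / (compression_ratio - 1.0)

Vd = 500.0  # hypothetical single-cylinder displaced volume [cm^3], for illustration only

vc_free = clearance_volume(Vd, 16.0)  # free-spray (baseline) configuration
vc_dfi = clearance_volume(Vd, 16.5)   # ducts installed in the combustion chamber

# The difference approximates the volume taken up by the ducts and their mounts.
duct_volume = vc_free - vc_dfi
print(f"Vc free spray: {vc_free:.2f} cm^3")
print(f"Vc with DFI:   {vc_dfi:.2f} cm^3")
print(f"Implied duct volume: {duct_volume:.2f} cm^3")
```

For this assumed 500 cm³ displacement, the ducts would displace roughly 1 cm³ of clearance volume; the actual value depends on the real engine geometry.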