Brake drag in disc brakes occurs during the off-brake phase, when the brake is not applied but friction contact between the brake disc and the pads persists. First and foremost, the resulting drag torque increases energy consumption, where a few newton meters of drag can have a significant impact on the crucial factor of battery-electric vehicles: range. Moreover, brake drag accelerates pad wear and increases taper wear of the pads. The additional wear can also imply increased brake particle emissions, which will be limited by upcoming regulations due to their potential health risk.
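To illustrate why a few newton meters of residual drag torque matter for range, the following minimal sketch estimates the resulting power loss and its share of total consumption. All numerical values (drag torque, wheel radius, speed, consumption) are illustrative assumptions and not measured results from this work.

```python
# Rough estimate of power and energy lost to residual brake drag.
# All values below are illustrative assumptions, not measurements.

WHEEL_RADIUS_M = 0.32              # assumed dynamic wheel radius
DRAG_TORQUE_NM = 3.0               # assumed residual drag torque per brake
NUM_BRAKES = 4
SPEED_KMH = 100.0                  # assumed constant driving speed
CONSUMPTION_KWH_PER_100KM = 18.0   # assumed BEV energy consumption

speed_ms = SPEED_KMH / 3.6
wheel_speed_rad_s = speed_ms / WHEEL_RADIUS_M                    # omega = v / r
power_loss_w = DRAG_TORQUE_NM * wheel_speed_rad_s * NUM_BRAKES   # P = T * omega

# Energy lost per 100 km at constant speed and its share of total consumption
time_per_100km_h = 100.0 / SPEED_KMH
energy_loss_kwh = power_loss_w / 1000.0 * time_per_100km_h
share = energy_loss_kwh / CONSUMPTION_KWH_PER_100KM

print(f"Drag power loss: {power_loss_w:.0f} W")
print(f"Energy per 100 km: {energy_loss_kwh:.2f} kWh ({share:.1%} of consumption)")
```

Under these assumptions, 3 Nm of drag per brake corresponds to roughly 1 kW of continuous loss at 100 km/h, i.e. about 1 kWh per 100 km or several percent of the vehicle's total consumption.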
In this light, different countermeasures aim to create and maintain a sufficient air gap between the brake disc and the pads when the brake is released, in order to avoid residual friction contact. Among others, these include optimizing piston retraction by adjusting the seal grooves and integrating pad springs into the caliper that push the pads back. The state of the art for analyzing the effectiveness of such countermeasures are component-level tests on brake dynamometers, as they provide the high repeatability and accuracy necessary to develop brakes with near-zero drag. However, these laboratory conditions usually exclude influencing factors that are present at the vehicle level and can have a significant impact on brake drag, for example lateral acceleration. Therefore, this work uses a prototypical drag torque measurement system based on piezoelectric sensors to analyze whether countermeasures that have proven effective at the component level also reduce brake drag at the vehicle level. A default brake setup with relatively high brake drag is compared to an optimized setup in chassis dynamometer tests, selected driving maneuvers on proving grounds, and a real-driving cycle on the road.