Browse Topic: Driver assistance systems
Simulation has become mission-critical for ADAS development. Model-based systems engineering can integrate modeling and simulation from the start of the design process. Advanced Driver Assistance Systems (ADAS) are transforming vehicle safety, acting as the bridge between conventional driving and full autonomy. From adaptive cruise control to emergency braking and blind-spot detection, these technologies rely on a dense network of radar sensors, antennas, electronic control units and software. What unites them is the need for precise functionality under complex real-world conditions. Achieving full reliability requires more than testing on the road; it demands a virtual approach grounded in simulation. As new vehicles integrate dozens of sensors into tightly constrained spaces, even subtle design decisions can affect system performance. Radar solutions, in particular, present unique challenges, especially as vehicle surfaces
While electric powertrains are driving 48V adoption, OEMs are realizing that both xEV and ICE vehicles can benefit from a shift away from 12-volt architectures. In every corner of the automotive power engineering world, there are discussions and debates over the merits of 48V power networks versus legacy 12V power networks. The dialogue started over 20 years ago, but now the tone is more serious. It is not a case of everything old being new again, but the result of a growing appetite for more electrical power in vehicles. Today's vehicles - and the coming generations - require more power for their ADAS and other safety systems, infotainment systems and overall passenger comfort systems. To satisfy the growing demand for low-voltage power, the capacity of the low-voltage power network must be boosted to two or three times that of the late 20th century. Delivering power is more efficient at a higher voltage, and today 48V is the consensus choice for that higher level.
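As a back-of-the-envelope illustration of why delivering power is more efficient at higher voltage (not a calculation from the article), the sketch below compares wiring current and resistive loss for the same load on a 12V and a 48V bus; the 3 kW load and 10 milliohm harness resistance are assumed values chosen only to make the arithmetic concrete.

```python
# Illustrative comparison: for the same delivered power, a 48 V network needs
# one quarter of the current of a 12 V network, so the I^2*R loss in the wiring
# drops by a factor of 16 for equal harness resistance.

def harness_loss_w(load_power_w: float, bus_voltage_v: float, wire_resistance_ohm: float) -> float:
    """Resistive loss in the supply wiring for a given load power and bus voltage."""
    current_a = load_power_w / bus_voltage_v
    return current_a ** 2 * wire_resistance_ohm

LOAD_W = 3000.0     # assumed high-demand ADAS/comfort load
R_HARNESS = 0.01    # assumed 10 milliohm harness resistance

for v in (12.0, 48.0):
    loss = harness_loss_w(LOAD_W, v, R_HARNESS)
    print(f"{v:>4.0f} V bus: {LOAD_W / v:6.1f} A, {loss:6.1f} W lost in wiring")
```

With these assumed numbers the 12V bus carries 250 A and loses 625 W in the harness, while the 48V bus carries 62.5 A and loses about 39 W, which is the basic argument for the higher-voltage network.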
With the surge in adoption of artificial intelligence (AI) in automotive systems, especially Advanced Driver Assistance Systems (ADAS) and autonomous vehicles (AV), comes an increase in AI-related incidents, several of which have ended in injuries and fatalities. These incidents all share a common deficiency: insufficient coverage of safety, ethical, and/or legal requirements. Responsible AI (RAI) is an approach to developing AI-enabled systems that systematically takes such requirements into account. Existing published international standards such as ISO 21448:2022 (Safety of the Intended Functionality) and ISO 26262:2018 (Road Vehicles - Functional Safety) do offer some guidance in this regard but are far from sufficient. Therefore, several technical standards are emerging concurrently to address various RAI-related challenges, including but not limited to ISO 8800 for the integration of AI in automotive systems, ISO/IEC TR 5469:2024 for the integration of AI in functional
In the domain of advanced driver assistance systems and autonomous vehicles, precise perception and interpretation of the vehicle's environment are not merely requirements; they are the very foundation upon which every aspect of functionality and safety is constructed. One prevalent method of representing the environment is the occupancy grid map. This map segments the environment into distinct grid cells, each of which is evaluated to determine whether it is occupied or free. This evaluation operates under the assumption that each grid cell is independent of the others. The underlying mathematical structure of this approach is the binary Bayes filter (BBF). The BBF integrates sensor data from various sources and can incorporate measurements taken at different times. The occupancy grid map does not rely on the identification of individual objects, which allows it to depict obstacles of any shape. This flexibility is a key advantage of the approach. Traditional occupancy grid
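For readers unfamiliar with the binary Bayes filter mentioned above, here is a minimal log-odds sketch of a per-cell update under the stated independence assumption; the inverse-sensor-model probabilities (0.7/0.3) and the measurement sequence are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Log-odds binary Bayes filter update for a single occupancy grid cell.
L_OCC = np.log(0.7 / 0.3)    # log-odds added when a measurement says "occupied"
L_FREE = np.log(0.3 / 0.7)   # log-odds added when a measurement says "free"
L_PRIOR = 0.0                # log-odds of the 0.5 occupancy prior

def update_cell(l_prev: float, hit: bool) -> float:
    """One BBF update step for one grid cell in log-odds form."""
    return l_prev + (L_OCC if hit else L_FREE) - L_PRIOR

def occupancy_probability(l: float) -> float:
    """Convert accumulated log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + np.exp(l))

# Fuse three measurements of the same cell taken at different times.
l = L_PRIOR
for hit in (True, True, False):
    l = update_cell(l, hit)
print(f"P(occupied) = {occupancy_probability(l):.2f}")
```

The log-odds form is what makes fusing measurements from different sensors and different times a simple addition per cell.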
Autonomous vehicles utilise sensors, control systems and machine learning to independently navigate and operate within their surroundings, offering improved road safety, traffic management and enhanced mobility. This paper details the development, software architecture and simulation of control algorithms for key functionalities in a model that approaches Level 2 autonomy, utilising MATLAB Simulink and IPG CarMaker. The focus is on four critical areas: Autonomous Emergency Braking (AEB), Adaptive Cruise Control (ACC), Lane Detection (LD) and Traffic Object Detection. In addition, the integration of low-level PID controllers for precise steering, braking and throttle actuation ensures smooth and responsive vehicle behaviour. The hardware architecture is built around the Nvidia Jetson Nano and multiple Arduino Nano microcontrollers, each responsible for controlling specific actuators within the drive-by-wire system, which includes the steering, brake and throttle actuators. Communication
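A minimal sketch of the kind of low-level PID loop described above, applied here to speed tracking for throttle and brake actuation; the gains, output limits and set speed are assumed values for illustration, not the paper's tuning.

```python
# Discrete PID controller with output clamping, as commonly used for
# throttle/brake or steering actuation in a drive-by-wire stack.

class PID:
    def __init__(self, kp: float, ki: float, kd: float, out_min: float, out_max: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint: float, measurement: float, dt: float) -> float:
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, out))  # clamp actuator command

# Example: track a 15 m/s ACC set speed; positive output -> throttle, negative -> brake.
speed_ctrl = PID(kp=0.8, ki=0.1, kd=0.05, out_min=-1.0, out_max=1.0)
command = speed_ctrl.step(setpoint=15.0, measurement=12.5, dt=0.02)
print(f"actuator command: {command:+.2f}")
```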
The automotive industry's focus on driver safety continues to drive improvements and advancements in Advanced Driver Assistance Systems (ADAS) that avoid collisions and provide safety and comfort to drivers. This paper proposes a novel approach to driver health and fatigue monitoring that uses cabin cameras and biometric sensors communicating continuously with the vehicle telematics system to enhance real-time monitoring and predictive intervention. Data from the camera and biometric sensors is sent to a machine learning algorithm (LSBoost), which processes the data; if anything concerning is detected in the driver's condition or behavior, the system immediately communicates with the vehicle telematics unit and sends information to emergency services. This approach enhances driver safety and reduces accidents caused by health-related driver impairment. The system comprises several sensors, and fusion algorithms are applied between the different sensors, such as the cabin camera and biometric sensors, all
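The paper's classifier is LSBoost (MATLAB's least-squares boosting); as a rough open-source stand-in, the sketch below uses scikit-learn's gradient boosting with its default squared-error loss to show the fuse-then-score idea. The feature names, synthetic training data and intervention threshold are all illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# Fused features per sample: [eye_closure_ratio, blink_rate, heart_rate, hrv]
X = rng.random((500, 4)) * [1.0, 30.0, 120.0, 100.0]
# Synthetic "impairment score" target, just to make the sketch runnable.
y = 0.6 * X[:, 0] + 0.002 * X[:, 2] + rng.normal(0, 0.05, 500)

# Default loss is squared error, analogous in spirit to least-squares boosting.
model = GradientBoostingRegressor(n_estimators=100)
model.fit(X, y)

sample = np.array([[0.85, 4.0, 110.0, 20.0]])   # drowsy-looking, elevated heart rate
score = model.predict(sample)[0]
if score > 0.5:                                  # assumed intervention threshold
    print("Impairment suspected: notify telematics / emergency services")
```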
This SAE Recommended Practice establishes a uniform, powered vehicle test procedure and minimum performance requirement for lane departure warning systems used in highway trucks and buses greater than 4546 kg (10,000 pounds) gross vehicle weight (GVW). Systems similar in function but different in scope and complexity, including lane keeping/lane assist and merge assist, are not included in this document. This document does not apply to trailers, dollies, etc. This document does not intend to exclude any particular system or sensor technology. This document will test the functionality of the lane departure warning system (LDWS) (e.g., ability to detect lane presence and ability to detect an unintended lane departure), its ability to indicate LDWS engagement, its ability to indicate LDWS disengagement, and its ability to determine the point at which the LDWS notifies the human-machine interface (HMI) or vehicle control system that a lane departure event is detected. Moreover, this
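To make the tested behavior concrete, here is a hedged sketch of the kind of trigger logic an LDWS test procedure exercises: warn when the vehicle drifts toward a lane boundary without the turn signal active. The lane geometry, warning margin and sign conventions are assumptions for illustration, not requirements taken from the Recommended Practice.

```python
# Toy unintended-lane-departure check. Lateral offset is measured from lane
# center (positive toward one side); a warning is issued when the vehicle edge
# approaches the boundary while still drifting outward and no turn signal is on.

def ldws_warning(lateral_offset_m: float,
                 lateral_rate_mps: float,
                 half_lane_width_m: float = 1.8,
                 half_vehicle_width_m: float = 1.25,
                 warn_margin_m: float = 0.3,
                 turn_signal_on: bool = False) -> bool:
    """Return True if an unintended lane departure warning should be issued."""
    if turn_signal_on:
        return False                                          # intended lane change
    edge_to_boundary = half_lane_width_m - (abs(lateral_offset_m) + half_vehicle_width_m)
    drifting_out = lateral_rate_mps * lateral_offset_m > 0    # moving toward that boundary
    return edge_to_boundary <= warn_margin_m and drifting_out

print(ldws_warning(lateral_offset_m=0.35, lateral_rate_mps=0.4))   # True: drifting outward
print(ldws_warning(lateral_offset_m=0.35, lateral_rate_mps=-0.4))  # False: correcting back
```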
The rapid evolution of new technologies in the automotive sector is driving the demand for advanced simulation solutions, enabling faster software development cycles. Developers often encounter challenges in managing the vast amounts of data generated during testing. For example, a single Advanced Driver Assistance System (ADAS) test vehicle can produce several terabytes of data daily. Efficiently handling and distributing this data across multiple locations can introduce delays in the development process. Moreover, the large volume of test cases required for simulation and validation further exacerbates these delays. On-premises simulation setups, especially those dependent on High-Performance Computing (HPC) systems, pose several challenges, including limited computational resources, scalability issues, high capital and maintenance costs, resource management inefficiencies, and compatibility problems between GPU drivers and servers, all of which can impact both performance and costs
The off-highway industry is seeing vast growth in the integration of new technologies such as advanced driver assistance systems (ADAS/ADS) and connectivity into vehicles. This is primarily due to the need to provide a safe operational domain for operators and other people. Building a full perception of the vehicle's surroundings can be challenging due to the unstructured nature of the field of operation. This research proposes a novel collective perception system that utilizes a C-V2X Roadside Unit (RSU)-based object detection system as well as an onboard perception system. The vehicle uses the input from both systems to maneuver safely through the operational field. This article also explores implementing a software-defined vehicle (SDV) architecture on an off-highway vehicle, aiming to consolidate the ADAS system hardware and enable over-the-air (OTA) software update capability. Test results showed that FEV's collective perception system was able to provide the necessary nearby and non-line
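A minimal sketch of the collective-perception idea described above: merge the RSU object list with the onboard list so that RSU-only objects (for example, non-line-of-sight ones) are retained in a single picture. The data structure and the 2 m association gate are assumptions for illustration, not details from the article.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class DetectedObject:
    x: float          # position in a common map frame, meters
    y: float
    source: str       # "rsu" or "onboard"

def fuse(onboard: list[DetectedObject], rsu: list[DetectedObject],
         gate_m: float = 2.0) -> list[DetectedObject]:
    """Keep all onboard objects; add RSU objects not matched within the gate."""
    fused = list(onboard)
    for obj in rsu:
        if all(hypot(obj.x - o.x, obj.y - o.y) > gate_m for o in onboard):
            fused.append(obj)
    return fused

onboard = [DetectedObject(10.0, 2.0, "onboard")]
rsu = [DetectedObject(10.5, 2.2, "rsu"),     # duplicate of the onboard detection
       DetectedObject(35.0, -4.0, "rsu")]    # occluded object only the RSU can see
print([(o.x, o.y, o.source) for o in fuse(onboard, rsu)])
```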
Sensata Technologies' booth at this year's IAA Transportation trade show included two of the company's PreView radar sensors. The PreView STA79 is a heavy-duty vehicle side-monitoring system launched in May 2024 and designed to comply with Europe-wide blind spot monitoring legislation introduced in June 2024. The PreView Sentry 79 is a front- and rear-monitoring system. Both systems operate in the 79-GHz band, as the nomenclature suggests. The PreView STA79 can cover up to three vehicle zones: a configurable center zone, which can monitor the length of the vehicle, and two further zones that can be independently set to align with individual customer needs. The system offers a 180-degree field of view to eliminate blind spots along the vehicle sides and a built-in measurement unit that will increase the alert level when turning toward an object, even when the turn indicator is not used. The system also features trailer mitigation to reduce false-positive alerts on the trailer when turning. The
Exactly when sensor fusion occurs in ADAS operations, early or late, impacts the entire system. Governments have been studying Advanced Driver Assistance Systems (ADAS) since at least the late 1980s. Europe's Generic Intelligent Driver Support initiative ran from 1989 to 1992 and aimed “to determine the requirements and design standards for a class of intelligent driver support systems which will conform with the information requirements and performance capabilities of the individual drivers.” Automakers have spent the past 30 years rolling out such systems to the buying public. Toyota and Mitsubishi started offering radar-based cruise control to Japanese drivers in the mid-1990s. Mercedes-Benz took the technology global with its Distronic adaptive cruise control in the 1998 S-Class. Cadillac followed two years later with FLIR-based night vision on the 2000 DeVille DTS. And in 2003, Toyota launched an automated parallel parking technology called Intelligent Parking Assist on the
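To illustrate the early-versus-late distinction raised in the opening sentence (and not any specific production system), the toy sketch below fuses two sensors either at the feature level or at the decision level; all feature values and thresholds are made up, and the two strategies can disagree on the same inputs.

```python
import numpy as np

radar_features = np.array([0.9, 0.05])   # e.g., range-rate confidence, cross-section
camera_features = np.array([0.7, 0.8])   # e.g., classifier score, bounding-box overlap

def early_fusion(radar: np.ndarray, camera: np.ndarray) -> bool:
    """Early: combine raw features first, then make one joint decision."""
    combined = np.concatenate([radar, camera])   # one joint feature vector
    score = combined.mean()                      # stand-in for a joint model
    return score > 0.5

def late_fusion(radar: np.ndarray, camera: np.ndarray) -> bool:
    """Late: each sensor decides alone, then the decisions are combined."""
    radar_says_object = radar.mean() > 0.5
    camera_says_object = camera.mean() > 0.5
    return radar_says_object and camera_says_object

print("early fusion:", early_fusion(radar_features, camera_features))  # True
print("late fusion: ", late_fusion(radar_features, camera_features))   # False
```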
With increasing emphasis on sustainable mobility and efficient energy use, advanced driver assistance systems (ADAS) can potentially be utilized to improve vehicles' energy efficiency by influencing driver behavior. Despite the growing adoption of such systems in passenger vehicles for active safety and driver comfort, systematic studies examining the effects of ADAS on human driving, in the context of vehicle energy use, remain scarce. This study investigates the impacts of a driver speed advisory system on energy use in a plug-in hybrid electric vehicle (PHEV) through a controlled experiment using a driving simulator. A mixed urban-highway driving environment was reconstructed by digitizing a real-world route to observe the human driver's behavior with and without driving assistance. The advisory system provided drivers with an optimized speed profile, pre-calculated for the simulated route to achieve maximum energy efficiency. Participants were instructed to navigate the