Browse Topic: Driver assistance systems
With the surge in adoption of artificial intelligence (AI) in automotive systems, especially Advanced Driver Assistance Systems (ADAS) and autonomous vehicles (AV), comes an increase in AI-related incidents, several of which have ended in injuries and fatalities. These incidents share a common deficiency: insufficient coverage of safety, ethical, and/or legal requirements. Responsible AI (RAI) is an approach to developing AI-enabled systems that systematically takes such requirements into account. Existing published international standards such as ISO 21448:2022 (Safety of the Intended Functionality) and ISO 26262:2018 (Road Vehicles – Functional Safety) offer some guidance in this regard but are far from sufficient. Therefore, several technical standards are emerging concurrently to address various RAI-related challenges, including but not limited to ISO 8800 for the integration of AI in automotive systems, ISO/IEC TR 5469:2024 for the integration of AI in functional
Autonomous vehicles utilise sensors, control systems and machine learning to navigate and operate independently through their surroundings, offering improved road safety, traffic management and enhanced mobility. This paper details the development, software architecture and simulation of control algorithms for key functionalities in a model that approaches Level 2 autonomy, using MATLAB Simulink and IPG CarMaker. The focus is on four critical areas: Autonomous Emergency Braking (AEB), Adaptive Cruise Control (ACC), Lane Detection (LD) and Traffic Object Detection. In addition, the integration of low-level PID controllers for precise steering, braking and throttle actuation ensures smooth and responsive vehicle behaviour. The hardware architecture is built around the Nvidia Jetson Nano and multiple Arduino Nano microcontrollers, each responsible for controlling specific actuators within the drive-by-wire system, which includes the steering, brake and throttle actuators. Communication
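The low-level actuator loops described above can be illustrated with a minimal discrete PID controller. This is a generic sketch, not the paper's implementation: the gains, output limits and 100 Hz loop rate are hypothetical placeholders.

```python
class PID:
    """Minimal discrete PID controller with output saturation."""

    def __init__(self, kp, ki, kd, dt, out_min=-1.0, out_max=1.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        out = (self.kp * error
               + self.ki * self.integral
               + self.kd * derivative)
        # Clamp to the normalized actuator range
        return max(self.out_min, min(self.out_max, out))

# Example: a throttle loop tracking a target speed at 100 Hz
throttle_pid = PID(kp=0.5, ki=0.1, kd=0.05, dt=0.01)
cmd = throttle_pid.update(setpoint=15.0, measurement=12.0)  # speeds in m/s
```

In a drive-by-wire setup like the one described, one such loop would typically run per actuator (steering, brake, throttle), with the clamped output mapped to the corresponding actuator command.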
This SAE Recommended Practice establishes a uniform powered vehicle test procedure and minimum performance requirement for lane departure warning systems used in highway trucks and buses greater than 4546 kg (10,000 pounds) gross vehicle weight (GVW). Systems similar in function but different in scope and complexity, including lane keeping/lane assist and merge assist, are not included in this document. This document does not apply to trailers, dollies, etc. This document does not intend to exclude any particular system or sensor technology. This document will test the functionality of the lane departure warning system (LDWS) (e.g., ability to detect lane presence and ability to detect an unintended lane departure), its ability to indicate LDWS engagement, its ability to indicate LDWS disengagement, and its ability to determine the point at which the LDWS notifies the human machine interface (HMI) or vehicle control system that a lane departure event is detected. Moreover, this
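One quantity a lane departure test procedure can verify is the point at which a warning is issued. A common formulation, sketched below purely for illustration (the function names and the 1.0 s threshold are hypothetical, not taken from this Recommended Practice), is time-to-line-crossing (TLC): the lateral distance to the lane line divided by the drift rate toward it.

```python
def time_to_line_crossing(lateral_offset_m, lateral_velocity_ms):
    """Seconds until the wheel reaches the lane line, given the current
    lateral offset to the line (m) and drift rate toward it (m/s).
    Returns None when the vehicle is not drifting toward the line."""
    if lateral_velocity_ms <= 0.0:
        return None
    return lateral_offset_m / lateral_velocity_ms

def should_warn(lateral_offset_m, lateral_velocity_ms, tlc_threshold_s=1.0):
    """Trigger the LDWS warning once the projected crossing is imminent."""
    tlc = time_to_line_crossing(lateral_offset_m, lateral_velocity_ms)
    return tlc is not None and tlc <= tlc_threshold_s

# 0.4 m from the line, drifting toward it at 0.5 m/s -> TLC = 0.8 s
print(should_warn(0.4, 0.5))  # True
```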
The rapid evolution of new technologies in the automotive sector is driving the demand for advanced simulation solutions, enabling faster software development cycles. Developers often encounter challenges in managing the vast amounts of data generated during testing. For example, a single Advanced Driver Assistance System (ADAS) test vehicle can produce several terabytes of data daily. Efficiently handling and distributing this data across multiple locations can introduce delays in the development process. Moreover, the large volume of test cases required for simulation and validation further exacerbates these delays. On-premises simulation setups, especially those dependent on High-Performance Computing (HPC) systems, pose several challenges, including limited computational resources, scalability issues, high capital and maintenance costs, resource management inefficiencies, and compatibility problems between GPU drivers and servers, all of which can impact both performance and costs
The off-highway industry is seeing rapid growth in the integration of new technologies such as advanced driver assistance systems (ADAS/ADS) and vehicle connectivity, driven primarily by the need to provide a safe operational domain for operators and other people. Achieving full perception of the vehicle's surroundings can be challenging due to the unstructured nature of the field of operation. This research proposes a novel collective perception system that combines a C-V2X Roadside Unit (RSU)-based object detection system with an onboard perception system. The vehicle uses the input from both systems to maneuver the operational field safely. This article also explores implementing a software-defined vehicle (SDV) architecture on an off-highway vehicle, aiming to consolidate the ADAS hardware and enable over-the-air (OTA) software update capability. Test results showed that FEV's collective perception system was able to provide the necessary nearby and non-line
Exactly when sensor fusion occurs in ADAS operations, late or early, impacts the entire system. Governments have been studying Advanced Driver Assistance Systems (ADAS) since at least the late 1980s. Europe's Generic Intelligent Driver Support initiative ran from 1989 to 1992 and aimed “to determine the requirements and design standards for a class of intelligent driver support systems which will conform with the information requirements and performance capabilities of the individual drivers.” Automakers have spent the past 30 years rolling out such systems to the buying public. Toyota and Mitsubishi started offering radar-based cruise control to Japanese drivers in the mid-1990s. Mercedes-Benz took the technology global with its Distronic adaptive cruise control in the 1998 S-Class. Cadillac followed that two years later with FLIR-based night vision on the 2000 Deville DTS. And in 2003, Toyota launched an automated parallel parking technology called Intelligent Parking Assist on the
Sensata Technologies' booth at this year's IAA Transportation tradeshow included two of the company's Precor radar sensors. The PreView STA79 is a heavy-duty vehicle side-monitoring system launched in May 2024 and designed to comply with Europe-wide blind spot monitoring legislation introduced in June 2024. The PreView Sentry 79 is a front- and rear-monitoring system. Both systems operate on the 79-GHz band as the nomenclature suggests. PreView STA79 can cover up to three vehicle zones: a configurable center zone, which can monitor the length of the vehicle, and two further zones that can be independently set to align with individual customer needs. The system offers a 180-degree field of view to eliminate blind spots along the vehicle sides and a built-in measurement unit that will increase the alert level when turning toward an object even when the turn indicator is not used. The system also features trailer mitigation to reduce false positive alerts on the trailer when turning. The
You've got regulations, cost and personal preferences all getting in the way of the next generation of automated vehicles. Oh, and those pesky legal issues about who's at fault should something happen. Under all these big issues lie the many small sensors that today's AVs and ADAS packages require. This big/small world is one topic we're investigating in this issue. I won't pretend I know exactly which combination of cameras and radar and lidar sensors works best for a given AV, or whether thermal cameras and new point cloud technologies should be part of the mix. But the world is clearly ready to spend a lot of money figuring these problems out.
To round out this issue's cover story, we spoke with Clement Nouvel, Valeo's chief technical officer for lidar, about Valeo's background in ADAS and what's coming next. Nouvel leads over 300 lidar engineers and the company's third-generation Scala 3 lidar is used on production vehicles from European and Asian automakers. The Scala 3 sensor system scans the area around a vehicle 25 times per second, can detect objects more than 200 meters (656 ft) away with a wide field of vision and operates at speeds of up to 130 km/h (81 mph) on the highway. In 2023, Valeo secured two contracts for Scala 3, one with an Asian manufacturer and the other with a “leading American robotaxi company,” Valeo said in its most-recent annual report. Valeo has now received over 1 billion euros (just under $1.1 billion) in Scala 3 orders. Also in 2023, Valeo and Qualcomm agreed to jointly supply connected displays, clusters, driving assistance technologies and, importantly, sensor technology for two- and three
iMotions employs neuroscience and AI-powered analysis tools to enhance the tracking, assessment and design of human-machine interfaces inside vehicles. The advancement of vehicles with enhanced safety and infotainment features has made evaluating human-machine interfaces (HMI) in modern commercial and industrial vehicles crucial. Drivers face a steep learning curve due to the complexities of these new technologies. Additionally, the interaction with advanced driver-assistance systems (ADAS) increases concerns about cognitive impact and driver distraction in both passenger and commercial vehicles. As vehicles incorporate more automation, many clients are turning to biosensor technology to monitor drivers' attention and the effects of various systems and interfaces. Utilizing neuroscientific principles and AI, data from eye-tracking, facial expressions and heart rate are informing more effective system and interface design strategies. This approach ensures that automation advancements
North America's first electric, fully integrated custom cab and chassis refuse collection vehicle - slated for initial customer deliveries in mid-2024 - is equipped with a standard advanced driver-assistance system (ADAS). “A typical garbage truck uses commercial off-the-shelf active safety technologies, but the electrified McNeilus Volterra ZSL was purpose-built with active safety technologies to serve our refuse collection customer,” said Brendan Chan, chief engineer for autonomy and active safety at Oshkosh Corporation, McNeilus' parent company. “We wanted to make the safest and best refuse collection truck out there. And by using cloud-based simulation, we could accelerate the development of ADAS and other technologies,” Chan said in an interview with Truck & Off-Highway Engineering during the 2024 dSPACE User Conference in Plymouth, Michigan.
Traditional autonomous vehicle perception subsystems that use onboard sensors have the drawbacks of high computational load and data duplication. Infrastructure-based sensors, which can provide high quality information without the computational burden and data duplication, are an alternative to traditional autonomous vehicle perception subsystems. However, these technologies are still in the early stages of development and have not been extensively evaluated for lane detection system performance. Therefore, there is a lack of quantitative data on their performance relative to traditional perception methods, especially during hazardous scenarios, such as lane line occlusion, sensor failure, and environmental obstructions. We address this need by evaluating the influence of hazards on the resilience of three different lane detection methods in simulation: (1) traditional camera detection using a U-Net algorithm, (2) radar detections using infrastructure-based radar retro-reflectors (RRs
The current approach to developing new Advanced Driver Assistance System (ADAS) and Connected and Automated Driving (CAD) functions involves a significant amount of public road testing, which is inefficient because of the number of miles that must be driven for rare and extreme events to occur, very costly as a result, and unsafe because the rest of the road users become involuntary test subjects. A new method for safe, efficient, and repeatable development, demonstration and evaluation of ADAS and CAD functions, called Vehicle-in-Virtual-Environment (VVE), was recently introduced as a solution to this problem. During VVE, the vehicle is operated in a large, empty, and flat area while its localization and perception sensor data are fed from the virtual environment, with other traffic and rare and extreme events generated as needed. The virtual environment can be easily configured and modified to construct different testing scenarios on
This paper has been withdrawn by the publisher due to non-attendance and non-presentation at WCX 2024.
Kognic's advanced interpretation of sensor data helps artificial intelligence and machine learning recognize the human thing to do. In December 2023, Kognic, the Gothenburg, Sweden-based developer of a software platform to analyze and optimize the massively complex datasets behind ADAS and automated-driving systems, was in Dearborn, Michigan to accept the Tech.AD USA award for Sensor Perception solution of the year. The company doesn't make sensors, but one might say it makes sense of the data that comes from sensors. Kognic, established in 2018, is well-known in the ADAS/AV software sector for its work to help developers extract better performance from and enhance the robustness of safety-critical “ground-truth” information gleaned from petabytes-upon-petabytes of sensor-fusion datasets. Kognic CEO and co-founder Daniel Langkilde espoused a path for improving artificial intelligence-reliant systems based on “programming with data instead of programming with code.”
India is one of the largest markets in the automobile sector, and given the trends in road fatalities and injuries related to road accidents, it is pertinent to continuously review the safety regulations and introduce standards that promise enhanced safety. With this objective, various Advanced Driver Assistance System (ADAS) regulations are proposed to be introduced in the Indian market. ADAS features such as Anti-lock Braking Systems, Advanced Emergency Braking Systems, Lane Departure Warning Systems, Auto Lane Correction Systems, and Driver Drowsiness Monitoring Systems assist the driver during driving. They tend to reduce road accidents and related fatalities through their advanced, artificial-intelligence-driven programs. This paper shares insight into past and recent trends and upcoming developments in the safety regulation domain.
Autonomous Emergency Braking (AEB) systems play a critical role in ensuring vehicle safety by detecting potential rear-end collisions and automatically applying brakes to mitigate or prevent accidents. This paper focuses on establishing a framework for the Verification & Validation (V&V) of Advanced Driver Assistance Systems (ADAS) by testing and verifying the functionality of a RADAR-based AEB ECU. A comprehensive V&V approach was adopted, incorporating both virtual and physical testing. For virtual testing, a closed-loop Hardware-in-Loop (HIL) simulation technique was employed. The AEB ECU was interfaced with the real-time hardware via CAN. Data for the relevant target, such as the target position, velocity, etc., was calculated using an ideal RADAR sensor model running on the real-time hardware. The methodology involved conducting a series of test scenarios, including various driving speeds, obstacle types, and braking distances. Automation was leveraged to perform automated testing and
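The kind of decision logic an AEB ECU implements from radar target data can be sketched with a simple time-to-collision (TTC) check. This is a generic illustration only; the function names and the warning/braking thresholds are hypothetical and not taken from the ECU under test.

```python
def time_to_collision(range_m, closing_speed_ms):
    """TTC from radar range and closing speed (positive when closing)."""
    if closing_speed_ms <= 0.0:
        return float("inf")  # not closing: no collision predicted
    return range_m / closing_speed_ms

def aeb_command(range_m, ego_speed_ms, target_speed_ms,
                warn_ttc_s=2.6, brake_ttc_s=1.4):
    """Staged AEB response: warn first, then brake as TTC shrinks."""
    closing = ego_speed_ms - target_speed_ms
    ttc = time_to_collision(range_m, closing)
    if ttc <= brake_ttc_s:
        return "FULL_BRAKE"
    if ttc <= warn_ttc_s:
        return "WARN"
    return "NONE"

# Ego at 20 m/s approaching a stopped target 30 m ahead: TTC = 1.5 s
print(aeb_command(30.0, 20.0, 0.0))  # WARN
```

In a HIL setup such as the one described, an ideal radar sensor model would supply the range and relative-velocity inputs over CAN, and automated test scenarios would sweep speeds and distances to check that the ECU's warning and braking points fall where expected.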