Browse Topic: Automation
This document describes machine-to-machine (M2M) communication to enable cooperation between two or more traffic participants or CDA devices hosted or controlled by said traffic participants. The cooperation supports or enables performance of the dynamic driving task (DDT) for a subject vehicle equipped with an engaged driving automation system feature and a CDA device. Other participants may include other vehicles with driving automation feature(s) engaged, shared road users (e.g., drivers of conventional vehicles, or pedestrians or cyclists carrying compatible personal devices), or compatible road operator devices (e.g., those used by personnel who maintain or operate traffic signals or work zones). Cooperative driving automation (CDA) aims to improve the safety and flow of traffic and/or facilitate road operations by supporting the safer and more efficient movement of multiple vehicles in proximity to one another. This is accomplished, for example, by sharing information that can be…
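As a rough illustration of the kind of status-and-intent information two CDA devices might exchange over M2M, consider the following sketch. The field names, units, and message shape here are assumptions for illustration only, not the actual SAE message formats.

```python
from dataclasses import dataclass

# Hypothetical cooperation message between two CDA devices.
# Fields and units are illustrative assumptions, not a standardized format.
@dataclass
class CooperationMessage:
    sender_id: str      # identifier of the transmitting participant
    lat_deg: float      # latitude, degrees
    lon_deg: float      # longitude, degrees
    speed_mps: float    # current speed, meters per second
    heading_deg: float  # heading, degrees clockwise from north
    intent: str         # e.g., "lane_change_left", "maintain_lane"

# A subject vehicle announcing an intended lane change to nearby participants
msg = CooperationMessage("veh-42", 40.7128, -74.0060, 13.4, 90.0,
                         "lane_change_left")
print(msg.intent)  # → lane_change_left
```

In practice such information would be encoded in standardized over-the-air messages; the dataclass simply makes concrete what "sharing information" between traffic participants can look like.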
The rapid development of open-source Automated Driving System (ADS) stacks has created a pressing need for clear guidance on their evaluation and selection for specific use cases. This paper introduces a scenario-based evaluation framework combined with a modular simulation framework, offering a scalable methodology for assessing and benchmarking ADS solutions, including but not limited to off-the-shelf designs. The study highlights the lack of clear Operational Design Domain (ODD) descriptions in such systems. Without a common understanding, users must rely on subjective assumptions, which hinders the process of accurate system selection. To address this gap, the study proposes adopting a standardised ISO 34503 ODD description format within the ADS stacks. The application of the proposed framework is showcased through a case study evaluating two open-source systems, Autoware and Apollo. By first defining the assumed system’s ODD, then selecting a relevant scenario, and establishing…
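The selection step the abstract describes, checking whether a candidate scenario falls inside a system's declared ODD, can be sketched as a simple containment test. The dict-based format and attribute names below are illustrative assumptions, not the actual ISO 34503 schema or Autoware/Apollo metadata.

```python
# Hedged sketch: matching a test scenario against a declared ODD.
# The attribute names and dict-based format are illustrative assumptions,
# not the ISO 34503 schema.

def scenario_within_odd(odd: dict, scenario: dict) -> bool:
    """Return True if every scenario attribute is allowed by the ODD.

    Attributes absent from the ODD are treated as unrestricted.
    Tuples encode numeric (min, max) ranges; sets encode category whitelists.
    """
    for key, value in scenario.items():
        allowed = odd.get(key)
        if allowed is None:
            continue  # ODD places no restriction on this attribute
        if isinstance(allowed, tuple):
            lo, hi = allowed
            if not (lo <= value <= hi):
                return False
        elif value not in allowed:
            return False
    return True

# Illustrative ODD: urban/suburban roads, daylight, at most light rain, <= 60 km/h
odd = {
    "road_type": {"urban", "suburban"},
    "lighting": {"day"},
    "precipitation": {"none", "light_rain"},
    "speed_kph": (0, 60),
}

print(scenario_within_odd(odd, {"road_type": "urban", "lighting": "day",
                                "precipitation": "none", "speed_kph": 45}))  # → True
print(scenario_within_odd(odd, {"road_type": "highway", "speed_kph": 100}))  # → False
```

A machine-readable ODD in a standardized format would let this kind of check replace the subjective assumptions the paper identifies as a barrier to accurate system selection.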
Safety Management Systems (SMSs) have been used in many safety-critical industries and are now being developed and deployed in the automated driving system (ADS)-equipped vehicle (AV) sector. Industries with decades of SMS deployment have established frameworks tailored to their specific context. Several frameworks for an AV-industry SMS have been proposed or are currently under development. These frameworks borrow heavily from the aviation industry, although the AV and aviation industries differ in many significant ways. In this context, there is a need to review the approach to developing an SMS that is tailored to the AV industry, building on generalized lessons learned from other safety-sensitive industries. A harmonized AV-industry SMS framework would establish a single set of SMS practices to address management of broad safety risks in an integrated manner and advance the establishment of a more mature regulatory framework. This paper outlines a proposed SMS framework for the AV industry.
As ADAS features progress from SAE Level 0 to SAE Level 5 autonomy, reliance on AI/ML-based algorithms in critical ADAS functions such as perception, fusion, and path planning is increasing rapidly. AI/ML-based algorithms offer exceptional performance for ADAS features, but these advanced algorithms also introduce safety challenges. This paper explores the functional safety aspects of AI/ML-based systems in ADAS functions such as perception, object fusion, and path planning by discussing safety requirements development for AI/ML systems, the dataset safety lifecycle, verification and validation of AI systems, and safety analyses used for AI systems. Among the safety aspects listed above, emphasis is placed on the dataset safety lifecycle, as it is not only the most important element for training ML-based algorithms for ADAS use but also the most cumbersome and expensive. The safety characteristics associated with dataset…
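One small, concrete activity within a dataset safety lifecycle is verifying that every required operating condition is represented in the training data with at least a minimum sample count. The condition names and threshold below are assumptions for illustration, not taken from the paper.

```python
from collections import Counter

# Illustrative sketch (not from the paper): a coverage check from a dataset
# safety lifecycle. Required conditions and the minimum count are assumptions.
REQUIRED_CONDITIONS = {"day", "night", "rain", "fog"}
MIN_SAMPLES = 2

def coverage_gaps(samples, min_count=MIN_SAMPLES):
    """Return the required conditions that are under-represented in samples."""
    counts = Counter(s["condition"] for s in samples)
    return {c for c in REQUIRED_CONDITIONS if counts[c] < min_count}

dataset = [{"condition": c} for c in
           ["day", "day", "night", "night", "rain", "rain", "fog"]]
print(sorted(coverage_gaps(dataset)))  # → ['fog']  (only one fog sample)
```

Checks like this would run alongside label-quality audits and distribution-shift monitoring; a failed check flags data to collect before training, which is one reason the dataset lifecycle dominates cost.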
The rapid development of autonomous vehicles necessitates rigorous testing under diverse environmental conditions to ensure their reliability and safety. One of the most challenging scenarios for both human and machine vision is navigating through rain. This study introduces the Digitrans Rain Testbed, an innovative outdoor rain facility specifically designed to test and evaluate automotive sensors under realistic and controlled rain conditions. The rain plant features a wetted area of 600 square meters and a sprinkled rain volume of 600 cubic meters, providing a comprehensive environment to rigorously assess the performance of autonomous vehicle sensors. Rain poses a significant challenge due to the complex interaction of light with raindrops, leading to phenomena such as scattering, absorption, and reflection, which can severely impair sensor performance. Our facility replicates various rain intensities and conditions, enabling comprehensive testing of Radar, Lidar, and Camera sensors.
Reproducing driving scenarios involving near-collisions and collisions in a simulator can be useful in the development and testing of autonomous vehicles, as it provides a safe environment to explore detailed vehicular behavior during these critical events. CARLA, an open-source driving simulator, has been widely used for reproducing driving scenarios. CARLA allows for both manual control and traffic manager control (the module that controls vehicles in autopilot mode in the simulation). However, current versions of CARLA are limited to setting the start and destination points for vehicles controlled by the traffic manager, and are unable to replay precise waypoint paths collected from real-world collision and near-collision scenarios, because collision-free pathfinding modules are built into the system. This paper presents an extension to CARLA’s source code, enabling the replay of exact vehicle trajectories, irrespective of safety implications.
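The core of such a replay mechanism is interpolating a recorded trajectory to the pose that should be forced onto the simulated actor at each tick (in CARLA, e.g., via `Actor.set_transform`, bypassing the built-in pathfinding). The standalone sketch below shows only that interpolation step under assumed data shapes; it is not the paper's actual extension or CARLA's API.

```python
from bisect import bisect_right

# Hedged sketch of trajectory replay: given waypoints recorded from a
# real-world event as (timestamp, x, y) samples, interpolate the pose to
# apply at an arbitrary simulation time. Data format is an assumption.

def pose_at(trajectory, t):
    """Linearly interpolate (x, y) at simulation time t.

    trajectory: list of (timestamp, x, y) tuples, sorted by timestamp.
    Times outside the recording are clamped to the endpoints.
    """
    times = [p[0] for p in trajectory]
    if t <= times[0]:
        return trajectory[0][1:]
    if t >= times[-1]:
        return trajectory[-1][1:]
    i = bisect_right(times, t)       # first sample strictly after t
    t0, x0, y0 = trajectory[i - 1]
    t1, x1, y1 = trajectory[i]
    a = (t - t0) / (t1 - t0)         # fraction of the way along the segment
    return (x0 + a * (x1 - x0), y0 + a * (y1 - y0))

traj = [(0.0, 0.0, 0.0), (1.0, 10.0, 0.0), (2.0, 10.0, 5.0)]
print(pose_at(traj, 0.5))  # → (5.0, 0.0), midway along the first segment
```

Forcing poses this way reproduces the recorded path exactly, including segments a collision-avoiding planner would refuse to drive, which is precisely the behavior needed to replay collision and near-collision events.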
Over the decades, robotics deployments have been driven by rapid, parallel research advances in sensing, actuation, simulation, algorithmic control, communication, and high-performance computing, among others. Collectively, their integration within a cyber-physical-systems framework has supercharged the increasingly complex realization of the real-time ‘sense-think-act’ robotics paradigm. Successful functioning of modern-day robots relies on seamless integration of increasingly complex systems (coming together at the component, subsystem, system, and system-of-systems levels) as well as their systematic treatment throughout the life cycle (from cradle to grave). As a consequence, ‘dependency management’ among the physical and algorithmic interdependencies of the multiple system elements is crucial for enabling synergistic (or managing adversarial) outcomes. Furthermore, the steep learning curve for customizing the technology for platform-specific deployment discourages domain…
High-efficiency manufacturing involves the transmission of copious amounts of data, exemplified both by trends in the automotive industry and by advances in technology. In the automotive industry, products have grown increasingly complex, owing to multiple SKUs, global supply chains, and the involvement of many Tier 2 / Just-in-Time (JIT) suppliers. On top of that, recalls and incidents in recent years have made it important for OEMs to be able to track down affected vehicles based on their components. All of this has increased the need for OEMs to collect and analyze component data. The advent of Industry 4.0 and IoT has provided manufacturing with the ability to efficiently collect and store large amounts of data, lining up with the needs of manufacturing-based industries. However, while the need to collect data has been met, corporations now find themselves facing the need to make sense of the data to gain the insights they require, and the data is often unstructured.
While some developers of autonomous technology for commercial trucks have stalled out, there's renewed energy to deliver augmented ADAS and automated driving systems to mass production. After a tumultuous 2023 that saw several autonomous trucking startups pivot out of or exit the arena entirely, there has been a recent resurgence of investment and efforts to bring the vision of driverless freight fleets to reality. In the wake of firms like Embark, TuSimple and Waymo scaling back or rolling up operations, Aurora, Continental and Knorr-Bremse have all announced continued development of SAE Level 4 systems with the intention to deploy trucks using these systems at scale. OEMs such as Volvo Trucks have also announced updates to existing technologies that will augment current advanced driver-assistance systems (ADAS) to help human drivers become safer behind the wheel.
This SAE Edge Research Report explores advancements in next-generation mobility, focusing on digitalized and smart cockpits and cabins. It offers a literature review examining current customer experiences with traditional vehicles and future mobility expectations. Key topics include integrating smart cockpit and cabin technologies, addressing challenges in customer and user experience (UX) in digital environments, and discussing strategies for transitioning from traditional vehicles to electric ones while educating customers. User Experience for Digitalized and Smart Cockpits and Cabins of Next-gen Mobility covers both on- and off-vehicle experiences, analyzing complexities in developing and deploying digital products and services with effective user interfaces. Emphasis is placed on meeting UX requirements, gaining user acceptance, and avoiding trust issues due to poor UX. Additionally, the report concludes with suggestions for improving UX in digital products and services for future mobility.