Browse Topic: Autonomous vehicles
The rapid development of autonomous vehicles necessitates rigorous testing under diverse environmental conditions to ensure their reliability and safety. One of the most challenging scenarios for both human and machine vision is navigating through rain. This study introduces the Digitrans Rain Testbed, an innovative outdoor rain facility specifically designed to test and evaluate automotive sensors under realistic and controlled rain conditions. The rain plant features a wetted area of 600 square meters and a sprinkled rain volume of 600 cubic meters, providing an environment in which the performance of autonomous vehicle sensors can be rigorously assessed. Rain poses a significant challenge because of the complex interaction of light with raindrops, which leads to phenomena such as scattering, absorption, and reflection that can severely impair sensor performance. Our facility replicates various rain intensities and conditions, enabling comprehensive testing of radar, lidar, and camera
Light Detection and Ranging (LiDAR) is a promising sensor type for autonomous driving that uses laser pulses to perceive obstacles in the vehicle's path and measure their distance accurately. In recent years, LiDARs have increasingly been deployed in modern and autonomous vehicles to support self-driving features. However, navigating adverse weather remains one of the biggest challenges in achieving Level 5 full autonomy, because sensor soiling leads to performance degradation that can pose safety hazards. When driving in rain, raindrops impact the LiDAR sensor assembly and attenuate the signal as the light beams undergo reflection and refraction. Consequently, signal detectability, accuracy, and intensity are significantly affected. To date, only a limited number of studies have performed objective evaluations of LiDAR performance, and most faced limitations that hindered realistic, controllable, and repeatable testing. Therefore, this
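A rough, illustrative way to reason about this rain-induced attenuation (not taken from the study) is a Beer-Lambert-style model in which the rain rate sets an extinction coefficient and the lidar return decays exponentially over the two-way path; the power-law constants below are placeholder assumptions, not fitted values.

    import math

    def rain_extinction_coefficient(rain_rate_mm_h):
        """Illustrative power-law fit alpha = a * R^b, in 1/km.
        The constants a and b are placeholder assumptions, not values from
        the study; published fits for near-infrared lidar vary."""
        a, b = 0.1, 0.6  # assumed coefficients
        return a * rain_rate_mm_h ** b

    def received_power_fraction(rain_rate_mm_h, target_range_m):
        """Fraction of the clear-weather return power that survives a
        two-way path through uniform rain (simple Beer-Lambert model)."""
        alpha_per_km = rain_extinction_coefficient(rain_rate_mm_h)
        two_way_path_km = 2.0 * target_range_m / 1000.0
        return math.exp(-alpha_per_km * two_way_path_km)

    # Example: return from a target at 100 m in 25 mm/h rain.
    print(received_power_fraction(25.0, 100.0))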
As automotive technology advances, comprehensive environmental awareness becomes increasingly critical for vehicle safety and efficiency. This study introduces a novel integrated wind, weather, and motion sensor designed for moving objects, with a focus on automotive applications. The sensor's potential to enhance vehicle performance by providing real-time data on local atmospheric conditions is investigated. The research combines sensor design, vehicle integration, and field-testing methodologies. Findings demonstrate the sensor's capability to accurately capture dynamic environmental parameters, including wind speed and direction, temperature, and humidity. The integration of this sensor system shows promise in improving vehicle stability, optimizing fuel efficiency through adaptive aerodynamics, and enhancing the performance of autonomous driving systems. Furthermore, the study explores the potential of this technology in contributing to connected vehicle
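As a concrete illustration of one computation such a combined wind-and-motion sensor enables (a minimal sketch under assumed conventions, not a description of the study's processing), the ground-relative wind can be recovered by adding the vehicle's velocity vector to the apparent wind measured on the moving vehicle:

    import math

    def true_wind(apparent_speed_ms, apparent_dir_deg,
                  vehicle_speed_ms, vehicle_heading_deg):
        """Recover the ground-relative (true) wind from the apparent wind
        measured on a moving vehicle: true wind = apparent wind + vehicle velocity.
        Assumed conventions (for illustration only): wind direction is where the
        wind blows FROM, heading is the direction of travel, both in degrees
        clockwise from north, and the measurement is already in the ground frame."""
        # Air-motion vector of the apparent wind (points where the air goes).
        ax = -apparent_speed_ms * math.sin(math.radians(apparent_dir_deg))
        ay = -apparent_speed_ms * math.cos(math.radians(apparent_dir_deg))
        # Vehicle velocity components.
        vx = vehicle_speed_ms * math.sin(math.radians(vehicle_heading_deg))
        vy = vehicle_speed_ms * math.cos(math.radians(vehicle_heading_deg))
        # True wind vector = apparent (relative) wind + vehicle velocity.
        wx, wy = ax + vx, ay + vy
        speed = math.hypot(wx, wy)
        direction_from = math.degrees(math.atan2(-wx, -wy)) % 360.0
        return speed, direction_from

    # Example: 10 m/s apparent head wind while driving north at 25 m/s
    # yields a 15 m/s true wind from the south (a tailwind).
    print(true_wind(10.0, 0.0, 25.0, 0.0))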
With the surge in adoption of artificial intelligence (AI) in automotive systems, especially Advanced Driver Assistance Systems (ADAS) and autonomous vehicles (AV), comes an increase in AI-related incidents, several of which have ended in injuries and fatalities. These incidents all share a common deficiency: insufficient coverage of safety, ethical, and/or legal requirements. Responsible AI (RAI) is an approach to developing AI-enabled systems in which such requirements are systematically taken into account. Existing published international standards such as ISO 21448:2022 (Safety of the Intended Functionality) and ISO 26262:2018 (Road Vehicles – Functional Safety) offer some guidance in this regard but are far from sufficient. Therefore, several technical standards are emerging concurrently to address various RAI-related challenges, including but not limited to ISO 8800 for the integration of AI in automotive systems and ISO/IEC TR 5469:2024 for the integration of AI in functional
Reproducing driving scenarios involving near-collisions and collisions in a simulator can be useful in the development and testing of autonomous vehicles, as it provides a safe environment for exploring detailed vehicular behavior during these critical events. CARLA, an open-source driving simulator, has been widely used for reproducing driving scenarios. CARLA supports both manual control and control by its traffic manager (the module that drives vehicles in an autopilot manner within the simulation). However, current versions of CARLA only allow setting start and destination points for vehicles controlled by the traffic manager; they cannot replay the precise waypoint paths collected from real-world collision and near-collision scenarios, because collision-free pathfinding modules are built into the system. This paper presents an extension to CARLA's source code, enabling the replay of exact vehicle trajectories, irrespective of safety implications
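The paper's contribution is an extension to CARLA's source code; as a separate, minimal sketch of the underlying idea (not the authors' implementation), a recorded trajectory can be forced on an actor through the standard CARLA Python API by setting its transform every synchronous tick, bypassing the traffic manager's collision-free pathfinding. The waypoint list and step size below are placeholders.

    import carla

    # Assumed input: waypoints recorded from a real-world scenario as
    # (x, y, z, yaw_deg) tuples, sampled at the simulation step.
    recorded_path = [(10.0, 5.0, 0.3, 90.0), (10.0, 5.5, 0.3, 90.0)]  # placeholder data

    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)
    world = client.get_world()

    # Run the simulator in synchronous mode so each waypoint maps to one tick.
    settings = world.get_settings()
    settings.synchronous_mode = True
    settings.fixed_delta_seconds = 0.05
    world.apply_settings(settings)

    blueprint = world.get_blueprint_library().filter("vehicle.*")[0]
    x, y, z, yaw = recorded_path[0]
    spawn = carla.Transform(carla.Location(x, y, z), carla.Rotation(yaw=yaw))
    vehicle = world.spawn_actor(blueprint, spawn)

    try:
        # Force the exact recorded pose each tick instead of using autopilot,
        # so the trajectory is reproduced even if it leads to a collision.
        for x, y, z, yaw in recorded_path:
            vehicle.set_transform(
                carla.Transform(carla.Location(x, y, z), carla.Rotation(yaw=yaw)))
            world.tick()
    finally:
        vehicle.destroy()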
Artificial Intelligence has gained significant traction and importance in the 21st century, with use cases ranging from speech recognition, learning, planning, and problem solving to search engines. Artificial Intelligence has also played a key role in the development of autonomous vehicles and robots, spanning perception, localization, decision-making, and control. Within the broad AI umbrella lies machine learning, which is concerned with enabling a computer to "learn" how to deal with problems without being explicitly programmed. Deep learning is a branch of machine learning based on a set of algorithms that learn to represent data directly from inputs such as images, text, sound, etc. Within deep learning there are Convolutional Neural Networks and Recurrent Neural Networks (CNNs/RNNs). The study here used a convolutional neural network approach to perform image/object recognition. Given that the objective of an autonomous or semi-autonomous vehicle is to promote safety and reduce the number of accidents, it
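For context, a minimal convolutional classifier of the kind described (a generic PyTorch sketch with arbitrary input size and class count, not the architecture used in this study) could look like this:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SmallCNN(nn.Module):
        """Minimal convolutional classifier: two conv/pool stages followed by a
        fully connected head. The (3, 64, 64) input size and 10 output classes
        are arbitrary placeholder choices, not values from the study."""

        def __init__(self, num_classes=10):
            super().__init__()
            self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
            self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)
            self.pool = nn.MaxPool2d(2)                  # halves spatial size
            self.fc1 = nn.Linear(32 * 16 * 16, 128)
            self.fc2 = nn.Linear(128, num_classes)

        def forward(self, x):
            x = self.pool(F.relu(self.conv1(x)))         # (B, 16, 32, 32)
            x = self.pool(F.relu(self.conv2(x)))         # (B, 32, 16, 16)
            x = torch.flatten(x, 1)
            x = F.relu(self.fc1(x))
            return self.fc2(x)                           # class logits

    # Example: a batch of four 64x64 RGB images.
    logits = SmallCNN()(torch.randn(4, 3, 64, 64))
    print(logits.shape)  # torch.Size([4, 10])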