Topic: Driver Assistance Systems
ABSTRACT This paper describes a simulation model for autonomous vehicles operating in highly uncertain environments. Two elements of uncertainty are studied, rain and pedestrian interaction, along with their effects on autonomous mobility. The model consists of all the essential elements of an autonomous vehicle: scene (roads, buildings, etc.), environment (sunlight, rain, snow, etc.), sensors (GPS, camera, radar, lidar, etc.), algorithms (lane detection, pedestrian detection, etc.), control (lane keeping, obstacle avoidance, etc.), vehicle dynamics (mass, drivetrain, tires, etc.), and actuation (throttle, braking, steering, etc.). Using this model, the paper presents results that assess the autonomous mobility of a Polaris GEM E6 type of vehicle in varying amounts of rain and when the vehicle encounters multiple pedestrians crossing in front. Rain was chosen because it affects both situational awareness and trafficability. Mobility is measured by the average speed of the vehicle.
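The subsystem chain the abstract lists (sensors feeding algorithms feeding control, with mobility scored as average speed) can be sketched as a minimal pipeline. This is an illustrative toy, not the paper's model: the rain-degradation factor, ranges, and speed rule are invented assumptions.

```python
from dataclasses import dataclass

# Toy sketch of the abstract's component chain. All constants and the linear
# rain-degradation model are assumptions for illustration only.

@dataclass
class SimState:
    time_s: float
    speed_mps: float
    rain_mm_per_hr: float

def sensor_stage(state: SimState) -> dict:
    # Assumed model: rain linearly shrinks perception range, floored at 20%.
    base_range_m = 200.0
    degraded = base_range_m * max(0.2, 1.0 - 0.02 * state.rain_mm_per_hr)
    return {"perception_range_m": degraded}

def control_stage(state: SimState, percept: dict) -> float:
    # Simple proportional rule: drive slower when perception range shrinks.
    return min(10.0, percept["perception_range_m"] / 20.0)

def average_speed(rain_mm_per_hr: float, steps: int = 100, dt: float = 0.1) -> float:
    # Mobility metric from the abstract: average speed over the run.
    state = SimState(0.0, 0.0, rain_mm_per_hr)
    total = 0.0
    for _ in range(steps):
        percept = sensor_stage(state)
        state.speed_mps = control_stage(state, percept)
        state.time_s += dt
        total += state.speed_mps
    return total / steps
```

Even this toy reproduces the qualitative result the paper studies: heavier rain lowers the achievable average speed.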
ABSTRACT The popularity of Advanced Driver Assistance Systems (ADAS) in the passenger car industry has grown explosively in recent years. Some ADAS that are becoming ubiquitous are Lane Departure Warning (LDW), Blind Spot Detection (BSD), and automatic parking or parking assistance systems. In many cases, such systems were developed specifically to handle the most demanding driving conditions at very high speeds, which typically require very sophisticated software and high-power hardware. In other application areas or geographical regions, however, such sophistication often hinders adoption of the technology. An alternative approach is to use off-the-shelf (OTS) components as much as possible so that similar systems with an appropriate subset of functions can be developed cheaply and quickly. An approach similar to NASA's "PhoneSats" program is discussed in this paper.
ABSTRACT The Army has identified an operational need for a Robotic Convoy capability for its tactical vehicle fleets. The Department of Defense (DoD), with a fleet of several hundred thousand tactical vehicles, must identify an approach, with supporting technology and a supply base, to procure and support a Robotic Convoy solution at the lowest possible cost. While cost is a key driver, the selected system approach must be proven and robust to ensure the safety of our soldiers and the supply chain. An effective approach is to integrate and adapt the advanced automotive technologies, components, and suppliers currently delivering advanced safety technologies into the automotive market. These advanced automotive technologies, merged with DoD robotics enhancements in tactical behaviors, autonomous driving, command and control, and unmanned systems collaboration, will advance the operational utility of robotic convoy applications in manned and unmanned modes.
ABSTRACT To expedite the development of robotic target carriers that can be used to enhance military training, technology developed for passenger-vehicle Automated Driver Assist Systems (ADAS) can be modified. This field uses robotic platforms to carry targets into the path of a moving vehicle for testing ADAS. Platforms built with customization in mind can be modified to resist small-arms fire while carrying a mixture of hostile and friendly pseudo-soldiers during area-clearing and coordinated-attack simulations. By starting with technology already developed to perform path-following and target-carrying operations, the military can further develop training programs and equipment with a small amount of time and investment. Citation: M. Bartholomew, D. Andreatta, P. Muthaiah, N. Helber, G. Heydinger, S. Zagorski, "Bringing Robotic Platforms from Vehicle Testing to Warrior Training," In Proceedings of the Ground Vehicle …
You've got regulations, cost and personal preferences all getting in the way of the next generation of automated vehicles. Oh, and those pesky legal issues about who's at fault should something happen. Under all these big issues lie the many small sensors that today's AVs and ADAS packages require. This big/small world is one topic we're investigating in this issue. I won't pretend I know exactly which combination of cameras and radar and lidar sensors works best for a given AV, or whether thermal cameras and new point cloud technologies should be part of the mix. But the world is clearly ready to spend a lot of money figuring these problems out.
To round out this issue's cover story, we spoke with Clement Nouvel, Valeo's chief technical officer for lidar, about Valeo's background in ADAS and what's coming next. Nouvel leads over 300 lidar engineers, and the company's third-generation Scala 3 lidar is used on production vehicles from European and Asian automakers. The Scala 3 sensor system scans the area around a vehicle 25 times per second, can detect objects more than 200 meters (656 ft) away with a wide field of vision and operates at speeds of up to 130 km/h (81 mph) on the highway. In 2023, Valeo secured two contracts for Scala 3, one with an Asian manufacturer and the other with a "leading American robotaxi company," Valeo said in its most recent annual report. Valeo has now received over 1 billion euros (just under $1.1 billion) in Scala 3 orders. Also in 2023, Valeo and Qualcomm agreed to jointly supply connected displays, clusters, driving assistance technologies and, importantly, sensor technology for two- and three…
North America's first electric, fully integrated custom cab and chassis refuse collection vehicle, slated for initial customer deliveries in mid-2024, is equipped with a standard advanced driver-assistance system (ADAS). "A typical garbage truck uses commercial off-the-shelf active safety technologies, but the electrified McNeilus Volterra ZSL was purpose-built with active safety technologies to serve our refuse collection customer," said Brendan Chan, chief engineer for autonomy and active safety at Oshkosh Corporation, McNeilus' parent company. "We wanted to make the safest and best refuse collection truck out there. And by using cloud-based simulation, we could accelerate the development of ADAS and other technologies," Chan said in an interview with Truck & Off-Highway Engineering during the 2024 dSPACE User Conference in Plymouth, Michigan.
iMotions employs neuroscience and AI-powered analysis tools to enhance the tracking, assessment and design of human-machine interfaces inside vehicles. The advancement of vehicles with enhanced safety and infotainment features has made evaluating human-machine interfaces (HMI) in modern commercial and industrial vehicles crucial. Drivers face a steep learning curve due to the complexities of these new technologies. Additionally, interaction with advanced driver-assistance systems (ADAS) raises concerns about cognitive impact and driver distraction in both passenger and commercial vehicles. As vehicles incorporate more automation, many clients are turning to biosensor technology to monitor drivers' attention and the effects of various systems and interfaces. Utilizing neuroscientific principles and AI, data from eye-tracking, facial expressions and heart rate are informing more effective system and interface design strategies. This approach ensures that automation advancements …
The current approach to developing new Advanced Driver Assistance System (ADAS) and Connected and Automated Driving (CAD) functions involves a significant amount of public road testing. This is inefficient because of the number of miles that must be driven for rare and extreme events to take place, which also makes it very costly, and it is unsafe because the rest of the road users become involuntary test subjects. A new method for safe, efficient, and repeatable development, demonstration, and evaluation of ADAS and CAD functions, called Vehicle-in-Virtual-Environment (VVE), was recently introduced as a solution to this problem. During VVE, the vehicle is operated in a large, empty, and flat area while its localization and perception sensor data are fed from the virtual environment, with other traffic and rare and extreme events generated as needed. The virtual environment can be easily configured and modified to construct different testing scenarios …
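The core VVE idea described above, a real vehicle supplying its own pose while obstacles come from a configurable virtual scene, can be sketched in a few lines. This is a hypothetical illustration of the concept, not the authors' implementation; the function names, the 2D pose, and the fixed sensor range are all assumptions.

```python
# Illustrative sketch of the VVE concept: real localization, virtual perception.
# Names and the simple range-gated sensor model are assumptions.

def virtual_perception(real_pose, virtual_obstacles, sensor_range_m=100.0):
    """Return the virtual obstacles 'visible' from the vehicle's real pose.

    real_pose: (x, y) position of the physical vehicle in the shared frame.
    virtual_obstacles: list of (x, y) positions injected by the test scenario.
    """
    x, y = real_pose
    visible = []
    for ox, oy in virtual_obstacles:
        d = ((ox - x) ** 2 + (oy - y) ** 2) ** 0.5
        if d <= sensor_range_m:
            visible.append(((ox, oy), d))
    return visible
```

A scenario is then just a different obstacle list: a rare event (say, a pedestrian cutting in) is added to `virtual_obstacles` without any physical risk, which is the repeatability argument the abstract makes.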
Traditional autonomous vehicle perception subsystems that use onboard sensors have the drawbacks of high computational load and data duplication. Infrastructure-based sensors, which can provide high-quality information without the computational burden and data duplication, are an alternative to traditional autonomous vehicle perception subsystems. However, these technologies are still in the early stages of development and have not been extensively evaluated for lane detection performance. Therefore, there is a lack of quantitative data on their performance relative to traditional perception methods, especially during hazardous scenarios such as lane line occlusion, sensor failure, and environmental obstructions. We address this need by evaluating the influence of hazards on the resilience of three different lane detection methods in simulation: (1) traditional camera detection using a U-Net algorithm, (2) radar detection using infrastructure-based radar retro-reflectors (RRs) …
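Evaluations like the one described above need a common resilience metric across detection methods. A standard choice, shown here as a hedged sketch rather than the paper's actual protocol, is intersection-over-union between a predicted lane mask and ground truth, with the hazard (here, occlusion) applied synthetically.

```python
import numpy as np

# Illustrative evaluation sketch (not the paper's code): score lane detection
# under a hazard with intersection-over-union (IoU) on binary lane masks.

def lane_iou(pred: np.ndarray, truth: np.ndarray) -> float:
    """IoU between two binary lane masks; 1.0 means perfect agreement."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: vacuously perfect
    return float(np.logical_and(pred, truth).sum() / union)

def occlude(mask: np.ndarray, frac: float) -> np.ndarray:
    """Simulate lane-line occlusion by zeroing the leading fraction of rows."""
    out = mask.copy()
    out[: int(len(out) * frac)] = 0
    return out
```

Running each method's output through the same `lane_iou` before and after `occlude` gives the kind of quantitative hazard-resilience comparison the abstract says is missing.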
This paper has been withdrawn by the publisher because of non-attendance and not presenting at WCX 2024.
Kognic's advanced interpretation of sensor data helps artificial intelligence and machine learning recognize the human thing to do. In December 2023, Kognic, the Gothenburg, Sweden-based developer of a software platform to analyze and optimize the massively complex datasets behind ADAS and automated-driving systems, was in Dearborn, Michigan to accept the Tech.AD USA award for Sensor Perception solution of the year. The company doesn't make sensors, but one might say it makes sense of the data that comes from sensors. Kognic, established in 2018, is well known in the ADAS/AV software sector for its work helping developers extract better performance from, and enhance the robustness of, safety-critical "ground-truth" information gleaned from petabytes upon petabytes of sensor-fusion datasets. Kognic CEO and co-founder Daniel Langkilde espoused a path for improving artificial intelligence-reliant systems based on "programming with data instead of programming with code."
India is one of the largest markets for the automobile sector, and considering the trends in road fatalities and injuries related to road accidents, it is pertinent to continuously review safety regulations and introduce standards that promise enhanced safety. With this objective, various Advanced Driver Assistance Systems (ADAS) regulations are proposed to be introduced in the Indian market. ADAS such as Anti-lock Braking Systems, Advanced Emergency Braking Systems, Lane Departure Warning Systems, Auto Lane Correction Systems, and Driver Drowsiness Monitoring Systems assist the driver during driving. They tend to reduce road accidents and related fatalities through their advanced, artificial-intelligence-fed programs. This paper shares insight into past and recent trends and upcoming developments in the regulatory domain with respect to safety.
Autonomous Emergency Braking (AEB) systems play a critical role in ensuring vehicle safety by detecting potential rear-end collisions and automatically applying the brakes to mitigate or prevent accidents. This paper focuses on establishing a framework for the Verification and Validation (V&V) of Advanced Driver Assistance Systems (ADAS) by testing and verifying the functionality of a radar-based AEB ECU. A comprehensive V&V approach was adopted, incorporating both virtual and physical testing. For virtual testing, a closed-loop Hardware-in-the-Loop (HIL) simulation technique was employed. The AEB ECU was interfaced with the real-time hardware via CAN. Data for the relevant target, such as the target position and velocity, was calculated using an ideal radar sensor model running on the real-time hardware. The methodology involved conducting a series of test scenarios, including various driving speeds, obstacle types, and braking distances. Automation was leveraged to perform automated testing …
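The HIL setup described above pairs an ideal radar model with CAN traffic to the AEB ECU. The sketch below shows the shape of such a test harness under stated assumptions: the frame layout, scaling factors, and the time-to-collision trigger are invented for illustration and are not the paper's ECU logic or any real DBC definition.

```python
import struct

# Hedged sketch of an HIL-style AEB test path. Frame layout and thresholds
# are illustrative assumptions, not a real ECU specification.

def ideal_radar(ego_pos_m, ego_speed_mps, tgt_pos_m, tgt_speed_mps):
    """Ideal sensor model: exact range and relative speed (negative = closing)."""
    rng = tgt_pos_m - ego_pos_m
    rel_v = tgt_speed_mps - ego_speed_mps
    return rng, rel_v

def pack_target_frame(rng_m, rel_v_mps):
    """Pack a CAN-style payload: u16 range in cm, s16 relative speed in cm/s."""
    return struct.pack(">Hh", int(rng_m * 100), int(rel_v_mps * 100))

def should_brake(rng_m, rel_v_mps, ttc_threshold_s=1.5):
    """Simple time-to-collision trigger standing in for the ECU under test."""
    if rel_v_mps >= 0:
        return False  # not closing on the target
    return rng_m / -rel_v_mps < ttc_threshold_s
```

Sweeping `ego_speed_mps` and initial target positions over many scenarios, as the abstract describes, is then a loop over these calls, which is what makes the automation straightforward.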
Advanced Autonomous Vehicle (AV) functions at SAE Level 3 and Level 4 will lead to a new understanding of the operation phase in the overall product lifecycle. Regulations such as the EU Implementing Act and the German L4 Act (AFGBV) require continuous field surveillance, the handling of critical E/E faults, and software updates during operation. This is required to extend the Operational Design Domain (ODD) during operation, to offer Functions on Demand (FoD) by increasing software features within these autonomous vehicle systems over the entire digital product lifecycle, and to avoid and reduce downtime caused by malfunctions of the Autonomous Driving (AD) software stack. Supported by implemented, effective management systems for Cyber Security (R155), Software Update Management (R156), and Safety (SMS, in compliance with the Automated Lane Keeping System (ALKS) regulation, R157), organizations have to ensure safe and secure development, deployment, and operation …
The power of advanced driver assistance systems (ADAS) continues to increase alongside vehicle code and software complexity. To ensure ADAS functionality and maximize safety, cost efficiency, and customer satisfaction, original equipment manufacturers (OEMs) must adopt a solution that allows them to mine data, extract meaningful information, send remote software updates and bug fixes, and manage software complexity. All of this is possible with an embedded telematics-based software and data management solution. Event-based logging enables OEMs to actively measure ADAS effectiveness and performance. It allows them to analyze driver behaviors, such as whether response times increase after a certain time of day, and adjust the ADAS settings to increase functionality, such as providing an earlier warning or automated response. A vertically integrated solution also enables the identification and correction of software and calibration defects for the entire vehicle life cycle through over …
Toyota's luxury arm concurrently introduced the all-new, three-row 2024 Lexus TX and the long-awaited redesign of the rugged Lexus GX, also a '24 model. Both were met with enthusiasm at a reveal in Austin, Texas, over what Lexus is calling the new "unified spindle," an evolution of the spindle grille that has been divisive since it appeared on the 2012 GS sedan. In a nifty trick, engineers have figured out how to include ADAS sensors in the grille without having asymmetrical blocks interrupt the bars. Dealers and more mainstream customers will be most interested in the TX, as Lexus Group Vice President Dejuan Ross said buyers have been clamoring for a new three-row SUV. And there's good reason: 70% of all full-size SUVs sold in America have a third row. For midsize SUVs, the number jumped from 6% to 10% from 2016 to 2022, according to J.D. Power.