Browse Topic: Advanced driver assistance systems (ADAS)
ABSTRACT In any active safety system, it is desirable to measure “performance.” For the estimation case, a cost function such as Mean-Square Error is generally used; for detection cases, the combination of Probability of Detection and Probability of False Alarm is used. The scenarios that truly exercise these performance measures involve complex, dangerous and costly driving situations; they are difficult to recreate and have a low probability of actually being acquired. Using a virtual tool, we can produce the trials necessary to adequately determine the performance of active safety algorithms and systems. In this paper, we outline the problem of measuring the performance of active safety algorithms or systems. We then discuss the approach of using complex scenario design and Monte Carlo techniques to determine performance, followed by a brief discussion of Prescan and how it can help in this endeavor. Finally, two Monte Carlo-type examples for particular active safety systems are presented.
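As a hedged illustration of the Monte Carlo idea described in this abstract, the sketch below estimates Probability of Detection and Probability of False Alarm for a hypothetical threshold detector by running many simulated trials; the detector model, signal level, noise, and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

def run_trial(rng, target_present, signal=2.0, noise_sigma=1.0, threshold=1.5):
    """One simulated trial of a hypothetical threshold detector.
    Returns True if the detector declares a detection."""
    measurement = (signal if target_present else 0.0) + rng.normal(0.0, noise_sigma)
    return measurement > threshold

def monte_carlo_pd_pfa(n_trials=100_000, seed=0):
    """Estimate Pd and Pfa by averaging over repeated random trials."""
    rng = np.random.default_rng(seed)
    detections = sum(run_trial(rng, True) for _ in range(n_trials))
    false_alarms = sum(run_trial(rng, False) for _ in range(n_trials))
    return detections / n_trials, false_alarms / n_trials

if __name__ == "__main__":
    pd, pfa = monte_carlo_pd_pfa()
    print(f"Estimated Pd = {pd:.3f}, Pfa = {pfa:.3f}")
```

The same trial-averaging pattern applies to estimation metrics: replace the detection decision with a squared estimation error per trial and average the results to approximate the Mean-Square Error.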
ABSTRACT This paper describes the demonstrations that the Autonomous Mobility Appliqué System (AMAS) program has completed to date. It first provides a high-level technical overview of the system to explain how the system and its subsystems work, then describes the demonstrations and summarizes their results.
ABSTRACT The popularity of Advanced Driver Assistance Systems (ADAS) in the passenger car industry has seen explosive growth in recent years. Some ADAS features that are becoming ubiquitous are Lane Departure Warning (LDW), Blind Spot Detection (BSD) and automatic parking or parking assistance systems. In many cases, such systems have been developed specifically to handle the most demanding driving conditions at very high speeds, which typically requires very sophisticated software and high-power hardware. However, in other application areas or geographic regions, such sophistication often hinders adoption of the technology. An alternate approach is to use off-the-shelf (OTS) components as much as possible so that similar systems with an appropriate subset of functions can be developed cheaply and quickly. An approach similar to NASA’s “PhoneSats” program is discussed in this paper.
ABSTRACT The transportation industry annually travels more than 6 times as many miles as passenger vehicles [1]. The associated fuel cost represents 38% of the total marginal operating cost for this industry [8]. As a result, the industry's interest in applications of autonomy has grown. One application of this technology is Cooperative Adaptive Cruise Control (CACC) using Dedicated Short-Range Communications (DSRC). Auburn University outfitted four Class 8 vehicles, two Peterbilt 579s and two M915s, with a basic hardware suite and software library to enable level 1 autonomy. These algorithms were tested in controlled environments, such as the American Center for Mobility (ACM), and on public roads, such as Highway 280 in Alabama and Interstates 275/696 in Michigan. This paper reviews the results of these real-world tests and discusses the anomalies and failures that occurred during testing. Citation: Jacob Ward, Patrick Smith, Dan Pierce, David Bevly, Paul Richardson
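For context on what a level 1 CACC controller does, the sketch below shows a common constant time-gap spacing policy with a feedforward term driven by the preceding vehicle's acceleration received over DSRC; the gains, time gap, and signal names are illustrative assumptions and are not taken from the Auburn implementation.

```python
def cacc_acceleration(gap, ego_speed, lead_speed, lead_accel,
                      time_gap=1.2, standstill=5.0, kp=0.45, kd=0.25, kff=0.5):
    """Constant time-gap CACC controller sketch.

    gap        : measured distance to the preceding truck [m]
    ego_speed  : ego vehicle speed [m/s]
    lead_speed : preceding vehicle speed [m/s]
    lead_accel : preceding vehicle acceleration received over DSRC [m/s^2]
    Returns a commanded longitudinal acceleration [m/s^2]."""
    desired_gap = standstill + time_gap * ego_speed   # spacing policy
    spacing_error = gap - desired_gap                 # > 0 means too far back
    speed_error = lead_speed - ego_speed              # closing-rate term
    # Feedback on spacing and relative speed, feedforward on lead acceleration
    return kp * spacing_error + kd * speed_error + kff * lead_accel
```

The DSRC-supplied lead acceleration term is what distinguishes CACC from ordinary adaptive cruise control, which must infer the lead vehicle's behavior from ranging sensors alone.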
ABSTRACT The Army has identified an operational need for a Robotic Convoy capability for its tactical vehicle fleets. The Department of Defense (DoD), with a fleet of several hundred thousand tactical vehicles, must identify an approach, with supporting technology and a supply base, to procure and support a Robotic Convoy solution at the lowest possible cost. While cost is a key driver, the selected system approach must be proven and robust to ensure the safety of our soldiers and the supply chain. An effective approach is to integrate and adapt the advanced automotive technologies, components and suppliers currently delivering advanced safety technologies into the automotive market. These advanced automotive technologies, merged with DoD robotics enhancements in tactical behaviors, autonomous driving, command & control and unmanned systems collaboration, will advance the operational utility of robotic convoy applications in manned and unmanned modes. [Figure 1: Military Application]
ABSTRACT To expedite the development of robotic target carriers that can enhance military training, technology developed for passenger-vehicle Automated Driver Assist Systems (ADAS) can be adapted. This field uses robotic platforms to carry targets into the path of a moving vehicle for ADAS testing. Platforms built with customization in mind can be modified to resist small arms fire while carrying a mixture of hostile and friendly pseudo-soldiers during area-clearing and coordinated-attack simulations. By starting with the technology already developed to perform path following and target carrying, the military can further develop training programs and equipment with a small investment of time and money. Citation: M. Bartholomew, D. Andreatta, P. Muthaiah, N. Helber, G. Heydinger, S. Zagorski, “Bringing Robotic Platforms from Vehicle Testing to Warrior Training,” In Proceedings of the Ground Vehicle
At the InCabin USA vehicle technology expo in Detroit, Ford customer research lead Susan Shaw said that the sea of letters around ADAS features, and the control and indicator icons that vary between vehicles, are often confusing to drivers. Shaw pointed out that the following all represent features related to driving lanes: LDW, LKA, LKS, LFA, LCA. These initialisms (abbreviations spoken letter by letter) are not the only way the industry refers to these technologies, as some OEMs have their own names for similar things. It all contributes to what can be dangerous assumptions on the part of a driver. “It's shocking how many people think their vehicle will apply the brakes in an emergency, when the car has no such system,” she said. As an overview of control and indicator iconography, Shaw began with an introduction to user-experience research by way of a classic example: the “Norman door,” named for Don Norman, author of “The Design of Everyday Things.” A so-called Norman door is any door whose design misleads the user about how to open it, such as a handle that invites pulling on a door that must be pushed.
You've got regulations, cost and personal preferences all getting in the way of the next generation of automated vehicles. Oh, and those pesky legal issues about who's at fault should something happen. Under all these big issues lie the many small sensors that today's AVs and ADAS packages require. This big/small world is one topic we're investigating in this issue. I won't pretend I know exactly which combination of cameras, radar and lidar sensors works best for a given AV, or whether thermal cameras and new point-cloud technologies should be part of the mix. But the world is clearly ready to spend a lot of money figuring these problems out.
New tests for a Truck Safe rating scheme aim to emulate real-world collisions and encourage OEMs to fit collision avoidance technologies and improve driver vision. Euro NCAP has revealed the elements it is considering as part of an upcoming Truck Safe rating, and how it intends to test and benchmark truck performance. The announcement was made to an audience of international road safety experts at the NCAP24 World Congress in Munich, Germany, in April. The action is intended to mitigate heavy trucks' impact on road safety. The organization cited data showing that trucks are involved in almost 15% of all EU road fatalities but represent only 3% of vehicles on Europe's roads. Euro NCAP says the future rating scheme is designed to go further and faster than current EU truck safety regulations. The organization's goal is to drive innovation and hasten the adoption of advanced driver-assistance systems (ADAS) such as automatic emergency braking (AEB) and lane support systems (LSS), while also encouraging improved driver vision.
To round out this issue's cover story, we spoke with Clement Nouvel, Valeo's chief technical officer for lidar, about Valeo's background in ADAS and what's coming next. Nouvel leads over 300 lidar engineers, and the company's third-generation Scala 3 lidar is used on production vehicles from European and Asian automakers. The Scala 3 sensor system scans the area around a vehicle 25 times per second, can detect objects more than 200 meters (656 ft) away with a wide field of view and operates at speeds of up to 130 km/h (81 mph) on the highway. In 2023, Valeo secured two contracts for Scala 3, one with an Asian manufacturer and the other with a “leading American robotaxi company,” Valeo said in its most recent annual report. Valeo has now received over 1 billion euros (just under $1.1 billion) in Scala 3 orders. Also in 2023, Valeo and Qualcomm agreed to jointly supply connected displays, clusters, driving assistance technologies and, importantly, sensor technology to two- and three-wheelers.
The lane departure warning (LDW) system is a warning system that alerts drivers if they are drifting (or have drifted) out of their lane or off the roadway. It is designed to reduce the likelihood of crashes resulting from unintentional lane departures (e.g., run-off-road crashes, side collisions, etc.). The system does not take control of the vehicle; it only lets drivers know that they need to steer back into the lane. An LDW system is not a lane-change monitor, which addresses intentional lane changes, nor a blind-spot monitoring system, which warns of other vehicles in adjacent lanes. This informational report applies to original equipment manufacturer and aftermarket LDW systems for light-duty vehicles (gross vehicle weight rating of no more than 8,500 pounds) on relatively straight roads with a radius of curvature of 500 m or more and under good weather conditions.
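As a hedged sketch of one warning criterion such systems commonly use (not the criterion defined by this report), the example below computes a simple time-to-line-crossing from lateral offset and lateral drift rate and raises a warning when it falls below a threshold; the lane width, threshold, and sign conventions are illustrative assumptions.

```python
def lane_departure_warning(lateral_offset, lateral_velocity,
                           lane_half_width=1.8, tlc_threshold=0.8):
    """Illustrative LDW trigger based on time-to-line-crossing (TLC).

    lateral_offset   : vehicle-center offset from lane center [m],
                       positive toward the left lane line
    lateral_velocity : lateral drift rate [m/s], same sign convention
    Returns True when the projected time to reach the nearer lane line
    drops below tlc_threshold seconds."""
    if lateral_velocity > 0:          # drifting left
        distance_to_line = lane_half_width - lateral_offset
    elif lateral_velocity < 0:        # drifting right
        distance_to_line = lane_half_width + lateral_offset
    else:
        return False                  # no lateral drift, no warning
    tlc = distance_to_line / abs(lateral_velocity)
    return tlc < tlc_threshold
```

A production system layers additional logic on top of a criterion like this, such as suppressing the warning when the turn signal indicates an intentional lane change.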
North America's first electric, fully integrated custom cab and chassis refuse collection vehicle - slated for initial customer deliveries in mid-2024 - is equipped with a standard advanced driver-assistance system (ADAS). “A typical garbage truck uses commercial off-the-shelf active safety technologies, but the electrified McNeilus Volterra ZSL was purpose-built with active safety technologies to serve our refuse collection customer,” said Brendan Chan, chief engineer for autonomy and active safety at Oshkosh Corporation, McNeilus' parent company. “We wanted to make the safest and best refuse collection truck out there. And by using cloud-based simulation, we could accelerate the development of ADAS and other technologies,” Chan said in an interview with Truck & Off-Highway Engineering during the 2024 dSPACE User Conference in Plymouth, Michigan.
Mathematicians, hold your ire. There's a hidden message in this issue: that there's no difference between zero and infinity. Let me explain. In mid-May, I attended VI-Grade's Zero Prototype Summit (ZPS). As the name suggests, the company - like so many others working in the test simulation space - is trying to provide OEMs and suppliers with the tools to reduce the number of physical prototypes that have to be developed and built before those vehicles or components reach production-ready status. We haven't yet entered the zero-prototype automotive world, but we're getting closer. You can find detailed ZPS coverage starting on page 22.
iMotions employs neuroscience and AI-powered analysis tools to enhance the tracking, assessment and design of human-machine interfaces inside vehicles. The advancement of vehicles with enhanced safety and infotainment features has made evaluating human-machine interfaces (HMI) in modern commercial and industrial vehicles crucial. Drivers face a steep learning curve due to the complexities of these new technologies. Additionally, the interaction with advanced driver-assistance systems (ADAS) increases concerns about cognitive impact and driver distraction in both passenger and commercial vehicles. As vehicles incorporate more automation, many clients are turning to biosensor technology to monitor drivers' attention and the effects of various systems and interfaces. Utilizing neuroscientific principles and AI, data from eye-tracking, facial expressions and heart rate are informing more effective system and interface design strategies. This approach ensures that automation advancements
This chapter delves into the field of multi-agent collaborative perception (MCP) for autonomous driving, an area that remains unresolved. Current single-agent perception systems suffer from limitations such as occlusion and sparse sensor observations at far distances. To address this, three unsettled topics have been identified that demand immediate attention. First, it is crucial to establish normative communication protocols to facilitate seamless information sharing among vehicles. Second, collaboration strategies need to be defined, including identifying when collaboration is needed, determining the collaboration partners, defining the content of collaboration, and establishing the integration mechanism. Finally, collecting sufficient data for MCP model training is vital, including capturing data across diverse modalities and labeling the various downstream tasks as accurately as possible.
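As one hedged illustration of the first two topics (a shared message format and a simple integration mechanism), the sketch below defines a minimal vehicle-to-vehicle detection message and a naive late-fusion step that merges detections from several agents by proximity; the fields, shared map frame, and merge radius are assumptions for illustration, not an established MCP protocol.

```python
from dataclasses import dataclass

@dataclass
class SharedDetection:
    """Minimal detection message one agent might broadcast to collaborators."""
    sender_id: str
    obj_class: str     # e.g., "vehicle", "pedestrian"
    x: float           # object position in an assumed shared map frame [m]
    y: float
    confidence: float  # detector confidence in [0, 1]

def late_fuse(detections, merge_radius=2.0):
    """Naive late fusion: keep the highest-confidence detections and drop
    any detection that falls within merge_radius of one already kept."""
    fused = []
    for det in sorted(detections, key=lambda d: d.confidence, reverse=True):
        if all((det.x - f.x) ** 2 + (det.y - f.y) ** 2 > merge_radius ** 2
               for f in fused):
            fused.append(det)
    return fused
```

Intermediate fusion, in which agents exchange learned feature maps rather than final detections, is another common integration mechanism discussed in this area and raises most of the open bandwidth and protocol questions.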
On-road vehicles equipped with driving automation features are entering the mainstream public space. This category of vehicles is now extending to include those where a human might not be needed for operation on board. Several pilot programs are underway, and the first permits for commercial use of vehicles without an onboard operator are being issued. However, questions like “How safe is safe enough?” and “What to do if the system fails?” persist. This is where remote operation comes in: an additional layer to the automated driving system in which a human assists the so-called “driverless” vehicle in certain situations. Such remote-operation solutions introduce additional challenges and potential risks, as the entire chain of “automated vehicle, communication network, and human operator” now needs to work together safely, effectively, and practically. And as much as there are technical questions regarding network latency, bandwidth, cybersecurity, etc., aspects like human