Numerical Investigation of Wiper Drawback

Dassault Systemes-Jonathan Jilesen
Exa Corporation-Tom Linden
Published 2019-04-02 by SAE International in United States
Windscreen wipers are an integral component of the windscreen cleaning systems of most vehicles: trains, cars, trucks, boats, and some planes. Wipers are used to clear rain, snow, and dirt from the windscreen, pushing the water from the wiped surface. Under certain conditions, however, water which has been driven to the edge of the windscreen by the wiper can be drawn back into the driver's field of view by aerodynamic forces introduced by the wiper motion. This is wiper drawback, an undesirable phenomenon, as the water drawn back onto the windscreen can reduce the driver's vision and make the wiper less effective. The phenomenon of wiper drawback can be tested for in climatic tunnels using sprayer systems to wet the windscreen. However, these tests require a bespoke test property or prototype vehicle, which means that the tests are done fairly late in the development of the vehicle. Furthermore, these results do not provide significant insight into the mechanisms driving the wiper drawback. In order to better understand wiper drawback, a numerical simulation is presented of…
 

Nighttime Visibility in Varying Moonlight Conditions

4M Safety-Michael Kuzel
Kineticorp LLC-William Neale, James Marr, Nathan McKelvey
Published 2019-04-02 by SAE International in United States
When the visibility of an object or person in the roadway from a driver's perspective is an issue, the potential effect of moonlight is sometimes questioned. To assess this potential effect, methods typically used to quantify visibility were performed during conditions with no moon and with a full moon. In the full moon condition, measurements were collected from initial moon rise until the moon reached peak azimuth. Baseline ambient light measurements of illumination at the test surface were taken in both no-moon and full-moon scenarios. Additionally, a vehicle with activated low beam headlamps was positioned in the testing area, and the change in illumination at two locations forward of the vehicle was recorded at thirty-minute intervals as the moon rose to the highest position in the sky. Also, two separate luminance readings were recorded during the test intervals: one at a location 75 feet in front and to the left of the vehicle, and another 150 feet forward of the vehicle. These luminance readings yielded the change in reflected light attributable to the moon. In…
 

Enhancing Contrast-Sensitivity Charts for Validating Visual Representations of Low-Illumination Scenes

Engineering Systems Inc.-James Sprague, Manuel Meza-Arroyo, Peggy Shibata, Jack Auflick
Published 2019-04-02 by SAE International in United States
The aim of this study was to introduce and test three different design enhancements to the contrast-sensitivity charts developed by Ayers and Kubose [1]. Contrast-sensitivity charts are the current, critical instrument for generating photographic representations of low-illumination scenes. However, their range of applicability is limited to a specific range of lighting conditions for any given scene, and a limited set of testing and perceptual conditions for observers. A total of four contrast charts were presented to ten dark-adapted observers in nine different lighting conditions that changed in ascending order from low to high levels of illumination. For each lighting condition, the order and orientation of the charts were randomized. Observations related to the number of detected contrast levels were then compared to find the utilization ranges for each chart. In addition, observers were asked to report the directionality of the detected center light wedges for the two enhanced chart designs. The non-linear contrast functions minimized floor and ceiling effects, specifically at the lower tested frequencies of 3.5 and 7.0 c/d. Additionally, the wedge orientation of…
 

Impacts of Flashing Emergency Lights and Vehicle-Mounted Illumination on Driver Visibility and Glare

Rensselaer Polytechnic Institute-John Bullough, Nicholas Skinner, Mark Rea
Published 2019-04-02 by SAE International in United States
Flashing emergency lights on police cars, fire trucks, and ambulances need to be bright enough to alert otherwise unaware drivers about their presence on and near the roadway. Anecdotal evidence suggests that public safety agencies select emergency lighting systems with red or blue flashing lights based on their apparent brightness, with brighter lights judged as "better." With the advent of light emitting diodes (LEDs), emergency flashing lights are brighter and produce more highly saturated colors, thereby causing greater discomfort and disability glare. As a result, first response workers are at higher risk for being injured or killed in vehicle crashes because approaching drivers cannot see them. In the present study, participants viewed red and blue flashing lights on a scale model police vehicle, conforming to present recommended practices for emergency lights. Lights varied in intensity and optical power (intensity × duration). Participants were asked to view the scale model police vehicle and identify whether a police officer figure was standing beside the vehicle. In some trials, white LED sources were energized, providing low-level illumination on…
 

GPU Implementation for Automatic Lane Tracking in Self-Driving Cars

Oakland University-Ayomide Yusuf, Shadi Alawneh
Published 2019-04-02 by SAE International in United States
The development of efficient algorithms has been a focus of automobile engineers since self-driving cars became popular. This is due to the potential benefits of self-driving cars and how they can improve safety on our roads. Despite the promise of self-driving car development, these systems remain far from perfect because of the complexity of our environment. A self-driving car must understand its environment before it makes decisions on how to navigate, and this can be difficult because changes in our environment are non-deterministic. With the development of computer vision, some key problems in intelligent driving have become active research areas. The advances made in the field of artificial intelligence have made it possible for researchers to try solving these problems with artificial intelligence. Lane detection and tracking is one of the critical problems that need to be effectively implemented. The ability of a self-driving car to successfully drive from point A to point B without going off track is dependent on lane tracking. Lane tracking in self-driving…
 

Validation of the Cycles Engine for Creation of Physically Correct Lighting Models

JS Forensic Consulting, LLC-Jeffrey Suway
Momenta, LLC-Anthony Dominic Cornetto
Published 2019-04-02 by SAE International in United States
Vision is the primary sense used to navigate the world when driving, walking, biking, or performing most tasks, and thus visibility is a critical concern in the design of roadways, pathways, vehicles, and buildings, and in the investigation of accidents. In order to assess visibility, the accident scene can be documented under similar conditions. Geometric and photometric measurements can be taken for later analysis. Calibrated photographs or video of a recreated scene can be captured to illustrate the visibility at a later time. This process can often require significant coordination of the physical features at the scene. It can be difficult to precisely control the motion and timing of moving features such as pedestrians and vehicles. The result is also fixed, in that only specific scenarios under specific conditions are captured, with the selected field of view and perspective of the cameras used. Alternatively, three-dimensional computer modeling and physically-based rendering (PBR) can be used to recreate an accident scene's geometry and lighting conditions. PBR is a rendering method to create synthetic images and video by accurately simulating…
 

Enabling Expanded Aerospace Automation Using Tactile Cognition Analytics

Northrop Grumman Aerospace-George Nicholas Bullen
Published 2019-03-19 by SAE International in United States
Aerospace assembly operations are still highly dependent on human labor and processes. This paper will describe and illustrate the transformational technologies that will enable replication of the human cognitive (in-context) textural ability to assemble airplane and space structures. The paper will also provide use-case examples where these innovative technologies have been applied successfully. Without context, data are just dots, floating around without meaning. With context, data become an enlightened component of knowledge that brings value to information. In-context, enlightened knowledge provides manufacturing visibility within factory operations. Visibility is a key component of the Smart, Brilliant, or Intelligent Factory.
 

The Swedish Word for AV Tech

Autonomous Vehicle Engineering: March 2019

Lindsay Brooke
  • Magazine Article
  • 19AVEP03_08
Published 2019-03-01 by SAE International in United States

Veoneer, a new Tier 1 supplier with well-established roots, is moving rapidly into AI, says veteran research boss Ola Boström.

The first automotive camera with built-in deep learning, considered a significant step forward in autonomous-vehicle sensor technology, is due to launch this year. It was developed and will be manufactured by a Tier 1 supplier whose name is still seeking broader recognition in the industry.

 

A Novel Barricade Warning Light System Using Wireless Communications

Rensselaer Polytechnic Institute-Mark S. Rea, Nicholas P. Skinner, John D. Bullough
Published 2018-09-12 by SAE International in United States
Workers in the construction and transportation sectors are at increased risk for work-related injuries and fatalities from nearby traffic. Barricade-mounted warning lights meeting current specifications do not always provide consistent and adequate visual guidance to drivers and can contribute to glare and reduced safety. Through an implementation of sensors and wireless communications, a novel, intelligent set of warning lights and a tablet-based interface were developed. The lights modulate between 100% and 10% of maximum intensity, rather than between 100% and off, in order to improve visual guidance, and they adjust their overall intensity based on ambient conditions. The lights can be synchronized or operated in sequential flash patterns at any frequency between 1 and 4 Hz, and sequential patterns automatically update based on global positioning system (GPS) locations displayed in the control interface. A successful field demonstration of the system verified that its functions were viewed favorably by transportation safety personnel.
 

The influence of forward up vision on driver visibility

General Motors-Alex Cardoso Santos, Adalberto Gerez, Julio Silva, Piero Genaro, Rei Silva, Sonia Ferreira
Published 2018-09-03 by SAE International in United States
During the early phase of vehicle development, one of the key design attributes to consider is visibility for the driver. Visibility is the ability to see the surrounding environment while driving. This need should drive the vehicle design, enabling a more favorable view for the driver. Certain vehicle characteristics, such as the size of the windshield and the design of the pillar, influence the perception of visibility for the driver. One specific characteristic influencing satisfaction is forward up vision, which is the subject of this paper. The objective of this project was to analyze the influence of forward up vision on driver satisfaction under real-world driving conditions. Other influences, such as the position of the occupant in the seat, were also studied. This study was supported by research, statistical data analysis, and dynamic clinics.