Head-up displays

Evaluation of Different ADAS Features in Vehicle Displays

University of Michigan-Abhishek Mosalikanti, Pranove Bandi, Sang-Hwan Kim
Published 2019-04-02 by SAE International in United States
The current study presents the results of an experiment on driver performance, including reaction time, eye-attention movement, mental workload, and subjective preference, when different features of Advanced Driver Assistance Systems (ADAS) warnings (Forward Collision Warning) are displayed, including different locations (HDD (Head-Down Display) vs. HUD (Head-Up Display)), warning modality (text vs. pictographic), and a new concept that provides a dynamic bird’s eye view for warnings. Sixteen drivers drove a high-fidelity driving simulator integrated with display prototypes of the features. Independent variables were the modality, location, and dynamics of the warnings; dependent variables were driver performance measures, including reaction time to the warning, EORT (Eyes-Off-Road Time) during braking after receiving the warning, workload, and subjective preference. The primary results were in line with previous research, validating previous claims of the superiority of HUD over HDD in warning delivery. It was also found that the text format of the warning yielded higher response rates along with lower workload, while most participants preferred the dynamic bird’s eye view layout.
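As a rough illustration of how an EORT measure of this kind can be derived from gaze data, the sketch below accumulates off-road gaze time between warning onset and the end of braking; the log format, function name, and sample values are hypothetical and not taken from the study.

```python
# Minimal sketch (hypothetical log format, not the study's pipeline) of how an
# Eyes-Off-Road-Time (EORT) measure can be accumulated from gaze samples
# recorded between warning onset and the end of braking.
def eort_seconds(samples, t_warning, t_brake_end):
    """samples: list of (timestamp_s, on_road: bool), sorted by time."""
    total = 0.0
    for (t0, on_road), (t1, _) in zip(samples, samples[1:]):
        # count an interval only if it lies inside the braking window and the
        # gaze was off the road at its start
        if t_warning <= t0 and t1 <= t_brake_end and not on_road:
            total += t1 - t0
    return total

# e.g. four samples at 0.5 s intervals, with one off-road interval:
gaze = [(10.0, True), (10.5, False), (11.0, True), (11.5, True)]
print(eort_seconds(gaze, t_warning=10.0, t_brake_end=11.5))  # 0.5
```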
 

Efficient Method for Head-Up Display Image Compensation by Using Pre-Warping

Hyundai Mobis-Mijin Jeon, Youna Lee
Hyundai Motor Group-Iksoon Lim
Published 2019-04-02 by SAE International in United States
A Head-Up Display (HUD) is an electrical device that provides virtual images in front of the driver. These virtual images consist of various driving information. Because a HUD uses an optical system, image distortions exist with respect to image height and the driver’s eye position. Image warping is an image correction method that applies a geometrical change to an image to minimize such distortions. In this paper, to minimize image distortions, we use an optical-data-driven warping matrix for each image height. However, even with the data-driven warping matrix applied, image distortions still occur due to assembly and manufacturing tolerances when the HUD is built. We therefore also propose a pre-warping method that minimizes image distortions while accounting for these tolerances. We simulated three compensation functions to remove the image distortions arising from the tolerances. By using the proposed pre-warping method, we reduced the maximum x and y distortion distances by 31.5% and 39%, respectively, and the average distances by 32.2% and 27.9%.
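To illustrate the general idea of warping-based compensation (a generic sketch, not the authors' Hyundai Mobis implementation), the snippet below pre-warps a source image with a single perspective matrix computed from ideal versus measured corner positions. A production HUD warp would typically use a dense, per-region map that also varies with image height and eye position; all coordinates here are hypothetical.

```python
# Minimal sketch: pre-warp the HUD source image so that the distortion the
# optics introduce is (approximately) cancelled in the virtual image.
import cv2
import numpy as np

def prewarp_matrix(ideal_corners, measured_corners):
    """Perspective matrix mapping distorted (measured) corner positions back to
    their ideal positions; warping the source image with it pre-compensates the
    distortion the optics will add."""
    return cv2.getPerspectiveTransform(np.float32(measured_corners),
                                       np.float32(ideal_corners))

def prewarp(src_img, matrix):
    h, w = src_img.shape[:2]
    return cv2.warpPerspective(src_img, matrix, (w, h))

# Hypothetical corner data for one image height / eye position (pixels):
ideal = [(0, 0), (799, 0), (799, 479), (0, 479)]
measured = [(12, 8), (790, 3), (795, 470), (5, 476)]   # e.g. from optical simulation
hud_frame = np.zeros((480, 800, 3), dtype=np.uint8)    # placeholder HUD graphics
compensated = prewarp(hud_frame, prewarp_matrix(ideal, measured))
```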
 

Hazard Cuing Systems for Teen Drivers: A Test-Track Evaluation on Mcity

DENSO International America Inc.-Yu Zhang, Te-Ping Kang
University of Michigan-Michael Flannagan, Shan Bao, Anuj Pradhan, John Sullivan
Published 2019-04-02 by SAE International in United States
There is strong evidence that the overrepresentation of teen drivers in motor vehicle crashes is mainly due to their poor hazard perception skills, i.e., they are unskilled at appropriately detecting and responding to roadway hazards. This study evaluates two cuing systems designed to help teens better understand their driving environment. Both systems use directional color-coding to represent different levels of proximity between one’s vehicle and outside agents. The first system provides an overview of the location of adjacent objects in a head-up display in front of the driver and relies on drivers’ focal vision (focal cuing system). The second system presents similar information, but in the drivers’ peripheral vision, by using ambient lights (peripheral cuing system). Both systems were retrofitted into a test vehicle (2014 Toyota Camry). A within-subject experiment was conducted at the University of Michigan Mcity test-track facility. The study collected data from seventeen teen participants. Each participant experienced three cuing conditions (focal cuing, peripheral cuing and dual system cuing conditions) as well as three no cuing system conditions (two practice, a…
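As a minimal sketch of directional color-coding (the thresholds, sector layout, and function names are hypothetical, not DENSO's design), the snippet below maps an adjacent object's bearing and range to a display sector and a proximity color.

```python
def proximity_color(distance_m):
    """Hypothetical proximity bands: near = red, mid = amber, far = green."""
    if distance_m < 5.0:
        return "red"
    if distance_m < 15.0:
        return "amber"
    return "green"

def display_sector(bearing_deg, n_sectors=8):
    """Map a bearing around the vehicle (0 deg = straight ahead) to one of
    n_sectors equal segments of the cuing display or ambient light strip."""
    width = 360.0 / n_sectors
    return int(((bearing_deg % 360.0) + width / 2.0) // width) % n_sectors

def cue_for(bearing_deg, distance_m):
    return {"sector": display_sector(bearing_deg),
            "color": proximity_color(distance_m)}

# e.g. a vehicle 4 m away off the rear-right quarter:
print(cue_for(bearing_deg=135.0, distance_m=4.0))   # {'sector': 3, 'color': 'red'}
```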
 

2019 Vehicle Technology Review

Automotive Engineering: April 2019

Paul Seredynski
  • Magazine Article
  • 19AUTP04_02
Published 2019-04-01 by SAE International in United States

Reviewing the latest tech applications in the automotive space and the trends they're serving.

With massive shifts looming in the automotive engineering space - the titanic trio of Autonomy, Mobility and Electrification (AME) - it's easy to forget that the pace of innovation continues unabated in the here and now. We've reviewed the latest technologies on the newest OEM models and how they point to current trends in the automotive landscape. Though the AME macro trends represent the majority of investment in the automotive space, and work on traditional engineering projects, including new powertrains, continues, small features that resonate often serve as guideposts to what's next.

 

Rethinking the HUD

Automotive Engineering: March 2019

Dan Carney
  • Magazine Article
  • 19AUTP03_01
Published 2019-03-01 by SAE International in United States

New tech solutions move toward augmented reality to bring greater capability to head-up displays.

Head-up displays (HUD) debuted in the late 1950s as a means of providing jet fighter pilots critical information while maintaining situational awareness outside the cockpit. Today these systems for projecting data onto the windscreens of cars and trucks are becoming a vital conduit of information to drivers.

 

Standard - Optical System HUD for Automotive

Vehicular Flat Panel Display Standards Committee
  • Ground Vehicle Standard
  • J1757-2_201811
  • Current
Published 2018-11-06 by SAE International in United States
This SAE Standard provides measurement methods to determine HUD optical performance in typical automotive ambient lighting conditions. It covers indoor measurements with simulated outdoor lighting for the measurement of HUD virtual images. HUD types addressed by this standard include w-HUD (windshield HUD) and c-HUD (combiner HUD), with references to Augmented Reality (AR) HUD as needed. It is not within the scope of this document to set threshold values for automotive compliance; however, some recommended values are presented for reference.
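As one illustrative figure of merit for HUD optical performance under ambient light (a generic sketch; J1757-2 defines its own measurement geometry, procedures, and any recommended values), a virtual-image contrast ratio can be computed as follows.

```python
def hud_contrast_ratio(symbol_luminance_cd_m2, background_luminance_cd_m2):
    """Ratio of symbol-plus-background luminance to background luminance of the
    virtual image area (one common definition; the standard specifies its own)."""
    return (symbol_luminance_cd_m2 + background_luminance_cd_m2) / background_luminance_cd_m2

# e.g. a 12,000 cd/m^2 symbol seen against a 4,000 cd/m^2 sunlit-road background:
print(hud_contrast_ratio(12000.0, 4000.0))   # 4.0
```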
 

Minimum Performance Standard for Airborne Multipurpose Electronic Displays

A-4ED Electronics Display Subcommittee
  • Aerospace Standard
  • AS8034C
  • Current
Published 2018-07-30 by SAE International in United States
This SAE Aerospace Standard (AS) specifies minimum performance standards for all types of electronic displays and electronic display systems that are intended for use in the flight deck by the flight crew in all 14 CFR Part 23, 25, 27, and 29 aircraft. The requirements and recommendations in this document are intended to apply to all installed electronic displays and electronic display systems including those that have a touch screen interface within the flight deck, regardless of intended function, criticality, or location within the flight deck, but may also be used for non-installed electronic displays. This document provides baseline requirements and recommendations (see 2.3 for definitions of “shall” and “should”). This document primarily addresses hardware requirements, such as electrical, mechanical, optical, and environmental. It does not address system specific functions. It does not contain an exhaustive or comprehensive list of requirements for specific systems or functions, such as TCAS, ADS-B, GPS, weather, or shared display considerations (e.g., when should alerts be inhibited on a display system that simultaneously depicts navigation data integrated with terrain data…
 

Military Optics Technology

  • Magazine Article
  • TBMG-29737
Published 2018-07-01 by Tech Briefs Media Group in United States

The demand for innovative solutions to enhance the safety of military personnel is continually on the rise. This includes the need to improve the performance of military vehicles and aircraft, in terms of both safety against laser attack and maximizing the information that can be presented to pilots without obstructing their view.

 

Optical Advantages of Thin Window Hybrid Windshields

Corning Inc.-Sang-Ki Park, Vikram Bhatia
Published 2018-04-03 by SAE International in United States
The adoption of head-up displays (HUDs) is increasing in modern automobiles. Yet integrating this technology into vehicles with standard windshield (WS) laminates can create negative effects for drivers, primarily due to the thickness of glass used. The double ghosting in HUD images is typically overcome by employing a wedged PVB between the two glass plies of the laminate. Another solution is to reduce the thickness of the glass without impacting the overall windshield toughness. Although this still requires the use of a wedged PVB to eliminate HUD ghosting, the thinner glass provides an opportunity to increase the image size. However, reducing the thickness of one or both plies in a conventional soda-lime glass (SLG) laminate can significantly impact the robustness of the laminate to external impact events. This paper will review how a hybrid laminate made from one ply of relatively thick SLG and a second ply of relatively thin, chemically-strengthened glass will not only improve windshield robustness but simultaneously provide better optical performance for HUD applications. Exemplary thin, chemically-strengthened glass can…
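For context on why ghosting is tied to glass thickness, the sketch below evaluates the textbook parallel-plate relation for the lateral separation between the reflections off the two surfaces of a flat ply; this is a simplified illustration, not a formula from the paper, and it ignores windshield curvature, wedge angle, and projection geometry.

```python
import math

def ghost_separation_mm(thickness_mm, n_glass, incidence_deg):
    """Lateral separation between reflections off the two surfaces of a flat
    glass ply (parallel-plate relation): s = 2 t tan(theta_t) cos(theta_i)."""
    theta_i = math.radians(incidence_deg)
    theta_t = math.asin(math.sin(theta_i) / n_glass)   # Snell's law
    return 2.0 * thickness_mm * math.tan(theta_t) * math.cos(theta_i)

# e.g. a 2.1 mm soda-lime ply (n ~ 1.52) viewed at a 60 deg incidence angle:
print(round(ghost_separation_mm(2.1, 1.52, 60.0), 2))   # about 1.46 mm
```

Because the separation scales linearly with thickness, a thinner ply narrows the gap between the primary and ghost reflections, which the wedged PVB then removes entirely.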
 

A Sense of Distance and Augmented Reality for Stereoscopic Vision

DENSO Corporation-Kodai Takeda, Kazuyuki Ishihara
Kitasato University-Takushi Kawamorita
Published 2018-04-03 by SAE International in United States
Head-up displays (HUDs) give visual information to drivers in an easy-to-understand manner and help prevent traffic accidents. Augmented reality head-up displays (AR-HUDs) display the driving information overlaid on the actual scenery. The AR-HUD must allow the visual information and the actual scene to be viewed at the same time, and a sense of depth and distance are key factors in achieving this. Binocular parallax used in stereoscopic 3D display is one of the most useful methods of providing a sense of depth and distance. Generally, stereoscopic 3D displays must limit the image range to within Panum’s fusional area to ensure fusion of the stereoscopic images. However, when using a stereoscopic 3D display for an AR-HUD, the image range must extend beyond Panum’s fusional area to allow the visual information and the actual scene to be displayed at the same time. In this study, we investigate the visibility of images displayed beyond Panum’s fusional area on a stereoscopic 3D display for an AR-HUD. Ease of fusion was measured by the recognition time for participants watching…
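As a rough, textbook-style illustration of why image ranges beyond Panum's fusional area are at issue (the distances, interpupillary distance, and fusional-limit figure are generic assumptions, not values from the paper), the snippet below estimates the angular disparity between a HUD virtual image and a more distant real object.

```python
import math

def disparity_arcmin(ipd_m, hud_distance_m, scene_distance_m):
    """Approximate binocular disparity (arcminutes) between a HUD virtual image
    and a real object at a different distance."""
    disparity_rad = ipd_m * (1.0 / hud_distance_m - 1.0 / scene_distance_m)
    return math.degrees(disparity_rad) * 60.0

# e.g. 65 mm interpupillary distance, virtual image at 2.5 m, lead vehicle at 20 m:
print(round(disparity_arcmin(0.065, 2.5, 20.0)))   # about 78 arcmin
```

A disparity of this size far exceeds commonly cited foveal fusion limits of a few tens of arcminutes, which is why the visibility of images displayed beyond Panum's fusional area is the question the study examines.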