Visualization of Driver and Pedestrian Visibility in Virtual Reality Environments
ISSN: 0148-7191, e-ISSN: 2688-3627
Published April 06, 2021 by SAE International in United States
This content contains downloadable datasets.
Event: SAE WCX Digital Summit
In 2016, Virtual Reality (VR) equipment entered the mainstream scientific, medical, and entertainment industries, becoming affordable and available to the public in the form of two of the technology's earliest successful headsets: the Oculus Rift™ and the HTC Vive™. While new equipment continues to emerge, at the time these headsets came equipped with a 100° field-of-view screen that offers the viewer a seamless 360° environment, one that is non-linear in the sense that the viewer can choose where to look and for how long. The fundamental differences between conventional forms of visualization, such as computer animations and graphics, and VR are nonetheless subtle. A VR environment can be understood as a series of two-dimensional images stitched together into a single seamless 360° image; in this respect, it is only the number of images the viewer sees at one time that separates a conventional visualization from a VR experience. The research presented here compares conventional methods of representing driver and pedestrian views through animations and visualizations with a VR environment of the same content. This involves using established methods for conventional visualization and adapting them to the unique requirements of a VR environment, including obtaining and processing photographs and video from the driver and pedestrian viewpoints. The research evaluates how existing techniques for daytime and nighttime visibility can be adapted to VR environments and discusses the practices and techniques that achieve the best results. The end products of conventional visualization media and the VR environment are also compared in terms of quality, resolution, clarity, and experience.
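The abstract's description of a VR environment as 2D images stitched into a single seamless 360° image can be illustrated with the equirectangular projection commonly used for such panoramas. The sketch below is an assumption for illustration only (the paper does not specify its projection): it maps a unit view direction to pixel coordinates in a 360° image, which is how a headset can sample its roughly 100° window from the full panorama.

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a unit view direction (x right, y up, -z forward) to pixel
    coordinates in an equirectangular 360-degree image (width = 2 * height)."""
    lon = math.atan2(x, -z)                   # yaw angle, -pi..pi
    lat = math.asin(max(-1.0, min(1.0, y)))   # pitch angle, -pi/2..pi/2
    # Longitude spans the full image width; latitude spans its height.
    u = (lon / (2.0 * math.pi) + 0.5) * (width - 1)
    v = (0.5 - lat / math.pi) * (height - 1)
    return u, v

# Looking straight ahead lands at the center of a 4096x2048 panorama.
print(direction_to_equirect(0.0, 0.0, -1.0, 4096, 2048))  # → (2047.5, 1023.5)
```

Under this mapping, every possible view direction corresponds to one pixel location in the single stitched image, which is the sense in which only the number of images seen at once separates a conventional visualization from a VR experience.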
Citation: Neale, W., Terpstra, T., Mckelvey, N., and Owens, T., "Visualization of Driver and Pedestrian Visibility in Virtual Reality Environments," SAE Technical Paper 2021-01-0856, 2021, https://doi.org/10.4271/2021-01-0856.
Data Sets - Support Documents
- Virtual Reality Society, http://www.vrs.org.uk/virtual-reality/history.html
- Neale, W.T.C., Marr, J., and Hessel, D., "Nighttime Videographic Projection Mapping to Generate Photo-Realistic Simulation Environments," SAE Technical Paper 2016-01-0415, 2016.
- Neale, W.T.C., Terpstra, T., and Hashemian, A., "Photogrammetry and Analysis of Digital Media," SAE Technical Course Material, Troy, Michigan, 2017.
- Rose, N. and Neale, W., Dec. 2018.
- Forbes, L.M.
- Smardon, R.C., Palmer, J.F., and Felleman, J.P., Foundations for Visual Project Analysis, John Wiley & Sons, Inc., 1986.
- Wolfe, B., Dobres, J., Rosenholtz, R., and Reimer, B., "More than the Useful Field: Considering Peripheral Vision in Driving," Applied Ergonomics 65 (2017): 316-325.
- Fenton, S., Neale, W., Rose, N., and Hughes, C., "Determining Crash Data Using Camera Matching Photogrammetric Technique," SAE Technical Paper 2001-01-3313, 2001, https://doi.org/10.4271/2001-01-3313.
- Terpstra, T., Dickinson, J., Hashemian, A., and Fenton, S., "Reconstruction of 3D Accident Sites Using USGS LiDAR, Aerial Images, and Photogrammetry," SAE Technical Paper 2019-01-0423, 2019, https://doi.org/10.4271/2019-01-0423.
- Diaz, J., July 9, 2018, https://www.fastcodesign.com/90156138/how-vr-is-helping-convict-nazis-in-court
- Daley, J., Smithsonian.com, 2016, https://www.smithsonianmag.com/smart-news/how-virtual-reality-helping-prosecute-nazi-war-criminals-180960743/