
Your Selections: Human factors


Formula SAE Data Acquisition and Detailed Analysis of a Lap

Georgia Southern University-Connor M. Ashford, Aniruddha Mitra
  • Technical Paper
  • 2020-01-0544
To be published on 2020-04-14 by SAE International in United States
Formula Society of Automotive Engineers (FSAE) International is a student design competition organized by SAE. Students engineer and manufacture a formula-style racecar and evaluate its performance. Testing and validation of the vehicle are an integral part of design and of performance during the competition. At the collegiate level, the drivers are amateurs, so the human factor plays a significant role in the outcome of the dynamic events. To reduce this uncertainty and improve general performance, driver training is necessary. Rather than assessing a driver's overall performance on an individual lap, our current research focuses on the more detailed components of the driver's actions throughout different sections of the lap. A complete lap consists of several components, such as straight-line acceleration and braking, maximum- and minimum-radius cornering, slalom or "S" movements, and bus stops (quick braking and turning). To evaluate each driver's performance in each of these components, an AiM data acquisition system is mounted in…
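The per-component analysis the abstract describes can be illustrated with a toy sketch. The channel names, thresholds, and labels below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical sketch: classify lap telemetry samples into track components.
# Channels (speed, steering angle, longitudinal g) and thresholds are
# illustrative only; a real AiM setup exposes many more channels.

def classify_sample(speed_kph, steer_deg, long_accel_g):
    """Label one telemetry sample by the lap component it likely belongs to."""
    if abs(steer_deg) > 30:
        return "corner"
    if long_accel_g < -0.3:
        return "braking"
    if long_accel_g > 0.3:
        return "acceleration"
    return "straight"

def segment_lap(samples):
    """Group consecutive samples with the same label into lap segments."""
    segments = []
    for s in samples:
        label = classify_sample(*s)
        if segments and segments[-1][0] == label:
            segments[-1][1].append(s)
        else:
            segments.append((label, [s]))
    return segments

lap = [(80, 2, 0.5), (95, 1, 0.4), (60, -45, -0.4), (55, -50, 0.0), (70, 3, 0.6)]
for label, seg in segment_lap(lap):
    print(label, len(seg))
```

Per-driver metrics (minimum corner speed, braking distance, and so on) can then be computed segment by segment rather than over the whole lap.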

A Method for Mapping a Light Source Utilizing HDR Imagery

JS Forensic Consulting, LLC-Jeffrey Suway
Momenta, LLC-Anthony Cornetto
  • Technical Paper
  • 2020-01-0566
To be published on 2020-04-14 by SAE International in United States
Mapping a light source, any light source, is of broad interest to accident reconstructionists, human factors professionals, and lighting experts. Such mappings are useful for a variety of purposes, including determining the effectiveness and appropriateness of lighting installations, and performing visibility analyses for accident case studies. Currently, mapping a light source can be achieved with several different methods. One such method is to use an illuminance meter and physically measure each point of interest on the roadway. Another method utilizes a goniometer to measure the luminous intensity distribution; this is a near-field measurement. Both methods require significant time, and the goniometric method requires extensive equipment in a lab. A third method measures illumination distribution in the far field using a colorimeter or photometer. These systems utilize a CCD sensor to measure the illuminance distribution, and software can then convert that illuminance distribution to an IES file for use in a Physically-Based Rendering (PBR) engine. Again, this photometer method requires extensive equipment, and the measurements must be taken in a laboratory setting. The method presented in this…
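To see why a mapped intensity distribution is useful, here is a minimal sketch of the standard inverse-square cosine law that converts a source's luminous intensity into point illuminance on a surface. This is textbook photometry, not the paper's HDR method, and the numbers are illustrative:

```python
import math

# Illustrative photometry sketch (not the paper's HDR method): once a
# source's luminous intensity I toward a point is known, horizontal
# illuminance at that roadway point follows the inverse-square cosine law:
#   E = I * cos(theta) / d^2
# where d is line-of-sight distance and theta the angle from vertical.

def illuminance_at(intensity_cd, source_height_m, horiz_dist_m):
    """Horizontal illuminance (lux) at a roadway point from a point source."""
    d = math.hypot(source_height_m, horiz_dist_m)  # line-of-sight distance
    cos_theta = source_height_m / d                # incidence angle factor
    return intensity_cd * cos_theta / d**2

# Example: 10,000 cd toward a point 5 m below and 5 m away horizontally.
print(round(illuminance_at(10000, 5.0, 5.0), 1))  # 141.4 lux
```

An IES file essentially tabulates `intensity_cd` over angles, which is what lets a PBR engine reproduce the measured distribution.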

Quantifying Retroreflective Materials using Digital Imagery

JS Forensic Consulting, LLC-Jeffrey Suway
Momenta, LLC-Anthony Cornetto
  • Technical Paper
  • 2020-01-0570
To be published on 2020-04-14 by SAE International in United States
Retroreflection occurs when a light ray incident on a surface is reflected back towards the light source. The performance of a retroreflective material is of interest to accident reconstructionists, human factors professionals, lighting professionals, and roadway design professionals. The retroreflective effect of a material can be defined by the coefficient of retroreflection, which is a function of the light's entrance angle and the viewer's observation angle. The coefficient of retroreflection of a material is typically measured in a laboratory environment or in the field with a retroreflectometer. Often the material in question cannot be taken to a laboratory for testing, and commercially available portable retroreflectometers are limited to entrance angles of 45 degrees or less and may be cost prohibitive in some cases. This paper presents a methodology to capture images of a retroreflective material at entrance angles between -90 degrees and 90 degrees and observation angles between 0.2 degrees and 1.2 degrees. The process of calibrating the camera and the light source is presented, and the coefficient of retroreflection is calculated from the images…
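For context, the coefficient of retroreflection is conventionally defined as the returned luminous intensity per unit of illuminance on the sample per unit sample area. The sketch below shows only that textbook ratio with made-up numbers; it is not the paper's image-based calibration procedure:

```python
# Hedged sketch: conventional definition of the coefficient of
# retroreflection, R_A = I / (E * A), in cd/(lx*m^2). Values below are
# illustrative, not measured data from the paper.

def coefficient_of_retroreflection(returned_intensity_cd, illuminance_lx, area_m2):
    """R_A: returned intensity per unit illuminance per unit sample area."""
    return returned_intensity_cd / (illuminance_lx * area_m2)

# Example: 25 cd returned from a 0.05 m^2 sample illuminated at 1000 lx.
print(coefficient_of_retroreflection(25.0, 1000.0, 0.05))  # 0.5
```

The paper's contribution is recovering the intensity and illuminance terms from calibrated imagery across a wide range of entrance and observation angles, rather than from a retroreflectometer.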

Conceptualization of Human Factors in Automated Driving by Work Domain Analysis

Monash University-Gavan Lintern
SAIC Motor Corporation Limited-You Zhang, Liping Gao, Zhao Zhang
  • Technical Paper
  • 2020-01-1202
To be published on 2020-04-14 by SAE International in United States
The increasing automation of driving functionality is one of the most important trends in the automotive industry. The trend is moving towards systems that allow the driver to be absent from the active driving task. On one hand, the human driver increasingly relies upon the driving automation to perform the dynamic driving task, and therefore needs to trust it. On the other hand, even high driving automation (e.g., SAE Level 4) can only perform its functionality within a specific operational design domain, and the driving automation relies upon the human driver to handle events when the vehicle operates outside that domain. Moreover, for lower levels of driving automation, the driver still assumes some fallback responsibility and may be required to react promptly when the driving automation is inadequate to operate the vehicle even inside the operational design domain. It follows that the interactions between human driver and driving automation are becoming complicated and less transparent. Hazardous events may occur due…

Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations

Western Michigan University-Nick Goberville, Mohammad El-Yabroudi, Mark Omwanas, Johan Rojas, Rick Meyer, Zachary Asher, Ikhlas Abdel-Qader
  • Technical Paper
  • 2020-01-0093
To be published on 2020-04-14 by SAE International in United States
Autonomous vehicle technology has the potential to improve the safety, efficiency, and cost of our current transportation system by removing human error. The sensors available today make development of these vehicles possible; however, there are still issues with autonomous vehicle operations in adverse weather conditions (e.g., snow-covered roads, heavy rain, fog) due to degradation of sensor data quality and insufficiently robust software algorithms. Since autonomous vehicles rely entirely on sensor data to perceive their surrounding environment, this becomes a significant issue for the performance of the autonomous system. The purpose of this study is to collect sensor data under various weather conditions to understand the effects of weather on sensor data. The sensors used in this study were one camera and one LiDAR, connected to an NVIDIA Drive PX2 operating in a 2019 Kia Niro. Two custom scenarios (static and dynamic objects) were chosen to collect sensor data in four real-world weather conditions: fair, cloudy, rainy, and light snow. An algorithm developed herein was used…
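One common way weather degradation shows up in LiDAR data is as sparse spurious returns from snow or rain. As an illustration only (the paper's own algorithm is not described in this excerpt), here is a minimal radius-outlier filter of the kind often used as a baseline to suppress such returns:

```python
import math

# Illustrative sketch, not the paper's algorithm: a naive O(n^2)
# radius-outlier filter for a LiDAR point cloud. A point is kept only if
# at least min_neighbors other points lie within radius r of it, which
# tends to discard isolated snow/rain returns.

def radius_outlier_filter(points, r=0.5, min_neighbors=2):
    kept = []
    for i, p in enumerate(points):
        n = sum(1 for j, q in enumerate(points)
                if i != j and math.dist(p, q) <= r)
        if n >= min_neighbors:
            kept.append(p)
    return kept

cloud = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (5, 5, 5)]  # last point is isolated
print(len(radius_outlier_filter(cloud)))  # 3
```

Production implementations use spatial indexing (k-d trees) rather than the brute-force neighbor search shown here.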

Capability-Driven Adaptive Task Distribution for Flexible Multi-Human-Multi-Robot (MH-MR) Manufacturing Systems

Chang’an University-Shaobo Zhang
Clemson University-Yunyi Jia
  • Technical Paper
  • 2020-01-1303
To be published on 2020-04-14 by SAE International in United States
Collaborative robots are increasingly used in smart manufacturing because of their capability to work beside and collaborate with human workers. With the deployment of these robots, manufacturing tasks are increasingly accomplished by multiple humans and multiple robots (MH-MR) working as a team. In such MH-MR collaboration scenarios, the task distribution among the multiple humans and robots is critical to efficiency, and it is made more challenging by the heterogeneity of the different agents. Existing approaches to task distribution among multiple agents mostly consider humans with assumed or known capabilities. However, human capabilities change constantly due to various factors, which may lead to suboptimal efficiency. Some research has studied human factors in manufacturing and applied them to adjust robot tasks and behaviors. However, real-time modeling and calculation of multiple human capabilities, and real-time adaptive task distribution in flexible MH-MR manufacturing according to those capabilities, remain challenging due to the complexity of human capabilities and heterogeneous multi-agent interactions. To address these issues, this paper first proposes…
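The idea of capability-driven distribution can be sketched with a toy greedy assignment. The agents, task types, and capability scores below are invented for illustration; the paper's actual model is not described in this excerpt:

```python
# Hedged sketch (not the paper's method): a toy capability-driven task
# assignment. Each agent carries a (possibly time-varying) capability score
# per task type; each task goes to the currently most capable free agent.

def assign_tasks(tasks, capabilities):
    """tasks: list of task types; capabilities: {agent: {task_type: score}}."""
    free = set(capabilities)
    plan = {}
    for task in tasks:
        if not free:
            break  # more tasks than agents; remaining tasks wait
        best = max(free, key=lambda a: capabilities[a].get(task, 0.0))
        plan[task] = best
        free.remove(best)
    return plan

# Hypothetical scores: the human is better at inspection, the robot at assembly.
caps = {
    "human_1": {"inspect": 0.9, "assemble": 0.6},
    "robot_1": {"inspect": 0.4, "assemble": 0.95},
}
print(assign_tasks(["assemble", "inspect"], caps))
```

Re-running the assignment as the capability scores are updated in real time is what makes the distribution adaptive; an optimal (rather than greedy) variant would solve the underlying assignment problem exactly.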

Human Factor Considerations in the Implementation of IVHM

HM-1 Integrated Vehicle Health Management Committee
  • Aerospace Standard
  • AIR6915
  • Current
Published 2020-03-20

This SAE Aerospace Information Report (AIR) offers information on how human factors should be considered when developing and implementing IVHM capabilities for both military and civil fixed-wing aircraft. These considerations cover the perception, analysis, and action taken by the flight crew and the maintenance personnel in response to outputs from the IVHM system. These outputs would be onboard and real-time for the flight crew, and post-flight for maintenance. This document is not intended to be a guideline; it is intended to provide information that should be considered when designing and implementing future IVHM systems.


A Data-Driven Radar Object Detection and Clustering Method Aided by Camera

Dongfeng Motor Corporation Technical Center, China-Zhang Darui, Yang Hang, Wang Daihan, Bian Ning, Zhou Jianguang
Laval University-Liu Ruoyu
  • Technical Paper
  • 2020-01-5035
Published 2020-02-24 by SAE International in United States
The majority of road accidents are caused by human oversight. Advanced Driving Assistance Systems (ADAS) have the potential to reduce human error and improve road safety. With rising demand for safety and a comfortable driving experience, ADAS functions have become an important feature when car manufacturers develop new models. ADAS requires high accuracy and robustness in the perception system. Camera and radar are often combined to produce a fused result because each sensor has its own advantages and drawbacks: cameras are susceptible to bad weather and poor lighting conditions, while radar has low resolution and can be affected by metal debris on the road. Clustering radar targets into objects and determining whether radar targets are valid objects are challenging tasks. In the literature, rule-based and thresholding methods have been proposed to filter out stationary objects and objects with low reflection power. However, static vehicles could be missed, resulting in low detection accuracy. To overcome these drawbacks, a data-driven method has been proposed, which uses a variety of features and thus is more suitable for…
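The clustering step the abstract mentions can be illustrated with a minimal density-based (DBSCAN-style) grouping of 2D radar detections. This is a common rule-based baseline, not the paper's data-driven method, and the detection coordinates are invented:

```python
import math

# Illustrative sketch, not the paper's method: DBSCAN-style density
# clustering of 2D radar targets. Dense groups of detections become
# candidate objects; isolated detections are labeled as noise (-1).

def cluster_targets(targets, eps=1.0, min_pts=2):
    """Return a cluster label per target; -1 marks noise."""
    labels = [None] * len(targets)
    cluster = -1
    for i in range(len(targets)):
        if labels[i] is not None:
            continue
        seeds = [j for j in range(len(targets))
                 if math.dist(targets[i], targets[j]) <= eps]
        if len(seeds) < min_pts:
            labels[i] = -1  # provisionally noise; may become a border point
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reclaimed as border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            neighbors = [k for k in range(len(targets))
                         if math.dist(targets[j], targets[k]) <= eps]
            if len(neighbors) >= min_pts:
                queue.extend(k for k in neighbors if labels[k] is None)
    return labels

detections = [(0, 0), (0.5, 0.2), (0.8, -0.1), (10, 10), (10.2, 10.1), (30, 0)]
print(cluster_targets(detections))  # [0, 0, 0, 1, 1, -1]
```

A data-driven approach like the paper's would replace the fixed `eps`/`min_pts` rules with learned features, which is how it avoids discarding static vehicles along with clutter.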

5 Ws of 3D Modeling for the Visually Impaired

  • Magazine Article
  • TBMG-35805
Published 2020-01-01 by Tech Briefs Media Group in United States

Creating a 3D object with computer software is often the first step in producing it physically. Even with 3D modeling software that has more accessible ways of inputting designs, the visually impaired or blind still have to evaluate their work by either creating a physical version they can touch or by listening to a description provided by a sighted person.


Unsettled Domains Concerning Autonomous System Validation and Verification Processes

EllectroCrafts Aerospace-Fabio Alonso da Silva
  • Research Report
  • EPR2019012
Published 2019-12-30 by SAE International in United States
The Federal Aviation Administration (FAA) and the Department of Transportation's (DOT's) National Highway Traffic Safety Administration (NHTSA) face similar challenges regarding the regulation of autonomous systems powered by artificial intelligence (AI) algorithms that replace the human factor in the decision-making process. Validation and verification (V&V) processes contribute to the implementation of correct system requirements; the V&V process is one step of a development life cycle that starts with the definition of regulatory, marketing, operational, performance, and safety requirements. These requirements define what a product is, and they flow down into lower-level requirements defining control architectures, hardware, and software. The industry is attempting to define regulatory requirements and a framework for gaining safety clearance of such products. This report suggests regulatory text and a safety and V&V approach from an aerospace engineering perspective, assessing the replacement of the human driver in the decision-making role by a computational system. It also suggests an approach where aerospace guidelines can…