
Your Selections

Global positioning systems (GPS)


Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment

FEV North America Inc.-Qusay Alrousan, Hamzeh Alzu'bi, Andrew Pfeil, Tom Tasky
  • Technical Paper
  • 2020-01-1029
To be published on 2020-04-14 by SAE International in United States
Autonomous driving in unstructured environments is a significant challenge due to the inconsistency of important information for localization such as lane markings. To reduce the uncertainty of vehicle localization in such environments, sensor fusion of LiDAR, Radar, Camera, GPS/IMU, and Odometry sensors is utilized. This paper discusses a hybrid localization technique developed using: LiDAR based Simultaneous Localization and Mapping (SLAM), GPS/IMU and Odometry data, and object lists from Radar and Camera sensors. An Extended Kalman Filter (EKF) is utilized to fuse data from all sensors in two phases. In the preliminary stage, the SLAM-based vehicle coordinates are fused with the GPS-based positioning. The output of this stage is then fused with the objects-based localization. This approach was successfully tested on FEV’s Smart Vehicle Demonstrator at FEV’s HQ, which represents a complex test environment with dynamic and static objects. The test results show that multi-sensor fusion improves the vehicle’s localization compared to GPS or LiDAR alone.
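
The paper's two-phase EKF is not reproduced here; the sketch below is a minimal, hypothetical single-stage Kalman update that fuses a SLAM-derived 2D position with a GPS fix, assuming both come with diagonal covariances. All names and noise values are illustrative only.

```python
import numpy as np

def fuse_positions(x_slam, P_slam, z_gps, R_gps):
    """One Kalman-style measurement update: treat the SLAM pose as the prior
    and the GPS fix as a direct observation of the same 2D position."""
    H = np.eye(2)                        # GPS observes the position directly
    y = z_gps - H @ x_slam               # innovation
    S = H @ P_slam @ H.T + R_gps         # innovation covariance
    K = P_slam @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_fused = x_slam + K @ y
    P_fused = (np.eye(2) - K @ H) @ P_slam
    return x_fused, P_fused

# Illustrative numbers: SLAM is locally precise, GPS is noisier but unbiased.
x_slam = np.array([12.3, -4.1]); P_slam = np.diag([0.05, 0.05])
z_gps  = np.array([12.9, -3.6]); R_gps  = np.diag([1.5, 1.5])
print(fuse_positions(x_slam, P_slam, z_gps, R_gps))
```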

A Smart Measuring System for Vehicle Dynamics Testing

Politecnico di Torino-Enrico Galvagno, Stefano Mauro, Stefano Pastorelli, Antonio Tota
  • Technical Paper
  • 2020-01-1066
To be published on 2020-04-14 by SAE International in United States
A fast measurement of the car handling performance is highly desirable to easily compare and assess different car setups, e.g. tire size and supplier, suspension settings, etc. Instead of the expensive professional equipment normally used by car manufacturers for vehicle testing, the authors propose a low-cost solution that is nevertheless accurate enough for comparative evaluations. The paper presents a novel measuring system for vehicle dynamics analysis, which is based uniquely on the sensors embedded in a smartphone and completely independent of the signals available through the vehicle CAN bus. Data from the tri-axial accelerometer, gyroscope, GPS and camera are jointly used to compute the typical quantities analyzed in vehicle dynamics applications. In addition to signals like yaw rate, lateral and longitudinal acceleration, vehicle speed and trajectory, normally available when working with Inertial Measurement Units (IMU) equipped with GPS, in the present application the steering wheel angle is also measured by artificial vision algorithms that use the phone camera. The latter signal, besides being important for identifying the maneuver imposed by the driver, enables the usage…
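
As an illustration of the kind of quantities mentioned above, the sketch below dead-reckons a planar vehicle trajectory from GPS speed and gyroscope yaw rate, assuming both signals are already time-aligned and resampled to a common rate; the sample data are fabricated.

```python
import numpy as np

def trajectory_from_speed_yawrate(speed, yaw_rate, dt):
    """Dead-reckon x, y, heading from vehicle speed [m/s] and yaw rate [rad/s]."""
    heading = np.cumsum(yaw_rate * dt)           # integrate yaw rate -> heading
    x = np.cumsum(speed * np.cos(heading) * dt)  # integrate velocity -> position
    y = np.cumsum(speed * np.sin(heading) * dt)
    return x, y, heading

# Fabricated 10 s example: constant 15 m/s while holding a gentle left turn.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
speed = np.full_like(t, 15.0)
yaw_rate = np.full_like(t, 0.1)                  # rad/s
x, y, heading = trajectory_from_speed_yawrate(speed, yaw_rate, dt)
lateral_acc = speed * yaw_rate                   # a_y = v * r in steady cornering
print(x[-1], y[-1], lateral_acc[0])
```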

Statistical Analysis of City Bus Driving Cycle Features for the Purpose of Multidimensional Driving Cycle Synthesis

University of Zagreb-Jakov Topić, Branimir Skugor, Josko Deur
  • Technical Paper
  • 2020-01-1288
To be published on 2020-04-14 by SAE International in United States
Driving cycles are typically defined as time profiles of vehicle velocity, and as such they reflect basic driving characteristics. They have a wide application from the perspective of both conventional and electric road vehicles, ranging from prediction of fuel/energy consumption (e.g. for certification purposes) and estimation of greenhouse gas and pollutant emissions to selection of the optimal vehicle powertrain configuration and design of its control strategy. In the case of electric vehicles, the driving cycles are also applied to determine the effective vehicle range, battery life period, and charging management strategy. Nowadays, in most applications artificial certification driving cycles are used. As they do not represent realistic driving conditions, their application results in generally unreliable estimates and analyses. Therefore, recent research efforts have been directed towards the development of statistically representative synthetic driving cycles derived from recorded GPS driving data. The state-of-the-art synthesis approach is based on Markov chains, typically including vehicle velocity and acceleration as Markov chain states. However, apart from the vehicle velocity and acceleration, road slope and vehicle mass are also shown to significantly impact…
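
To make the Markov-chain idea concrete, here is a minimal sketch (not the authors' multidimensional method) that quantizes a recorded velocity trace into discrete states, estimates a first-order transition matrix, and samples a synthetic cycle from it. The 1 m/s state resolution and the toy input trace are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "recorded" velocity trace in m/s (stand-in for GPS-derived driving data).
v_rec = np.clip(np.cumsum(rng.normal(0, 0.3, 2000)) + 10.0, 0.0, 25.0)

bin_width = 1.0                                   # 1 m/s velocity states
states = (v_rec / bin_width).astype(int)
n = states.max() + 1

# Estimate transition probabilities T[i, j] = Pr(next = j | current = i).
T = np.zeros((n, n))
for a, b in zip(states[:-1], states[1:]):
    T[a, b] += 1
row_sums = T.sum(axis=1, keepdims=True)
T = np.divide(T, row_sums, out=np.full_like(T, 1.0 / n), where=row_sums > 0)

# Sample a synthetic cycle of the same length from the chain.
synth = [states[0]]
for _ in range(len(states) - 1):
    synth.append(rng.choice(n, p=T[synth[-1]]))
v_synth = np.array(synth) * bin_width
print(v_synth[:10])
```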

3-D HORN

John Deere Technology Center-Shubham Jaiswal, Shrutika Upase
  • Technical Paper
  • 2020-01-1375
To be published on 2020-04-14 by SAE International in United States
3-D HORN is a vehicle-to-vehicle communication based technology that helps reduce the noise pollution caused by honking of automobile horns by letting only the drivers of the targeted automobiles hear the horns, rather than the whole surrounding environment. To achieve this, a number of relatively small horn speakers are placed inside the car. These speakers are controlled by the drivers of other cars, so honking is heard only by the drivers. The most unique feature of this technology is the 3-D effect created by the speakers, which lets the driver know the location of the outside car that is honking. The 3-D effect is achieved by varying the intensity and allotment of sound across the positioned speakers in such a way that it conveys the location of the outside car to the driver. Human detection is another important feature this technology provides: it recognizes whether the horn is honked for an automobile or for a human. In case of human…
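
A minimal sketch of how per-speaker gains could be allotted from the bearing of the honking vehicle, assuming four corner speakers and a relative position known e.g. from V2V-shared GPS; the cosine gain law and speaker layout are illustrative, not the paper's.

```python
import math

# Assumed cabin speaker bearings, measured clockwise from straight ahead (degrees).
SPEAKERS = {"front_left": -45.0, "front_right": 45.0,
            "rear_left": -135.0, "rear_right": 135.0}

def speaker_gains(dx, dy):
    """dx, dy: position of the honking car relative to ours [m], x forward, y right.
    Returns a gain in [0, 1] per speaker so the loudest speaker faces the source."""
    bearing = math.degrees(math.atan2(dy, dx))
    gains = {}
    for name, angle in SPEAKERS.items():
        diff = abs((bearing - angle + 180.0) % 360.0 - 180.0)  # smallest angle difference
        gains[name] = max(0.0, math.cos(math.radians(diff)))   # simple cosine panning law
    return gains

# Honking car 10 m ahead and 3 m to the right: the front-right speaker dominates.
print(speaker_gains(10.0, 3.0))
```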

Mobile Robot Localization Evaluations with Visual Odometry in Varying Environments using Festo-Robotino

German Jordanian University-Abdallah Abdo, Randa Ibrahim
Michigan Technological University-Nathir A. Rawashdeh
  • Technical Paper
  • 2020-01-1022
To be published on 2020-04-14 by SAE International in United States
Autonomous ground vehicles can use a variety of techniques to navigate the environment and deduce their motion and location from sensory inputs. Visual Odometry can provide a means for an autonomous vehicle to gain orientation and position information from camera images, recording frames as the vehicle moves. This is especially useful when global positioning system (GPS) information is unavailable or wheel encoder measurements are unreliable. Feature-based visual odometry algorithms extract corner points from image frames, thus detecting patterns of feature point movement over time. From this information, it is possible to estimate the camera's, i.e. the vehicle's, motion. Visual odometry has its own set of challenges, such as detecting an insufficient number of points, a poor camera setup, and fast passing objects interrupting the scene. This paper investigates the effects of various disturbances on visual odometry. Moreover, it discusses the outcomes of several experiments performed utilizing the Festo-Robotino robotic platform. The experiments are designed to evaluate how changing the system’s setup will affect the overall quality and performance of an autonomous driving system. Environmental effects such…
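
A minimal sketch of the feature-based pipeline described above, using OpenCV's Shi-Tomasi corners, Lucas-Kanade tracking, and essential-matrix pose recovery. The camera matrix K and the image files are placeholders, and the recovered translation is only defined up to scale.

```python
import cv2
import numpy as np

# Placeholder intrinsics and frames; replace with the calibrated camera and real images.
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0,   0.0,   1.0]])
frame0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
frame1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# 1) Detect corner features in the first frame (Shi-Tomasi).
pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=500, qualityLevel=0.01, minDistance=7)

# 2) Track them into the next frame with pyramidal Lucas-Kanade optical flow.
pts1, status, _ = cv2.calcOpticalFlowPyrLK(frame0, frame1, pts0, None)
good0 = pts0[status.ravel() == 1]
good1 = pts1[status.ravel() == 1]

# 3) Estimate the essential matrix with RANSAC and recover the relative camera motion.
E, mask = cv2.findEssentialMat(good1, good0, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, good1, good0, K, mask=mask)
print("rotation:\n", R, "\nunit translation:\n", t.ravel())
```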

A Novel Asynchronous UWB Positioning System for Autonomous Trucks in An Automated Container Terminal

Tongji University-Mingyang Wang, Aiguo Zhou, Xinbo Chen, Yong Shen, Zhenyu Li
  • Technical Paper
  • 2020-01-1026
To be published on 2020-04-14 by SAE International in United States
As a key technology for autonomous vehicles, highly precise positioning is essential for automated container terminals to implement intelligent dispatching and to improve container transport efficiency. In view of the unstable performance of the global positioning system (GPS) in some circumstances, an ultra-wideband (UWB) positioning system is developed for autonomous trucks in an automated container terminal. In this paper, an asynchronous structure is adopted in the system and a three-dimensional (3D) localization method is proposed. Unlike a traditional UWB positioning system with a server, in this asynchronous system positions are calculated in the vehicle. Therefore, propagation delays from the server to vehicles are eliminated and the real-time performance of the system can be significantly improved. Traditional 3D localization methods based on TDOA are mostly invalid with anchors in the same plane. However, in order to guarantee that anchors and tags are in line of sight (LOS), anchors have to be installed in a vertical plane under the tyre cranes. To cope with this problem, an improved method is presented, which overcomes the matrix singularity. Three hyperboloids can be…
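
The paper's treatment of the hyperboloid intersection and the coplanar-anchor singularity is not reproduced here; below is a generic iterative (Gauss-Newton) TDOA solver as a point of reference, with made-up anchor coordinates that are deliberately non-coplanar.

```python
import numpy as np

def tdoa_solve(anchors, range_diffs, x0, iters=20):
    """Estimate a 3D tag position from TDOA measurements by Gauss-Newton.
    anchors: (N, 3) positions; range_diffs[i-1] = ||x - anchors[i]|| - ||x - anchors[0]||
    for i = 1..N-1, i.e. range differences relative to reference anchor 0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        d = np.linalg.norm(anchors - x, axis=1)            # distances to all anchors
        r = (d[1:] - d[0]) - range_diffs                    # residuals
        J = (x - anchors[1:]) / d[1:, None] - (x - anchors[0]) / d[0]  # Jacobian rows
        dx, *_ = np.linalg.lstsq(J, -r, rcond=None)
        x = x + dx
        if np.linalg.norm(dx) < 1e-6:
            break
    return x

# Made-up anchor layout (not all in one plane) and a simulated tag at (12, 7, 1.5).
anchors = np.array([[0, 0, 3], [30, 0, 6], [30, 20, 3], [0, 20, 6], [15, 10, 9]], float)
true_x = np.array([12.0, 7.0, 1.5])
d_true = np.linalg.norm(anchors - true_x, axis=1)
range_diffs = d_true[1:] - d_true[0]
print(tdoa_solve(anchors, range_diffs, x0=[15.0, 10.0, 2.0]))
```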

LiDAR Based Classification Optimization of Localization Policies of Autonomous Vehicles

National Research Council Canada-Ismail Hamieh, Ryan Myers, Taufiq Rahman
  • Technical Paper
  • 2020-01-1028
To be published on 2020-04-14 by SAE International in United States
People, through many years of experience, have developed a great intuitive sense for navigation and spatial awareness. With this intuition, people are able to apply a nearly rules-based approach to their driving. With a transition to autonomous driving, these intuitive skills need to be taught to the system, which makes perception the most fundamental and critical task. One of the major challenges for autonomous vehicles is accurately knowing the position of the vehicle relative to the world frame. Currently, this is achieved by utilizing expensive sensors such as a differential GPS, which provides centimeter accuracy, or by using computationally taxing algorithms that attempt to match live input data from LiDARs or cameras to previously recorded data or maps. Within this paper, an algorithm and accompanying hardware stack is proposed to reduce the computational load of localizing the robot relative to a prior map. The principle of the software stack is to leverage deep learning and powerful filters to perform classification of landmark objects within a scan of the LiDAR. These landmarks…
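
A minimal sketch of the pose-correction step a landmark-based localizer relies on: given detected landmark positions (in the vehicle frame) already associated with their map counterparts, estimate the rigid 2D transform by a standard SVD-based (Kabsch) fit. The landmark coordinates and the data-association step are assumptions, not the paper's classifier.

```python
import numpy as np

def fit_pose_2d(detected, mapped):
    """Least-squares rigid transform (R, t) with mapped ≈ R @ detected + t.
    detected: (N, 2) landmark positions in the vehicle frame.
    mapped:   (N, 2) positions of the same landmarks in the map frame."""
    cd, cm = detected.mean(axis=0), mapped.mean(axis=0)
    H = (detected - cd).T @ (mapped - cm)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                         # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cm - R @ cd
    return R, t

# Fabricated example: three landmarks seen from a vehicle at (5, 2) rotated 30 degrees.
th = np.radians(30.0)
R_true = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
map_lm = np.array([[10.0, 4.0], [18.0, -2.0], [14.0, 9.0]])
veh_lm = (map_lm - np.array([5.0, 2.0])) @ R_true    # what the LiDAR would observe
R, t = fit_pose_2d(veh_lm, map_lm)
print(np.degrees(np.arctan2(R[1, 0], R[0, 0])), t)   # recovers ~30 deg and (5, 2)
```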

Cooperative Estimation of Road Grade Based on Multidata Fusion for Vehicle Platoon with Optimal Energy Consumption

Jilin University-Fangwu Ma, Yu Yang, Jiawei Wang, Yang Zhao, Yucheng Shen, Guanpu Wu
The Ohio State University-Bilin Aksun Guvenc, Levent Guvenc
  • Technical Paper
  • 2020-01-0586
To be published on 2020-04-14 by SAE International in United States
The platooning of automated vehicles possesses significant potential for reducing energy consumption in the Intelligent Transportation System (ITS). Moreover, with the rapid development of eco-driving technology, a vehicle platoon can further enhance fuel efficiency by optimizing the efficiency of the powertrain. Since road grade has a major effect on vehicle energy consumption, estimating the road grade with high accuracy is the key factor for a connected vehicle platoon to optimize energy consumption using vehicle-to-vehicle (V2V) communication. Commonly, the road grade is quantified by a single consumer-grade global positioning system (GPS) using geodetic height data that is only accurate to the meter level, which increases the difficulty of precisely estimating the road grade. This paper presents a novel cooperative estimation method based on the Extended Kalman Filter (EKF) to obtain accurate slope information by multi-data fusion of GPS and Inertial Measurement Unit (IMU) data using vehicle platoon communication, i.e. the following vehicle fuses the data measured by its on-board sensors with the data delivered by the preceding vehicle. Considering the accurate road grade information, the fuel consumption optimization of the…
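
The cooperative EKF itself is not reproduced here; the sketch below shows a simpler, single-vehicle complementary filter that blends a GPS-altitude-derived grade (unbiased but noisy at meter level) with an IMU pitch-rate-derived grade (smooth but drifting). The filter gain and the synthetic signals are assumptions.

```python
import numpy as np

def estimate_grade(alt, dist, pitch_rate, dt, alpha=0.98):
    """Complementary filter for road grade [rad].
    alt:        GPS altitude samples [m]
    dist:       distance travelled between samples [m]
    pitch_rate: IMU pitch rate [rad/s]
    alpha:      trust placed in the IMU integration vs. the GPS-derived grade."""
    grade = np.zeros(len(alt))
    for k in range(1, len(alt)):
        gps_grade = np.arctan2(alt[k] - alt[k - 1], max(dist[k], 1e-3))
        imu_pred = grade[k - 1] + pitch_rate[k] * dt
        grade[k] = alpha * imu_pred + (1.0 - alpha) * gps_grade
    return grade

# Synthetic 5 % uphill section at 15 m/s, 1 Hz GPS with meter-level altitude noise.
dt, n = 1.0, 600
true_grade = np.full(n, np.arctan(0.05))
rng = np.random.default_rng(1)
dist = np.full(n, 15.0 * dt)
alt = np.cumsum(dist * np.tan(true_grade)) + rng.normal(0, 1.0, n)
pitch_rate = rng.normal(0, 0.002, n)                 # grade is constant, so ~0
grade_est = estimate_grade(alt, dist, pitch_rate, dt)
print(np.degrees(true_grade[0]), np.degrees(grade_est[-300:].mean()))
```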

Drive Horizon: An Artificial Intelligent Approach to Predict Vehicle Speed for Realizing Predictive Powertrain Control

Hitachi America Ltd-Mitesh Agrawal, Sunny Bellary
Hitachi America, Ltd.-Yashodeep Lonari, Subrata Kundu
  • Technical Paper
  • 2020-01-0732
To be published on 2020-04-14 by SAE International in United States
Demand for predictive powertrain control is rapidly increasing with the recent advancement of Advanced Driving Assistance Systems and Autonomous Driving. The full or semi-autonomous functions could be leveraged to realize better user acceptance as well as powertrain efficiency of the connected vehicle utilizing the proposed Drive Horizon technology. The sensors of automated driving provide perception of the surrounding driving environment, which is required to safely navigate the vehicle in real-world driving scenarios. The proposed Drive Horizon provides a real-time forecast of the driving environment that a vehicle will encounter during its entire travel. This paper summarizes the vehicle future speed prediction technique, which is an integral part of Drive Horizon for optimized energy control of the vehicle. A prediction model has been developed that integrates information from multiple sources, including the vehicle’s global positioning system, traffic information and high-definition map data. Recurrent Neural Networks and Bayesian approaches including generative models (Variational Autoencoders, Generative Adversarial Networks) have been studied for predicting the vehicle speed. Contrary to previous research works, which mainly focused on deterministic neural networks for speed prediction…
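
As a point of reference for the deterministic baseline mentioned above (not the paper's generative models), here is a minimal PyTorch sketch of a sequence-to-one LSTM that maps a window of past features (speed, map-derived grade, traffic state, etc.) to a short horizon of future speeds. The feature set, dimensions, and training data are placeholders.

```python
import torch
import torch.nn as nn

class SpeedPredictor(nn.Module):
    """LSTM encoder over a history window, linear head for an H-step speed forecast."""
    def __init__(self, n_features=4, hidden=64, horizon=10):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):                  # x: (batch, history_len, n_features)
        _, (h, _) = self.lstm(x)           # h: (1, batch, hidden), last hidden state
        return self.head(h[-1])            # (batch, horizon) future speeds

model = SpeedPredictor()
history = torch.randn(8, 30, 4)            # 8 fabricated windows of 30 past samples
print(model(history).shape)                # torch.Size([8, 10])

# One illustrative training step against fabricated targets.
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
optim.zero_grad()
loss = nn.functional.mse_loss(model(history), torch.randn(8, 10))
loss.backward()
optim.step()
```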

Lidar Inertial Odometry and Mapping for Autonomous Vehicle in GPS-denied Parking Lot

Jilin University-Xuesong Chen, Sumin Zhang, Jian Wu, Rui He, Shiping Song, Bing Zhu, Jian Zhao
  • Technical Paper
  • 2020-01-0103
To be published on 2020-04-14 by SAE International in United States
High-precision and real-time ego-motion estimation is vital for autonomous vehicles. There are many GPS-denied scenarios in urban areas, such as underground parking lots. Therefore, a localization system relying solely on GPS cannot meet the requirements. Recently, lidar odometry and visual odometry have been introduced into localization systems to overcome the problem of missing GPS signals. Compared with visual odometry, lidar odometry is not susceptible to light, so it is widely applied in weak-light environments. Besides, autonomous parking is highly dependent on the geometric information around the vehicle, which makes building a map of the surroundings essential for an autonomous vehicle. We propose a lidar inertial odometry and mapping method. By sensor fusion, we compensate for the drawbacks of applying a single sensor, allowing the system to provide a more accurate estimate. Compared to other odometry methods using an IMU and lidar, we apply a tightly coupled lidar-IMU method to achieve lower drift, which can effectively overcome the degradation problem of a pure lidar method, ensuring precise pose estimation during fast motion. In addition, we propose a map…
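
The tightly coupled lidar-IMU optimization is beyond an abstract-sized sketch; the snippet below only illustrates the IMU propagation step that predicts the pose between two scan timestamps (planar case, bias- and gravity-free), which a tightly coupled method would then correct with the lidar residuals. All signals are fabricated.

```python
import numpy as np

def propagate_imu(pose, gyro_z, acc_xy, dt):
    """Integrate a planar pose (x, y, yaw, vx, vy) with body-frame gyro/accelerometer
    samples collected between two lidar scans. Biases and gravity are ignored here."""
    x, y, yaw, vx, vy = pose
    for wz, (ax, ay) in zip(gyro_z, acc_xy):
        c, s = np.cos(yaw), np.sin(yaw)
        ax_w, ay_w = c * ax - s * ay, s * ax + c * ay   # rotate accel into the world frame
        x += vx * dt + 0.5 * ax_w * dt**2
        y += vy * dt + 0.5 * ay_w * dt**2
        vx += ax_w * dt
        vy += ay_w * dt
        yaw += wz * dt
    return np.array([x, y, yaw, vx, vy])

# Fabricated 0.1 s gap between scans at 100 Hz IMU: gentle yaw and forward acceleration.
dt = 0.01
gyro_z = np.full(10, 0.2)                    # rad/s
acc_xy = np.tile([1.0, 0.0], (10, 1))        # m/s^2 in the body frame
start = np.array([0.0, 0.0, 0.0, 5.0, 0.0])  # moving at 5 m/s along x
print(propagate_imu(start, gyro_z, acc_xy, dt))
```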