AN AUGMENTED REALITY UAV-GUIDED GROUND NAVIGATION INTERFACE IMPROVES HUMAN PERFORMANCE IN MULTI-ROBOT TELE-OPERATION

2024-01-3323

11/15/2024

Event
2024 NDIA Michigan Chapter Ground Vehicle Systems Engineering and Technology Symposium
ABSTRACT

This research proposes a human-multi-robot system with semi-autonomous ground robots and a UAV overhead view for contaminant localization tasks. A novel Augmented Reality-based operator interface has been developed. The interface uses an over-watch camera view of the robotic environment and allows the operator to direct each robot individually or in groups. It uses an A* path-planning algorithm to ensure obstacles are avoided and to free the operator for higher-level tasks. It also displays sensor information from each individual robot directly on the robot in the video view. In addition, a combined sensor view can be displayed, which helps the user pinpoint source information. The sensors on each robot monitor contaminant levels, and a virtual display of these levels allows the operator to direct the multiple ground robots toward the hidden target. This paper reviews the user interface and describes several initial usability tests that were performed. This research demonstrates the development of a human-multi-robot interface that has the potential to improve cooperative robots for practical applications.
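The abstract notes that waypoint routing is handled by an A* path planner so that obstacle avoidance does not burden the operator. The sketch below illustrates grid-based A* with a Manhattan-distance heuristic on a 2D occupancy grid; it is a minimal illustration only, and the function and variable names (astar, grid, best_g) are assumptions for this example, not taken from the paper or its implementation.

import heapq

def astar(grid, start, goal):
    """Minimal A* on a 2D occupancy grid (1 = obstacle, 0 = free).

    Returns a list of (row, col) cells from start to goal, or None
    if no path exists. 4-connected moves, Manhattan-distance heuristic.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # admissible heuristic: Manhattan distance to the goal
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]   # priority queue of (f, g, cell)
    came_from = {}                       # cell -> predecessor on best path
    best_g = {start: 0}                  # cheapest known cost to each cell

    while open_heap:
        _, g, current = heapq.heappop(open_heap)
        if current == goal:              # reconstruct path back to start
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = current[0] + dr, current[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = current
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None

# Example: route one ground robot around two walls of obstacles.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (4, 3)))

In an interface like the one described, a planner of this kind would run over an occupancy grid derived from the over-watch UAV view, and the returned cell sequence would be converted into waypoints for the selected ground robot or group.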

Details
DOI
https://doi.org/10.4271/2024-01-3323
Pages
7
Citation
Lee, S., Lucas, N., Cao, A., Pandya, A. et al., "AN AUGMENTED REALITY UAV-GUIDED GROUND NAVIGATION INTERFACE IMPROVES HUMAN PERFORMANCE IN MULTI-ROBOT TELE-OPERATION," SAE Technical Paper 2024-01-3323, 2024, https://doi.org/10.4271/2024-01-3323.
Additional Details
Publisher
SAE International
Published
Nov 15, 2024
Product Code
2024-01-3323
Content Type
Technical Paper
Language
English