Development of a Virtual Reality Environment (VRE) for Intuitive Drone Operations
Technical Paper
2017-01-2070
ISSN: 0148-7191, e-ISSN: 2688-3627
Language: English
Abstract
Recent advances in Small Unmanned Aerial Systems (SUAS), or drone, technologies have resulted in their widespread use in a number of civilian applications, such as aerial imaging, infrastructure inspection, and precision agriculture. While this technology is accessible to everyone, successfully operating a drone in a safe and efficient manner still requires a highly skilled operator. At the same time, developments in Virtual/Augmented Reality (V/AR) technologies present opportunities for combining the two into novel applications and use cases by providing an intuitive interface for interacting with drones, opening up possibilities for safe and effective drone use by relatively untrained operators. This effort addresses the development and implementation of an interface that allows an operator, wearing an Oculus Rift virtual reality headset interfaced with a Leap Motion controller, to control drones in a virtual reality environment and translate those commands to a physical implementation in a motion capture volume. Supported actions include selecting drones, take-off and landing, and commanding the drones to fly a pre-defined flight pattern. DroneKit-Python was used to communicate commands to the drones, while OptiTrack motion capture cameras and the NatNet SDK (both provided by NaturalPoint Inc.) combined to provide the precise physical location of each drone in an indoor laboratory setting. Unreal Engine 4 was used as the development platform to create the virtual scene in which the operator resides. A QAV250 quadcopter from Lumenier Labs was used as the UAS platform, with a Pixhawk flight controller interfaced with a Raspberry Pi 3 Single Board Computer (SBC) as the companion computer.
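To feed the motion capture positions into the flight pipeline, the OptiTrack frame must be mapped into a frame the flight controller understands. The paper does not give its axis conventions, so the mapping below is purely illustrative: it assumes OptiTrack's default right-handed Y-up frame and a North-East-Down (NED) target frame, with North assumed along negative Z of the capture volume.

```python
def optitrack_to_ned(x, y, z):
    """Convert a point from OptiTrack's right-handed Y-up frame
    (x right, y up, z toward the viewer) into North-East-Down.

    The axis assignment here is an ASSUMPTION for illustration;
    the actual mapping depends on how the capture volume was
    calibrated and oriented in the laboratory.
    """
    north = -z   # assumed: volume's -Z axis points north
    east = x     # assumed: volume's +X axis points east
    down = -y    # Y-up becomes Z-down
    return (north, east, down)
```

A transform of this shape would typically sit between the NatNet streaming client and the DroneKit command layer, so that virtual-environment positions and physical drone positions share one coordinate convention.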
In this effort, the virtual environment was created and successfully integrated with the motion capture system. In addition, the QAV250 quadcopter was successfully controlled through the operator interface in the VR environment, and take-offs and flights along pre-defined flight paths (triangles) were successfully achieved. Further tests are planned to increase user interaction and achieve more complex flight paths.
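The triangular pre-defined flight paths mentioned above can be represented as a short list of waypoints. The helper below is a hypothetical sketch (not taken from the paper): it generates the three vertices of an equilateral triangle around a chosen center point at a fixed altitude, which could then be sent to the vehicle one waypoint at a time.

```python
import math

def triangle_waypoints(center_x, center_y, radius, altitude):
    """Return three (x, y, altitude) waypoints forming an
    equilateral triangle of circumradius `radius` around
    (center_x, center_y). Purely illustrative helper."""
    return [
        (center_x + radius * math.cos(math.radians(90 + 120 * i)),
         center_y + radius * math.sin(math.radians(90 + 120 * i)),
         altitude)
        for i in range(3)
    ]

# Example: a triangle of circumradius 2 m at 1.5 m altitude
waypoints = triangle_waypoints(0.0, 0.0, 2.0, 1.5)
```

In a DroneKit-Python setup such as the one described, each waypoint would typically be commanded in sequence while the motion capture system confirms arrival before advancing to the next vertex.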
Citation
Anderson, N., Gao, J., Whitman, E., and Gururajan, S., "Development of a Virtual Reality Environment (VRE) for Intuitive Drone Operations," SAE Technical Paper 2017-01-2070, 2017, https://doi.org/10.4271/2017-01-2070.