Eyes In the Sky

  • Magazine Article
  • 19AERP05_04
Published May 01, 2019 by SAE International in United States
  • Language: English

For Drones, Combining Vision Sensor and IMU Data Leads to More Robust Pose Estimation

Drones (e.g., quadrotors) are popular and increasingly widespread products used by consumers as well as in a diversity of industrial, military and other applications. Historically under the control of human operators on the ground, they're becoming increasingly autonomous as their built-in cameras find use not only for capturing footage of the world around them but also for understanding and responding to their surroundings. Combining imaging with other sensing modalities can further bolster the robustness of this autonomy.

When a drone flies, it needs to know where it is in three-dimensional space at all times, across all six degrees of freedom for translation and rotation. Such pose estimation is crucial for flying without crashes or other errors. Neither sensor type suffices on its own: an IMU delivers high-rate motion data but its integrated estimates drift over time, while a vision sensor is drift-free but comparatively slow and degrades under fast motion or poor lighting. A hybrid approach that fuses IMU and vision data, by contrast, improves the precision of pose estimation by pairing the complementary strengths of both measurement methods.
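The fusion idea above can be illustrated with a minimal sketch. The example below is a simple one-axis complementary filter (a common, lightweight alternative to a full Kalman filter): the high-rate gyro reading is integrated to predict heading, and a slower, drift-free vision heading pulls the estimate back. The function name, blend factor, and simulated sensor values are all illustrative assumptions, not from the article.

```python
def fuse_heading(gyro_rates, vision_headings, dt=0.01, alpha=0.98):
    """Complementary filter for one rotation axis.

    gyro_rates      : angular rates from the IMU (rad/s) -- smooth but biased
    vision_headings : absolute headings from vision (rad) -- noisy but drift-free
    alpha           : weight on the IMU prediction (illustrative choice)
    """
    heading = vision_headings[0]  # initialize from the absolute sensor
    estimates = []
    for rate, vis in zip(gyro_rates, vision_headings):
        gyro_pred = heading + rate * dt                # high-rate IMU prediction
        heading = alpha * gyro_pred + (1 - alpha) * vis  # vision correction
        estimates.append(heading)
    return estimates

# Hypothetical scenario: the drone is actually stationary (true heading 0),
# but the gyro has a constant 0.5 rad/s bias. Pure integration drifts without
# bound; the fused estimate stays bounded near zero.
n = 1000
biased_gyro = [0.5] * n
vision = [0.0] * n
fused = fuse_heading(biased_gyro, vision)
pure_integration = 0.5 * 0.01 * n  # 5.0 rad of accumulated drift
```

With these numbers the pure-IMU estimate drifts to 5.0 rad, while the fused estimate settles near 0.25 rad, showing how the vision measurement bounds the IMU's drift. A production system would use a full 6-DOF estimator (e.g., an extended Kalman filter over position and orientation), but the division of labor between the two sensors is the same.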