Drones, or Unmanned Aerial Vehicles (UAVs), pose an increasing threat to military
ground vehicles due to their precision strike capabilities, surveillance
functions, and ability to engage in electronic warfare. Their agility, speed,
and low visibility allow them to evade traditional defense systems, creating an
urgent need for advanced AI-driven detection models that quickly and accurately
identify UAV threats while minimizing false positives and negatives.
Training effective deep-learning models typically requires extensive, diverse
datasets, yet acquiring and annotating real-world UAV imagery is expensive,
time-consuming, and often infeasible, especially for imagery featuring
relevant UAV models in appropriate military contexts. Synthetic data, generated
via digital twin simulation, offers a viable approach to overcoming these
limitations.
This paper presents some of the work Duality AI is doing in conjunction with the
Army’s Program Executive Office Ground Combat Systems (PEO GCS) Advanced
Capabilities Team, focusing on using synthetic data from high-fidelity digital
twin simulations for UAV detection. We introduce a novel metric for iteratively
refining synthetic data so that it realistically replicates critical operational
and environmental conditions. Finally, we compare models trained on
real, synthetic, and hybrid datasets, showing that models trained solely on
synthetic data outperform those trained solely on real data, while a hybrid
approach yields the highest overall performance.
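As a concrete illustration of the dataset comparison (a minimal sketch, not the authors' actual pipeline), the Python snippet below assumes a PyTorch workflow with YOLO-style annotations and hypothetical directory names such as data/real_uav and data/synthetic_uav; the hybrid split simply concatenates the real and synthetic sets so that an identical detector can be trained on each data composition.

```python
# Sketch only: assembling real-only, synthetic-only, and hybrid training splits
# for a UAV detector. Directory layout, label format, and detector choice are
# illustrative assumptions, not details taken from the paper.
from pathlib import Path

import torch
from torch.utils.data import ConcatDataset, DataLoader, Dataset
from torchvision.io import read_image


class UavDetectionDataset(Dataset):
    """Loads images plus YOLO-format labels: one 'class cx cy w h' row per box."""

    def __init__(self, root: str):
        self.image_paths = sorted(Path(root, "images").glob("*.jpg"))
        self.label_dir = Path(root, "labels")

    def __len__(self) -> int:
        return len(self.image_paths)

    def __getitem__(self, idx: int):
        img_path = self.image_paths[idx]
        image = read_image(str(img_path)).float() / 255.0
        label_file = self.label_dir / f"{img_path.stem}.txt"
        rows = []
        if label_file.exists():
            for line in label_file.read_text().splitlines():
                rows.append([float(v) for v in line.split()])
        # Shape (num_boxes, 5): class id plus normalized box coordinates.
        boxes = torch.tensor(rows, dtype=torch.float32).reshape(-1, 5)
        return image, boxes


def collate(batch):
    # Keep a variable number of boxes per image; detection models typically
    # consume list-style batches rather than stacked tensors.
    images, targets = zip(*batch)
    return list(images), list(targets)


# Hypothetical dataset roots; actual paths and split sizes depend on the experiment.
real_ds = UavDetectionDataset("data/real_uav")
synth_ds = UavDetectionDataset("data/synthetic_uav")
hybrid_ds = ConcatDataset([real_ds, synth_ds])

loaders = {
    name: DataLoader(ds, batch_size=8, shuffle=True, collate_fn=collate)
    for name, ds in {"real": real_ds, "synthetic": synth_ds, "hybrid": hybrid_ds}.items()
}
# Each loader would then drive an otherwise identical training run of the chosen
# detector, so that only the training data composition varies across experiments.
```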