Unmanned aerial vehicles (UAVs) face a significant challenge in maintaining accurate navigation when Global Navigation Satellite System (GNSS) signals are unavailable, such as in urban canyons, indoors, or under intentional jamming. This limitation severely restricts their operational capabilities.
This project addresses the problem by developing a robust alternative navigation system. It leverages deep learning to estimate the UAV's state (attitude, velocity, and position) using only data from its onboard Inertial Measurement Unit (IMU).
The core of this project is a hierarchical cascade architecture of three interconnected Long Short-Term Memory (LSTM) networks. Each network is trained for a specific predictive task, creating a dependency chain that mirrors the physics of motion: attitude estimation is required for velocity prediction, and both are necessary for accurate position tracking. The models were trained and validated on real-world quadcopter flight data sourced from the public "Flight Review" platform.
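The cascade described above can be sketched as follows. This is a minimal illustrative example, not the project's actual implementation: the layer sizes, input dimensions (6-axis IMU: gyroscope plus accelerometer), and the exact way each stage's output is fed into the next are assumptions for demonstration.

```python
import torch
import torch.nn as nn


class CascadedIMUNet(nn.Module):
    """Hierarchical cascade of three LSTMs: attitude -> velocity -> position.

    Hyperparameters (hidden size, input dimensions) are illustrative
    assumptions, not the project's actual configuration.
    """

    def __init__(self, imu_dim: int = 6, hidden: int = 64):
        super().__init__()
        # Stage 1: raw IMU (gyro + accel) -> attitude (e.g. roll, pitch, yaw)
        self.att_lstm = nn.LSTM(imu_dim, hidden, batch_first=True)
        self.att_head = nn.Linear(hidden, 3)
        # Stage 2: IMU + predicted attitude -> velocity (vx, vy, vz)
        self.vel_lstm = nn.LSTM(imu_dim + 3, hidden, batch_first=True)
        self.vel_head = nn.Linear(hidden, 3)
        # Stage 3: IMU + attitude + velocity -> position (x, y, z)
        self.pos_lstm = nn.LSTM(imu_dim + 6, hidden, batch_first=True)
        self.pos_head = nn.Linear(hidden, 3)

    def forward(self, imu: torch.Tensor):
        # Each stage consumes the raw IMU stream plus all upstream predictions,
        # forming the attitude -> velocity -> position dependency chain.
        att_h, _ = self.att_lstm(imu)
        att = self.att_head(att_h)
        vel_h, _ = self.vel_lstm(torch.cat([imu, att], dim=-1))
        vel = self.vel_head(vel_h)
        pos_h, _ = self.pos_lstm(torch.cat([imu, att, vel], dim=-1))
        pos = self.pos_head(pos_h)
        return att, vel, pos


# Example: a batch of 4 flight segments, 200 IMU samples each
model = CascadedIMUNet()
imu = torch.randn(4, 200, 6)
att, vel, pos = model(imu)
```

In this sketch each stage emits a per-timestep estimate, so attitude errors propagate into velocity and position, matching the physical dependency the cascade is designed around.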
The system demonstrates promising performance, achieving a median Maximum Position Error (MPE) of 28 meters on validation flight data, showing that it keeps drift within acceptable bounds over extended periods without GNSS.