We develop a real-time system that tracks the 6-degree-of-freedom head pose of a user in GPS-denied environments by combining multiple low-cost sensors (cameras, IMUs, and ranging radios) mounted on the user. The system is based on a proposed error-state Kalman filter algorithm that fuses local measurements from visual odometry, global measurements from landmark matching against a pre-built visual landmark database, and ranging measurements from either static or dynamic ranging radios. The system has been demonstrated to provide highly accurate results both indoors and outdoors over large areas, even in vision-impaired conditions such as smoky scenes.
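To illustrate the kind of fusion step such an error-state Kalman filter performs, the sketch below shows a generic measurement update applied to a single radio-range measurement, assuming a simplified 3-D position error state. The helper name kf_update, the radio-node location, and all numeric values are illustrative assumptions, not the system's actual implementation.

```python
# Minimal sketch of an error-state Kalman filter measurement update,
# assuming a simplified 3-D position error state (hypothetical example,
# not the authors' implementation).
import numpy as np

def kf_update(x_err, P, z, h, H, R):
    """Generic EKF update on the error state.

    x_err : (n,)   current error-state estimate
    P     : (n, n) error covariance
    z     : (m,)   measurement
    h     : (m,)   predicted measurement from the nominal state
    H     : (m, n) measurement Jacobian w.r.t. the error state
    R     : (m, m) measurement noise covariance
    """
    y = z - h                                   # innovation
    S = H @ P @ H.T + R                         # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_err = x_err + K @ y                       # corrected error state
    P = (np.eye(P.shape[0]) - K @ H) @ P        # updated covariance
    return x_err, P

# Example: fuse one radio-range measurement against a nominal position.
p_nom = np.array([1.0, 2.0, 0.0])               # nominal position estimate
radio = np.array([4.0, 6.0, 0.0])               # assumed radio-node location
x_err = np.zeros(3)                             # position error state
P = np.eye(3) * 0.5                             # error covariance

d = p_nom - radio
rho_pred = np.linalg.norm(d)                    # predicted range
H = (d / rho_pred).reshape(1, 3)                # range Jacobian w.r.t. position error
z = np.array([5.2])                             # measured range from the radio
R = np.array([[0.1]])                           # range noise variance

x_err, P = kf_update(x_err, P, z, np.array([rho_pred]), H, R)
p_corrected = p_nom + x_err                     # fold the error back into the nominal state
```

In the full system, visual-odometry and landmark-matching measurements would be folded in through the same update structure, each with its own measurement model H and noise covariance R.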
Demos
Tracking of Users in GPS-Denied Environments
Publications
Taragay Oskiper, Han-Pang Chiu, Zhiwei Zhu, Supun Samarasekera, and Rakesh Kumar, "Multi-Modal Sensor Fusion Algorithm for Ubiquitous Infrastructure-Free Localization in Vision-Impaired Environments", IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2010.