Simultaneous localization and mapping (SLAM) has been an active research area in robotics for the last five years. The primary focus has been on modeling indoor environments for UGVs.
We are investigating how temporary landmarks detected by the vision system can be used to stabilize the drift in the IMU dead-reckoning, thereby providing the navigation system with a means to find its way back to home base, which is the goal of the first demonstrator project. The database of landmarks is used to compute absolute movements as a complement to the relative movements sensed by the IMU, and makes it possible to automatically recognize previously visited areas and position the UAV within them. A secondary use of this landmark database is for mission planning and guidance.
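The interplay between relative IMU motion and absolute landmark fixes can be illustrated with a small Kalman-filter sketch. This is not the project's actual filter, only a minimal stand-in: the state is a 2D position, the "IMU" contributes noisy relative displacements, and re-observing a landmark whose position is stored in the database supplies an absolute correction. All names and noise values here are illustrative assumptions.

```python
import numpy as np

def predict(x, P, u, Q):
    """Dead-reckoning step: integrate a relative displacement u.

    Drift accumulates because the process covariance Q is added at
    every step without any absolute correction.
    """
    return x + u, P + Q

def landmark_update(x, P, z, landmark, R):
    """Absolute correction from re-observing a mapped landmark.

    z is the measured offset from the vehicle to the landmark, so
    the predicted measurement is (landmark - x) and H = -I.
    """
    H = -np.eye(2)
    y = z - (landmark - x)              # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

# toy run: uncertainty grows during dead reckoning,
# then collapses after a single landmark fix
x, P = np.zeros(2), 0.01 * np.eye(2)
Q, R = 0.05 * np.eye(2), 0.02 * np.eye(2)
for _ in range(20):
    x, P = predict(x, P, np.array([0.1, 0.0]), Q)
drifted_var = P[0, 0]
x, P = landmark_update(x, P, np.array([1.0, 0.0]),
                       np.array([3.0, 0.0]), R)
assert P[0, 0] < drifted_var            # absolute fix reduces drift
```

The same structure carries over to the full problem, where the state also contains attitude and the landmark positions themselves are estimated jointly.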
Activities include the following sub-projects:
- Development platform (David Törnqvist, Jeroen Hol)
- Fast SLAM (Thomas Schön, David Törnqvist, Rickard Karlsson)
- Sensor calibration (Jeroen Hol)
- SLAM module for UAV integration (Rickard Karlsson)
- SLAM with ground vehicle tracking (Rickard Karlsson, Umut Orguner)
1. The first step is to build a development platform and experimental environment that can deliver realistic data under strictly controlled conditions. The ABB industrial robot available at the department is well suited for this purpose. The picture below illustrates how a box containing an IMU and a camera is attached to the robot.
A special environment suitable for testing UAV SLAM algorithms has been created. It consists of small balls randomly placed above the floor, on which simple image processing algorithms provide measurements similar to the features obtained from the UAV. The picture below gives a snapshot of the camera view, where the crosses denote detected features.
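The kind of simple image processing described above can be sketched as thresholding followed by connected-component extraction, with blob centroids serving as the point features. This is only an assumed stand-in for the actual detector; the threshold, minimum area, and the synthetic test frame are all illustrative.

```python
import numpy as np
from scipy import ndimage

def detect_features(img, thresh=128, min_area=4):
    """Return blob centroids (row, col) in a grayscale image.

    Assumes the balls appear darker than the background, so pixels
    below the threshold form connected components; each sufficiently
    large component yields one point feature at its centroid.
    """
    mask = img < thresh
    labels, n = ndimage.label(mask)
    feats = []
    for i in range(1, n + 1):
        blob = labels == i
        if blob.sum() >= min_area:
            feats.append(ndimage.center_of_mass(blob))
    return feats

# synthetic frame: two dark 3x3 "balls" on a bright background
img = np.full((40, 40), 200, dtype=np.uint8)
img[5:8, 5:8] = 30
img[20:23, 30:33] = 30
print(detect_features(img))   # [(6.0, 6.0), (21.0, 31.0)]
```

In the real setup the centroids would be marked with crosses in the camera view, as in the snapshot above.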
The SLAM idea is to study the feature displacement (also known as structure from motion) and adaptively estimate each feature's 3D position. Under the assumption that the detected features are stationary, they are then used to stabilize the inertial navigation system.
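The geometric core of estimating a feature's position from its displacement between views can be sketched as a least-squares ray intersection. This is a toy two-view, 2D version under assumed known camera poses, not the recursive estimator used in practice, but the same linear algebra extends directly to 3D.

```python
import numpy as np

def triangulate(p1, d1, p2, d2):
    """Least-squares intersection of two bearing rays.

    p1, p2: camera positions; d1, d2: unit bearing directions toward
    the same stationary feature. Solves p1 + t1*d1 = p2 + t2*d2 for
    the ray parameters and returns the feature position estimate.
    """
    A = np.column_stack([d1, -d2])
    t, *_ = np.linalg.lstsq(A, p2 - p1, rcond=None)
    return p1 + t[0] * d1

# feature at (2, 3) observed from two positions along the x-axis
feat = np.array([2.0, 3.0])
p1, p2 = np.array([0.0, 0.0]), np.array([4.0, 0.0])
d1 = (feat - p1) / np.linalg.norm(feat - p1)
d2 = (feat - p2) / np.linalg.norm(feat - p2)
print(triangulate(p1, d1, p2, d2))   # ≈ [2. 3.]
```

In the SLAM setting the camera poses are themselves uncertain, so the feature depth is refined recursively as new views arrive rather than solved in one batch step.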