Liam Paull

UGV Competition

Unmanned Systems Canada recently hosted the first ever Unmanned Ground Vehicle (UGV) competition. The task was to navigate an environment from start to finish while identifying and localizing targets along the way. In addition, teams were required to complete a simulation using the Player middleware and the Gazebo 3D simulation environment. We used the Corobot platform, which comes with motor encoders, along with a Hokuyo UTM-30LX laser, an Ocean-Server compass, a GPS, thermal rangers, and the Kinect camera built by Microsoft for the Xbox.

Path Planning

The path planning was very simple and was done with a state machine:

This approach worked very well for the task at hand; however, it would need to be generalized to more complex tasks using a more rigorous approach (for example, potential fields).
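A state machine of this kind can be sketched very compactly. The states and transition conditions below are illustrative assumptions (the actual states we used are not listed above), but they show the basic pattern: each control tick, pick the next state from the current sensor readings.

```python
from enum import Enum

class State(Enum):
    DRIVE_TO_GOAL = 1
    AVOID_OBSTACLE = 2
    AT_GOAL = 3

def step(state, obstacle_ahead, distance_to_goal, goal_tolerance=0.5):
    """One planner tick: choose the next state from sensor inputs.

    `obstacle_ahead` would come from the laser scan, `distance_to_goal`
    from GPS/odometry; both names are placeholders for illustration.
    """
    if distance_to_goal < goal_tolerance:
        return State.AT_GOAL
    if obstacle_ahead:
        return State.AVOID_OBSTACLE
    return State.DRIVE_TO_GOAL
```

The appeal of this structure for a competition robot is that every behavior is explicit and easy to debug; the drawback, as noted above, is that it scales poorly as the number of states and transitions grows.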

Automatic Target Recognition

The Kinect camera gives two outputs: an RGB image and a depth map registered to it. Localizing the targets (by color) required finding the colored objects in the RGB image and then looking up the depth of those pixels in the depth map. The final target location was found by simple triangulation using the robot's current pose.
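The triangulation step can be sketched as follows. This is not our competition code: the pinhole-model intrinsics below are approximate published values for the Kinect, and the function assumes a planar robot pose (x, y, heading) with the camera facing forward.

```python
import math

# Assumed Kinect RGB intrinsics (approximate; a real calibration would differ)
FX = 525.0   # focal length in pixels, x-axis
CX = 319.5   # principal point in pixels, x-axis

def target_world_position(u, depth, robot_x, robot_y, robot_theta):
    """Project a target at image column u with measured depth into the world frame.

    The pinhole model gives the bearing to the target in the camera frame;
    rotating by the robot heading and translating by its position yields
    the target's world coordinates.
    """
    bearing = math.atan2(CX - u, FX)   # angle of the target in the camera frame
    wx = robot_x + depth * math.cos(robot_theta + bearing)
    wy = robot_y + depth * math.sin(robot_theta + bearing)
    return wx, wy
```

For a target at the image center, the bearing is zero and the target lands `depth` meters straight ahead of the robot, which is a quick sanity check for the geometry.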

An example of an RGB image and corresponding depth map are shown here:

[Figure: RGB image (left) and corresponding depth map (right)]

The RGB image is converted to HSV format and then thresholded to find the targets.

[Figure: thresholded image]
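The HSV thresholding step can be illustrated with a minimal stdlib-only sketch (in practice one would use OpenCV's `cvtColor` and `inRange`). The red-target bounds below are hypothetical; the real thresholds were tuned by hand for the competition lighting.

```python
import colorsys

# Hypothetical HSV bounds for a red target, on colorsys's 0-1 scale
LO = (0.0, 0.5, 0.3)
HI = (0.05, 1.0, 1.0)

def rgb_to_hsv(pixel):
    """Convert an (R, G, B) pixel in 0-255 to HSV in 0-1."""
    r, g, b = (c / 255.0 for c in pixel)
    return colorsys.rgb_to_hsv(r, g, b)

def threshold(rgb_image, lo=LO, hi=HI):
    """Return a binary mask: 1 where a pixel's HSV value falls in [lo, hi]."""
    return [
        [1 if all(l <= v <= h for v, l, h in zip(rgb_to_hsv(px), lo, hi)) else 0
         for px in row]
        for row in rgb_image
    ]
```

Working in HSV rather than RGB separates color (hue) from brightness (value), which makes the threshold far less sensitive to lighting changes across the course. Note that red hue wraps around 0, so a robust detector would threshold two hue intervals and OR the masks.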

We placed 1st at the competition. Here's the press release along with some photos.

Here's some more about the hardware we used:


Corobot platform


Hokuyo UTM-30LX Laser


Microsoft Kinect Camera