Precision Localization and Mapping for Autonomous Outdoor UGVs

By Mike Ball / 28 Jul 2020

Rainos UGV with Xsens AHRS

Xsens has released a case study showcasing how its MTi-30 AHRS (Attitude and Heading Reference System) has been integrated into the centimeter-level precision position tracking system for a new range of outdoor autonomous UGVs (unmanned ground vehicles).

Read the case study on Xsens’ website.

Innok Robotics is a supplier of modular UGV development platforms and is now developing a new range of standard autonomous UGVs for use in a variety of applications such as robotic ground transportation, automated gardening, and agriculture. The robotic vehicles may have to climb and descend slopes, move just as comfortably on uneven soil or rock as on level asphalt, and operate in all weather conditions and temperatures.

While indoor UGVs usually operate under stable, confined and easily delineated conditions, outdoor environments have no fixed boundaries, conditions may change often as vehicles come and go or vegetation levels change, and the terrain must be mapped and traversed in 3D.

The solution to these challenges uses a technique known as SLAM (Simultaneous Localization and Mapping), involving the continuous, high-precision tracking of the vehicle from a known starting point in all three dimensions. The Xsens MTi-30 AHRS measures 3D acceleration, fusing this raw data with measurements from its gyroscopes and magnetometers to produce estimates of roll, pitch and yaw, updated at rates of up to 400 Hz.
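
The MTi-30’s onboard fusion filter is proprietary, but the general principle can be illustrated with a simple complementary filter: gyroscope rates are integrated for short-term responsiveness, while the accelerometer’s gravity measurement corrects long-term drift (the magnetometer plays the equivalent role for yaw, omitted here for brevity). The Python sketch below is purely illustrative; the variable names, blending weight and fixed 400 Hz step are assumptions, not Xsens’ implementation.

```python
import math

DT = 1.0 / 400.0   # 400 Hz update rate, matching the figure quoted for the MTi-30
ALPHA = 0.98       # weight given to the integrated gyro estimate (illustrative)

def fuse_roll_pitch(roll, pitch, gyro, accel):
    """One complementary-filter step: gyro integration corrected by gravity.

    gyro  -- angular rates (p, q, r) in rad/s
    accel -- specific force (ax, ay, az) in m/s^2
    """
    # Short-term: propagate the previous attitude with the gyro rates.
    roll_gyro = roll + gyro[0] * DT
    pitch_gyro = pitch + gyro[1] * DT

    # Long-term: roll and pitch implied by the direction of gravity.
    roll_acc = math.atan2(accel[1], accel[2])
    pitch_acc = math.atan2(-accel[0], math.hypot(accel[1], accel[2]))

    # Blend the two: the gyro dominates instant to instant, while the
    # accelerometer slowly pulls the estimate back and removes drift.
    roll = ALPHA * roll_gyro + (1.0 - ALPHA) * roll_acc
    pitch = ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_acc
    return roll, pitch
```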

Innok’s Rainos UGV, specially designed to water the floral displays and flower beds planted in graveyards, is an example of a system that utilises this SLAM technology. In the initialization phase, the operator ‘shows’ Rainos how to perform the task, using a remote control to guide the vehicle along all permitted routes and tag the locations of graves to be watered. The known start point from which each irrigation session begins is also recorded.
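
Neither Xsens nor Innok publishes the data format produced by this teach phase; the sketch below only illustrates the kind of record it implies, namely a start point, a polyline of permitted route poses and a set of tagged watering locations. All names and structures here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Pose2D:
    x: float      # metres east of the recorded start point (illustrative frame)
    y: float      # metres north of the recorded start point
    yaw: float    # heading in radians

@dataclass
class TaughtRoute:
    """Hypothetical record built while the operator drives the vehicle."""
    start: Pose2D
    route: list[Pose2D] = field(default_factory=list)             # permitted path
    watering_points: list[Pose2D] = field(default_factory=list)   # tagged graves

    def record_pose(self, pose: Pose2D) -> None:
        self.route.append(pose)

    def tag_watering_point(self, pose: Pose2D) -> None:
        self.watering_points.append(pose)
```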

The Rainos UGV will then create its own map of the graveyard, using inputs from three sources (a simplified sketch of how such inputs can be combined follows the list):

  • A LiDAR sensor which builds a 3D map of the scene
  • Odometry – motion data derived from the turning of the wheels
  • A motion tracker, which provides 3D motion data and heading information
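
The case study does not describe how these three inputs are combined, but a common pattern is to use the odometry- and heading-derived pose to project each LiDAR return into a fixed world frame and mark the corresponding cell of an occupancy grid. The sketch below is a simplified 2D illustration; the names, cell size and map extent are assumptions.

```python
import math
import numpy as np

CELL_SIZE = 0.05      # 5 cm grid cells (chosen for illustration)
GRID_SIZE = 2000      # 100 m x 100 m map

# 0 = unknown/free, 1 = occupied (a real system would store log-odds)
grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint8)

def insert_scan(grid, pose, ranges, angles):
    """Mark cells hit by a 2D LiDAR scan taken at the given vehicle pose.

    pose   -- (x, y, yaw): position from wheel-odometry dead reckoning,
              heading from the AHRS
    ranges -- measured distances in metres, one per beam
    angles -- beam angles in radians, relative to the vehicle
    """
    x, y, yaw = pose
    for r, a in zip(ranges, angles):
        # Transform the beam endpoint into the world frame.
        wx = x + r * math.cos(yaw + a)
        wy = y + r * math.sin(yaw + a)
        # Convert to grid indices, with the map centred on the start point.
        ix = int(math.floor(wx / CELL_SIZE)) + GRID_SIZE // 2
        iy = int(math.floor(wy / CELL_SIZE)) + GRID_SIZE // 2
        if 0 <= ix < GRID_SIZE and 0 <= iy < GRID_SIZE:
            grid[iy, ix] = 1
```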

After initialization is finished, the Rainos UGV has a complete 3D map of the graveyard, the routes by which it is able to travel round the graveyard, and the locations in which it must perform watering operations. It can determine its position by dead reckoning from the known starting point, using accurate, low-drift motion and heading data from the MTi-30 AHRS.
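
The case study does not give the details of Innok’s estimator, but the dead-reckoning step it describes can be sketched as follows: the distance travelled comes from the wheel odometry, the heading from the AHRS, and the 2D position is integrated from the known start point. The names and sample values below are illustrative only.

```python
import math

def dead_reckon(pose, wheel_distance, yaw):
    """Advance a 2D position by one odometry step.

    pose           -- (x, y) position in metres, measured from the
                      known, recorded start point
    wheel_distance -- distance travelled since the last update,
                      derived from the wheel encoders
    yaw            -- current heading in radians, from the AHRS
    """
    x, y = pose
    x += wheel_distance * math.cos(yaw)
    y += wheel_distance * math.sin(yaw)
    return x, y

# Illustrative use: start at the recorded origin and integrate each update.
pose = (0.0, 0.0)
for distance, heading in [(0.10, 0.00), (0.10, 0.02), (0.10, 0.05)]:
    pose = dead_reckon(pose, distance, heading)
```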

To find out more about how Xsens’ inertial sensor technology can be applied to SLAM applications for robotics, read the case study on Xsens’ website.
