The 2015 DRC is a competition consisting of several disaster-related tasks for robots to perform. RoboSimian will use Velodyne’s spinning LiDAR sensor as a key element of the robot’s perception system. The sensor rotates a full 360° up to 20 times per second and gives the robot a vertical field of view from 10° above to 30° below horizontal. In the trials round last December, the JPL team won a spot to compete in the finals, which will be held in Pomona, Calif., in June 2015.
In the finals, the robot will be faced with such tasks as driving a vehicle and getting in and out of it, negotiating debris blocking a doorway, cutting a hole in a wall, opening a valve and crossing a field with cinderblocks or other debris. Organizers have also promised a surprise task.
RoboSimian moves around on four limbs, making it well suited to traversing complex terrain, including true climbing. Each robot in the Challenge has an “inventory” of objects with which it can interact. Engineers have to program the robots to recognize these objects and perform pre-set actions on them, such as turning a valve or climbing over blocks.
“The NASA/JPL robot was developed expressly to go where humans could not, so the element of sight – in this case, LiDAR-generated vision – is absolutely critical,” said Wolfgang Juchmann, Ph.D., Velodyne Director of Sales & Marketing. “We’re recognized worldwide for developing real-time LiDAR sensors for all kinds of autonomous applications, including 3D mapping and surveillance as well as robotics. With a continuous 360-degree sweep of its environment, our lightweight sensors capture data at a rate of almost a million points per second, within a range of 100 meters – ideal for taking on obstacle courses, wherever they may be.”
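To put those figures in perspective, a short sketch below works out what the quoted specs imply about scan density. The spin rate and point rate come from the article; the 64-channel laser count is an assumption about the sensor model, not something the article states.

```python
# Back-of-the-envelope scan density implied by the quoted specs.
# Spin rate (20 rev/s) and point rate (~1M pts/s) are from the article;
# the 64-laser channel count is an assumed sensor configuration.
POINT_RATE = 1_000_000   # points per second (approx., per the article)
SPIN_RATE = 20           # revolutions per second (max, per the article)
CHANNELS = 64            # assumed number of laser channels

points_per_rev = POINT_RATE / SPIN_RATE          # points in one 360° sweep
firings_per_channel = points_per_rev / CHANNELS  # firings per laser per sweep
horiz_resolution = 360.0 / firings_per_channel   # degrees between firings

print(f"{points_per_rev:.0f} points per revolution")      # 50000 points per revolution
print(f"{horiz_resolution:.2f}° horizontal resolution")   # 0.46° horizontal resolution
```

At the maximum spin rate, each 360° sweep still contains on the order of tens of thousands of points, which is why the sensor remains useful for obstacle detection even while rotating quickly.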
JPL researchers, together with partners at the California Institute of Technology and the University of California, Santa Barbara, are currently working to increase RoboSimian’s walking speed.