NASA’s Jet Propulsion Laboratory (JPL) has announced that RoboSimian, the laboratory’s four-limbed, ape-like robot, took fifth place in the 2015 DARPA Robotics Challenge. The competition was held in Pomona, California, with nearly two dozen robots and the engineers who created them performing simple tasks in environments that are too dangerous for humans. Japan’s 2011 Fukushima nuclear disaster provided the impetus for the Challenge. Partnering with NASA/JPL in the development of RoboSimian were the California Institute of Technology and the University of California, Santa Barbara.
Velodyne’s 3D LiDAR sensor was central to RoboSimian’s perception system, as well as to that of a robot named “SPOT” from Boston Dynamics. The HDL-32E sensor, which sweeps a full 360° horizontally and covers a roughly 40° vertical field of view, enabled the robot to “look” up, down and around for the most comprehensive view of its environment.
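To make that coverage concrete, the short Python sketch below (an illustration added here, not drawn from the article or from Velodyne documentation) shows how a single LiDAR return, reported as a range, a horizontal azimuth angle and a laser elevation angle, becomes an x, y, z point; the roughly +10° to −30° elevation span in the comments is only an approximation of the 40° vertical field of view mentioned above.

```python
import math

def spherical_to_cartesian(distance_m, azimuth_deg, elevation_deg):
    """Convert a single LiDAR return (range, azimuth, elevation) to x, y, z.

    Azimuth sweeps the full 360 degrees as the sensor head rotates;
    elevation spans roughly +10 to -30 degrees across the individual lasers
    (approximate figures used for illustration, not official specifications).
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = distance_m * math.cos(el)      # projection onto the horizontal plane
    return (horizontal * math.sin(az),          # x: to the right of the sensor
            horizontal * math.cos(az),          # y: forward
            distance_m * math.sin(el))          # z: above or below the sensor

# Example: a return 12 m away, 45 degrees to the right, 5 degrees below horizontal
print(spherical_to_cartesian(12.0, 45.0, -5.0))
```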
Velodyne is recognized worldwide as an authority on high-definition, real-time 3D LiDAR (Light Detection and Ranging) sensors for autonomous vehicle applications, having created enabling technology for the industry. Velodyne introduced multi-channel, real-time 3D LiDAR during the 2004-2005 DARPA Grand Challenge and has since optimized the technology for a range of other applications, from unmanned aerial vehicles and mobile mapping to robotics and factory automation.
In Pomona, points were awarded based on the number of tasks completed and the time it took to complete them. Team KAIST of South Korea took home first-place honors – a $2 million research award. Robots faced such tasks as driving a vehicle and getting in and out of it, negotiating debris blocking a doorway, cutting a hole in a wall, opening a valve and crossing a field strewn with cinderblocks or other debris. Competitors were also asked to perform two surprise tasks – pulling down an electrical switch and unplugging an electrical plug and plugging it back in. Each robot in the Challenge had an “inventory” of objects with which it could interact. Engineers programmed the robots to recognize these objects and perform pre-set actions on them, such as turning a valve or climbing over blocks.
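As a rough illustration of that inventory-plus-pre-set-action approach (the object names and routines here are hypothetical and not taken from any team’s software), a dispatch table mapping recognized objects to canned behaviors might look like this:

```python
# Hypothetical object "inventory": each known object maps to a pre-set action routine.
def turn_valve():
    print("Rotating end effector to open the valve")

def climb_blocks():
    print("Planning limb placements over the cinderblock field")

def pull_switch():
    print("Grasping and pulling the electrical switch")

OBJECT_ACTIONS = {
    "valve": turn_valve,
    "cinderblocks": climb_blocks,
    "switch": pull_switch,
}

def handle_detection(object_label):
    """Dispatch a recognized object to its pre-programmed action, if any."""
    action = OBJECT_ACTIONS.get(object_label)
    if action is None:
        print(f"No pre-set action for '{object_label}'")
    else:
        action()

handle_detection("valve")
```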
Team RoboSimian was in third place after the first day, having scored seven of eight possible points, and ultimately finished fifth overall. RoboSimian moves around on four limbs, making it well suited to traveling over complex terrain, including true climbing.
“The NASA/JPL robot was developed expressly to go where humans cannot, so the element of sight – in this case, LiDAR-generated vision – was absolutely critical,” said Wolfgang Juchmann, Ph.D., Velodyne Director of Sales & Marketing. “Velodyne is a worldwide leader in the development of real-time LiDAR sensors for robotics, as well as an array of other applications, including mobile mapping and UAVs. With a continuous 360-degree sweep of its environment, our lightweight sensors capture data at a rate of almost a million points per second, within a range of 100 meters from whatever danger or obstacle may exist.”
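For a sense of scale behind those quoted figures, here is a small back-of-the-envelope calculation; the point rate and 100-meter range come from the quote above, while the 10 Hz rotation rate is an assumed typical spin speed rather than a stated specification.

```python
# Back-of-the-envelope sketch of the data volume behind the quoted figures.
# The ~1,000,000 points/second rate and 100 m range come from the quote above;
# the 10 Hz rotation rate is an assumed typical spin speed, not a quoted spec.
POINTS_PER_SECOND = 1_000_000
ROTATION_HZ = 10            # assumed sweeps per second
MAX_RANGE_M = 100           # quoted effective range

points_per_sweep = POINTS_PER_SECOND / ROTATION_HZ
print(f"~{points_per_sweep:,.0f} points per 360-degree sweep, "
      f"each within roughly {MAX_RANGE_M} m of the sensor")
```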