Sporting Velodyne's 3D LiDAR Sensor, NASA/JPL's 'RoboSimian' Competes in 2015 DARPA Robotics Challenge
Industry-Leading HDL-32E Sensor Also Deployed on Boston Dynamics' SPOT; Helps Robots 'See' Danger Lurking in the Aftermath of a Disaster
MORGAN HILL, Calif., June 15, 2015
With Velodyne's HDL-32E LiDAR sensor mounted atop its four shoulders, RoboSimian nabbed fifth place among nearly two dozen robots participating in the DARPA Robotics Challenge, held in Pomona, Calif., June 6-7. There, robots and the engineers who created them performed simple tasks in environments that are too dangerous for humans. Japan's 2011 Fukushima nuclear plant explosion provided the impetus for the Challenge. Partnering with NASA/JPL in the development of RoboSimian were the California Institute of Technology and the University of California, Santa Barbara.
Velodyne's 3D LiDAR sensor was central to RoboSimian's perception system, as well as that of a robot named "SPOT" from Boston Dynamics (http://bit.ly/1KBTsxL). SPOT proved to be a spectator favorite, making an appearance several times during the three days of competition. The HDL-32E sensor, which is capable of viewing a full 360° with a 40° vertical spread, enables the robot to "look" up, down and around for the most comprehensive view of its environment.
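For readers curious how such a sweep becomes a 3D point cloud: each return from a spinning LiDAR can be treated as a range paired with a horizontal (azimuth) and vertical (elevation) angle, and the standard conversion to Cartesian coordinates looks roughly like the sketch below. This is an illustrative example only; the function name and sample values are hypothetical and do not come from Velodyne's driver software.

import math

def polar_to_xyz(distance_m, azimuth_deg, elevation_deg):
    # Convert one hypothetical LiDAR return (range, horizontal angle,
    # vertical angle) into Cartesian coordinates in the sensor frame.
    az = math.radians(azimuth_deg)    # 0-360 degrees around the sensor
    el = math.radians(elevation_deg)  # within the roughly 40-degree vertical spread
    horizontal = distance_m * math.cos(el)
    return (horizontal * math.sin(az),   # x: to the sensor's right
            horizontal * math.cos(az),   # y: ahead of the sensor
            distance_m * math.sin(el))   # z: above the sensor

# Example: a return 25 m away, 90 degrees to the right, 10 degrees above level
print(polar_to_xyz(25.0, 90.0, 10.0))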
Velodyne is recognized worldwide as the standard for high-definition, real-time 3D LiDAR (Light Detection and Ranging) sensors for autonomous vehicle applications, having created enabling technology for the industry. Velodyne introduced multi-channel, real-time 3D LiDAR during the 2004-2005 DARPA Grand Challenge and has since optimized the technology for a range of other applications, from unmanned aerial vehicles and mobile mapping to robotics and factory automation.
In Pomona, points were awarded based on the number of tasks completed and the time it took to complete them. Team KAIST of South Korea took home first-place honors - a $2 million research award (http://bit.ly/1S2sIKG). Robots faced such tasks as driving a vehicle and getting in and out of it, negotiating debris blocking a doorway, cutting a hole in a wall, opening a valve and crossing a field strewn with cinderblocks and other debris. Competitors were also asked to perform two surprise tasks - pulling down an electrical switch and unplugging and re-plugging an electrical plug. Each robot in the Challenge had an "inventory" of objects with which it could interact. Engineers programmed the robots to recognize these objects and perform pre-set actions on them, such as turning a valve or climbing over blocks.
Team RoboSimian was in third place after the first day, scoring seven of eight possible points, and ultimately finished fifth overall. RoboSimian moves around on four limbs, making it best suited to travel over complex terrain, including true climbing (http://1.usa.gov/1B0N6Ys).
"The NASA/JPL robot was developed expressly to go where humans can not, so the element of sight - in this case, LiDAR-generated vision - was absolutely critical," said Wolfgang Juchmann, Ph.D., Velodyne Director of Sales & Marketing. "Velodyne is a worldwide leader in the development of real-time LiDAR sensors for robotics, as well as an array of other applications, including mobile mapping and UAVs. With a continuous 360-degree sweep of its environment, our lightweight sensors capture data at a rate of almost a million points per second, within a range of 100 meters from whatever danger or obstacle may exist."
About the DARPA Robotics Challenge
According to the Department of Defense, some disasters, due to grave risks to the health and wellbeing of rescue and aid workers, prove too great in scale or scope for timely and effective human response. The DARPA Robotics Challenge (http://www.theroboticschallenge.org) seeks to address the problem by promoting innovation in human-supervised robotic technology for disaster-response operations. The primary technical goal of the DRC is to develop human-supervised ground robots capable of executing complex tasks in dangerous, degraded, human-engineered environments. Competitors in the DRC are developing robots that can utilize standard tools and equipment commonly available in human environments, ranging from hand tools to vehicles. To achieve its goal, the DRC is advancing the state of the art of supervised autonomy, mounted and dismounted mobility, and platform dexterity, strength, and endurance. Improvements in supervised autonomy, in particular, aim to enable better control of robots by non-expert supervisors and allow effective operation despite degraded communications (low bandwidth, high latency, intermittent connection). The California Institute of Technology manages JPL for NASA.
About Velodyne LiDAR
Founded in 1983 and based in California's Silicon Valley, Velodyne Acoustics, Inc. is a diversified technology company known worldwide for its high-performance audio equipment and real-time LiDAR sensors. The company's LiDAR division evolved after founder/inventor David Hall competed in the 2004-2005 DARPA Grand Challenge using stereovision technology. Based on his experience during this challenge, Hall recognized the limitations of stereovision and developed the HDL-64 high-resolution LiDAR sensor. Velodyne subsequently released its compact, lightweight HDL-32E sensor, available for many applications including UAVs, and the new VLP-16 LiDAR Puck, a 16-channel real-time LiDAR sensor that is both substantially smaller and dramatically less expensive than previous-generation sensors. Market research firm Frost & Sullivan has honored the company and the VLP-16 with its 2015 North American Automotive ADAS (Advanced Driver Assistance System) Sensors Product Leadership Award. Since 2007, Velodyne's LiDAR division has emerged as the leading developer, manufacturer and supplier of real-time LiDAR sensor technology used in a variety of commercial applications including autonomous vehicles, vehicle safety systems, 3D mobile mapping, 3D aerial mapping and security. For more information, visit http://www.velodynelidar.com.