Neurotechnology, a developer of robotics, high-precision object recognition and biometric identification technologies, has announced the release of the SentiBotics Development Kit 2.0. SentiBotics is designed to help robotics developers and researchers reduce the time and effort required to develop mobile robots by providing basic robot infrastructure: hardware, component tuning and robotic software functionality.
The kit includes a tracked reference mobile robotic platform, a 3D vision system, a modular robotic arm and Robot Operating System (ROS) framework-based software with many proprietary robotics algorithms fully implemented. Full source code, detailed descriptions of the robotics algorithms, hardware documentation and programming samples are also included.
“This new version of the SentiBotics robotics kit contains not only substantial improvements to existing features but additional functionality as well,” said Dr. Povilas Daniusis, Neurotechnology robotics team lead. “These new capabilities not only can save time and effort for developers of mobile manipulation control systems, they also enable SentiBotics to serve as an educational platform for use at universities.”
The new SentiBotics Development Kit 2.0 includes motion planning software and accurate 3D models of the robot, enabling the robot to grasp and manipulate objects while avoiding obstacles. The 3D object recognition and object grasping system also allows the robot to grasp arbitrarily oriented objects. In addition, Neurotechnology has added support for a simulation engine that enables robotics developers to work in virtual environments.
SentiBotics software includes source code of bio-inspired simultaneous localization and mapping (SLAM), autonomous navigation, 3D object recognition and object grasping systems that are tuned to work with the SentiBotics hardware platform.
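The SLAM and navigation source code mentioned above is part of the kit itself, but the core bookkeeping behind any such system, integrating wheel-encoder odometry into a pose estimate that mapping later corrects, can be sketched in plain Python. All names below are illustrative and are not part of the SentiBotics API:

```python
import math

def integrate_odometry(pose, distance, delta_theta):
    """Dead-reckoning update: advance an (x, y, theta) pose by an
    encoder-measured arc. This is the prediction step that a SLAM
    system would later correct using vision or IMU measurements."""
    x, y, theta = pose
    # Use the mid-arc heading for a better straight-line approximation.
    heading = theta + delta_theta / 2.0
    x += distance * math.cos(heading)
    y += distance * math.sin(heading)
    theta = (theta + delta_theta) % (2 * math.pi)
    return (x, y, theta)

# Drive 1 m straight, then turn 90 degrees while driving another 1 m.
pose = (0.0, 0.0, 0.0)
pose = integrate_odometry(pose, 1.0, 0.0)
pose = integrate_odometry(pose, 1.0, math.pi / 2)
```

In a real system this estimate drifts with wheel slip, which is exactly why the kit fuses it with IMU data and the bio-inspired SLAM correction.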
New features and upgraded components include:
- Object delivery – The robot navigates through its previously mapped locations until it reaches a location where an assigned object was previously recognized. The robot tries to directly recognize the assigned object and repositions itself until recognition occurs and grasping is possible. The object is then grasped with the robotic arm, placed into the attached box and delivered to the place where the delivery command was given.
- Object grasping in occluded scenes – The SentiBotics robot can perform path planning for its manipulator, avoiding obstacles between the recognized object and the manipulator itself. If necessary, the robot automatically repositions itself to perform the grasping task. For example, it can drive closer or adjust its angle to the object until it is optimally positioned for pickup. The robot can automatically determine an object's orientation and arrange its manipulator in the way best suited to grasping that particular object, according to the object's position in space.
- Support for simulation engine – Enables the development and testing of robotics algorithms in simulated environments, which can reduce development time.
- 3D models of the robot – SentiBotics includes 3D models of the mobile platform and robotic arm which are useful for path planning, visualization and simulation.
- Higher level behavior module – Enables easily programmable, higher-level behavior such as the aforementioned object delivery task, which includes autonomous navigation, object recognition and object grasping.
- Additional upgrades – Includes more accurate SLAM and 3D object recognition, improved mobile platform controllers and improved calibration algorithms.
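The object-delivery behavior described above chains navigation, recognition and grasping into one higher-level task. A hypothetical sketch of such a behavior loop in plain Python (the function names and callbacks are illustrative, not the SentiBotics API):

```python
def deliver_object(waypoints, recognize, grasp, navigate_home):
    """Visit previously mapped waypoints until the target object is
    recognized, grasp it, then return to where the command was given.
    `recognize(wp)` and `grasp()` stand in for the robot's vision and
    manipulation subsystems."""
    for wp in waypoints:
        if recognize(wp):       # reposition until recognition succeeds
            if grasp():         # arm picks the object, places it in the box
                navigate_home() # deliver to the command location
                return "delivered"
            return "grasp_failed"
    return "not_found"

# Mock run: the object sits at the second waypoint and grasping succeeds.
events = []
result = deliver_object(
    ["kitchen", "lab"],
    recognize=lambda wp: wp == "lab",
    grasp=lambda: True,
    navigate_home=lambda: events.append("home"),
)
# result == "delivered"; events == ["home"]
```

Structuring the task this way is what makes the higher-level behavior module "easily programmable": each step delegates to the navigation, recognition and grasping subsystems underneath.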
SentiBotics robot hardware includes the following components:
- Tracked mobile platform – Includes motor encoders and an inertial measurement unit (IMU), capable of carrying a payload of up to 10 kg.
- Modular robotic arm with seven degrees of freedom – Based on Dynamixel servo motors, capable of lifting objects up to 0.5 kg. Each motor provides feedback on position, speed and force.
- 3D vision system – Allows the robot to measure distances in a range of 0.15 to 3.5 meters.
- Powerful onboard computer – Intel NUC i5 computer with 8 GB of RAM, 64 GB SSD drive and 802.11n wireless network interface; comes with pre-installed SentiBotics software.
- Durable 20 Ah LiFePO4 battery with charger.
- Control pad.
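The 3D vision system's 0.15 to 3.5 meter working range means that depth readings outside that window should be treated as invalid before they feed into mapping or grasp planning. A minimal, library-free sketch of such a validity filter (the function name is illustrative; only the range constants come from the specs above):

```python
DEPTH_MIN_M = 0.15  # near limit of the 3D vision system
DEPTH_MAX_M = 3.5   # far limit of the 3D vision system

def valid_depths(readings):
    """Keep only depth measurements inside the sensor's working range;
    values that are too close, too far, or sensor dropouts are dropped."""
    return [d for d in readings if DEPTH_MIN_M <= d <= DEPTH_MAX_M]

samples = [0.05, 0.2, 1.8, 3.6, 2.4]
print(valid_depths(samples))  # → [0.2, 1.8, 2.4]
```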
All platform components can be easily obtained from manufacturers and suppliers worldwide, so robotics developers and researchers in private industry, universities and other academic institutions can use SentiBotics as reference hardware to build their own units or to incorporate different platforms and materials.
The SentiBotics Development Kit also includes:
- Details of all algorithms used, including descriptions and code documentation.
- ROS-based infrastructure – Allows users to rapidly integrate third-party robotics algorithms and to migrate to other hardware (or modify existing hardware), and provides a unified framework for robotics algorithm development. SentiBotics 2.0 is based on the ROS Indigo release.
- Step-by-step tutorial – Describes how to set up the robot, connect to it and test its capabilities.
- Hardware documentation and schematic.
- Demonstration videos and code samples (C++ and Python) – Can be used for testing or demonstration of the robot’s capabilities, including how to:
  - Drive the robot platform and control the robotic arm with the control pad.
  - Build a map of the environment by simply driving the robot around, and use this map for autonomous robot navigation.
  - Calibrate the robot.
  - Teach the robot to recognize objects.
  - Grasp a recognized object with the robotic arm, including cases where the grasping scene contains obstacles.
  - Deliver an object that is located in a previously visited place.
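The mapping tutorial above ("build a map by simply driving the robot around") reduces, at its simplest, to marking traversed cells in an occupancy grid. A toy illustration in plain Python, independent of the kit's actual ROS interfaces (the grid layout and cell size are assumptions for the example):

```python
def mark_path(grid, poses, cell_size=0.5):
    """Mark each visited (x, y) pose in meters as free space ('.') in a
    grid of unknown cells ('?'). Real SLAM also fuses range-sensor data;
    this only shows how driven poses turn into map cells."""
    for x, y in poses:
        row = int(y / cell_size)
        col = int(x / cell_size)
        if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
            grid[row][col] = "."
    return grid

# A 2x4 map, 0.5 m per cell, and a short drive along the bottom edge.
grid = [["?"] * 4 for _ in range(2)]
driven = [(0.1, 0.1), (0.6, 0.1), (1.2, 0.1), (1.6, 0.6)]
mark_path(grid, driven)
# grid is now [['.', '.', '.', '?'], ['?', '?', '?', '.']]
```

The resulting map is exactly what the autonomous navigation step then plans paths over.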