Neurotechnology Announces SentiBotics Mobile Robotics Development Kit 2.0

By Mike Ball / 02 Sep 2015

Neurotechnology, a developer of robotics and high-precision object recognition and biometric identification technologies, has announced the release of the SentiBotics Development Kit 2.0. SentiBotics is designed to help robotics developers and researchers reduce the time and effort required to develop mobile robots by providing the basic robot infrastructure, hardware, component tuning and robotic software functionality.

The kit includes a tracked reference mobile robotic platform, a 3D vision system, a modular robotic arm and Robot Operating System (ROS) framework-based software with many proprietary robotics algorithms fully implemented. Full source code, detailed descriptions of the robotics algorithms, hardware documentation and programming samples are also included.

“This new version of the SentiBotics robotics kit contains not only substantial improvements to existing features but additional functionality as well,” said Dr. Povilas Daniusis, Neurotechnology robotics team lead. “These new capabilities not only can save time and effort for developers of mobile manipulation control systems, they also enable SentiBotics to serve as an educational platform for use at universities.”

The new SentiBotics Development Kit 2.0 includes motion planning software and accurate 3D models of the robot, enabling the robot to grasp and manipulate objects while avoiding obstacles. The 3D object recognition and object grasping system also allows the robot to grasp arbitrarily oriented objects. In addition, Neurotechnology has added the ability to use a simulation engine that enables robotics developers to work in virtual environments.
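
The kit's own planners and grasping algorithms are proprietary, but the general pattern described here, planning a collision-free arm motion to a grasp pose while obstacles are registered in a planning scene, can be sketched with the generic ROS MoveIt Python API. The planning group name, frames and poses below are illustrative assumptions, not the kit's actual interface:

    # Illustrative sketch only: plans an arm motion around a registered obstacle
    # using the generic MoveIt Python API. Group names, frames and poses are
    # assumptions, not taken from the SentiBotics software.
    import sys
    import rospy
    import moveit_commander
    from geometry_msgs.msg import PoseStamped

    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("grasp_planning_sketch")

    scene = moveit_commander.PlanningSceneInterface()
    arm = moveit_commander.MoveGroupCommander("arm")  # hypothetical planning group

    # Register a box-shaped obstacle between the arm and the target object.
    obstacle = PoseStamped()
    obstacle.header.frame_id = arm.get_planning_frame()
    obstacle.pose.position.x = 0.4
    obstacle.pose.position.z = 0.2
    obstacle.pose.orientation.w = 1.0
    scene.add_box("obstacle", obstacle, size=(0.1, 0.1, 0.3))
    rospy.sleep(1.0)  # give the planning scene time to update

    # Plan and execute a collision-free motion to an example pre-grasp pose.
    target = PoseStamped()
    target.header.frame_id = arm.get_planning_frame()
    target.pose.position.x = 0.55
    target.pose.position.z = 0.15
    target.pose.orientation.w = 1.0
    arm.set_pose_target(target)
    arm.go(wait=True)
    arm.stop()
    arm.clear_pose_targets()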

SentiBotics software includes source code of bio-inspired simultaneous localization and mapping (SLAM), autonomous navigation, 3D object recognition and object grasping systems that are tuned to work with the SentiBotics hardware platform.
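
As a rough illustration of how such a navigation stack is typically driven under ROS Indigo, the sketch below sends an autonomous navigation goal on a previously built map through a move_base-style action interface; the action name and coordinates are assumptions rather than the SentiBotics API:

    # Minimal sketch of commanding an autonomous move on a previously built map.
    # The "move_base" action name is the generic ROS convention and is assumed here;
    # SentiBotics' own navigation interface may differ.
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    rospy.init_node("navigation_goal_sketch")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"          # frame produced by the SLAM module
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 2.0            # example coordinates on the map
    goal.target_pose.pose.position.y = 1.0
    goal.target_pose.pose.orientation.w = 1.0

    client.send_goal(goal)
    client.wait_for_result()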

New features and upgraded components include:

  • Object delivery – The robot navigates through its previously-mapped locations until it reaches a location where an assigned object was previously recognized. The robot tries to directly recognize the assigned object and will reposition itself until recognition occurs and grasping is possible. The object is then grasped using the robotic arm, placed into the attached box and delivered to the place where the delivery command was given (a rough sketch of this control flow follows the list below).
  • Object grasping in occluded scenes – The SentiBotics robot can perform path planning for its manipulator, avoiding obstacles that might be between the recognized object and the manipulator itself. If necessary, the robot can automatically reposition itself in order to perform the grasping task. For example, the robot can drive closer or reorient its angle to the object such that it is in the optimal position for picking it up. The SentiBotics robot can automatically determine an object’s orientation and arrange its manipulator in a way best suited for grasping a particular object according to that object’s position in space.
  • Support for simulation engine – Enables the development and testing of robotics algorithms in simulated environments, which can reduce development time.
  • 3D models of the robot – SentiBotics includes 3D models of the mobile platform and robotic arm which are useful for path planning, visualization and simulation.
  • Higher level behavior module – Enables easily programmable, higher-level behavior such as the aforementioned object delivery task, which includes autonomous navigation, object recognition and object grasping.
  • Additional upgrades – More accurate SLAM and 3D object recognition, along with improved mobile platform controllers and calibration algorithms.
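
As referenced in the object delivery item above, that behavior chains navigation, recognition and grasping into one higher-level routine. The following is a rough, hypothetical control-flow sketch; every call on the RobotInterface stub is a placeholder, not a SentiBotics function:

    # Rough control-flow sketch of the object-delivery behavior described above.
    # RobotInterface is a hypothetical stub, not the SentiBotics API.
    class RobotInterface(object):
        """Stub standing in for the kit's navigation, recognition and grasping calls."""
        def navigate_to(self, location):
            pass
        def try_recognize(self, object_id):
            return None  # would return a detection with pose/graspability information
        def reposition_for_grasp(self, detection):
            pass
        def grasp(self, detection):
            pass
        def place_in_box(self):
            pass

    def deliver_object(robot, object_id, mapped_locations, home_location):
        """Search mapped locations for the object, grasp it and bring it back."""
        for location in mapped_locations:
            robot.navigate_to(location)                 # autonomous navigation on the SLAM map
            detection = robot.try_recognize(object_id)  # 3D object recognition
            # Reposition until the object is seen from a pose where grasping is possible.
            while detection is not None and not detection.graspable:
                robot.reposition_for_grasp(detection)
                detection = robot.try_recognize(object_id)
            if detection is None:
                continue
            robot.grasp(detection)            # arm motion planning + grasping
            robot.place_in_box()              # place the object into the attached box
            robot.navigate_to(home_location)  # return to where the command was given
            return True
        return False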

SentiBotics robot hardware includes the following components:

  • Tracked mobile platform – Includes motor encoders and an inertial measurement unit (IMU), capable of carrying a payload of up to 10 kg.
  • Modular robotic arm with seven degrees of freedom – Based on Dynamixel servo motors, capable of lifting objects up to 0.5 kg. Each motor provides feedback on position, speed and force.
  • 3D vision system – Allows the robot to measure distances in a range of 0.15 to 3.5 meters.
  • Powerful onboard computer – Intel NUC i5 computer with 8 GB of RAM, 64 GB SSD, 802.11n wireless network interface; comes with pre-installed SentiBotics software.
  • Durable 20 Ah LiFePO4 battery with charger.
  • Control pad.

All platform components can be easily obtained from manufacturers and suppliers worldwide, so robotics developers and researchers in private industry, universities and other academic institutions can use SentiBotics as reference hardware to build their own units or to incorporate different platforms and materials.

The SentiBotics Development Kit also includes:

  • Details of all algorithms used, including descriptions and code documentation.
  • ROS-based infrastructure – Allows users to rapidly integrate third-party robotics algorithms, migrate to other hardware or modify the existing hardware, and provides a unified framework for robotic algorithm development (a minimal integration sketch follows this list). SentiBotics 2.0 is based on ROS Indigo.
  • Step-by-step tutorial – Describes how to set up the robot, connect to it and test its capabilities.
  • Hardware documentation and schematic.
  • Demonstration videos and code samples (C++ and Python) – Can be used for testing or demonstration of the robot’s capabilities, including how to:
    - Drive the robot platform and control the robotic arm with the control pad.
    - Build a map of the environment by simply driving the robot around, and use this map for autonomous robot navigation.
    - Calibrate the robot.
    - Teach the robot to recognize objects.
    - Grasp a recognized object with the robotic arm, including cases where the grasping scene contains obstacles.
    - Deliver an object that is located in a previously-visited place.
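
To give a feel for what the ROS-based infrastructure implies for integration, the sketch below shows a minimal third-party node that subscribes to the 3D vision sensor's point cloud and publishes a result; the topic names are assumptions, not the kit's actual topics:

    # Minimal example of how a third-party algorithm plugs into a ROS-based system:
    # a node subscribing to the 3D vision sensor's point cloud and republishing a
    # result. Topic names are assumptions, not SentiBotics' actual topics.
    import rospy
    from sensor_msgs.msg import PointCloud2
    from std_msgs.msg import String

    def cloud_callback(cloud):
        # A real third-party algorithm (e.g. a custom object detector) would
        # process the point cloud here; this stub just reports its size.
        result_pub.publish("received cloud with %d bytes" % len(cloud.data))

    rospy.init_node("third_party_algorithm")
    result_pub = rospy.Publisher("detection_results", String, queue_size=1)
    rospy.Subscriber("camera/depth/points", PointCloud2, cloud_callback)  # assumed topic
    rospy.spin()

Because the data is exchanged over standard ROS messages, a node like this can in principle run unchanged against either the real robot or the simulation engine.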