Neurotechnology Releases SentiBotics Development Kit 2.0

The new version of SentiBotics provides additional navigation, grasping and higher-level behavior modules for the development of mobile robots.

Vilnius, Lithuania - September 2, 2015 - Neurotechnology, a provider of robotics and high-precision object recognition and biometric identification technologies, today announced the release of the SentiBotics Development Kit 2.0. SentiBotics is designed to help robotics developers and researchers reduce the time and effort required to develop mobile robots by providing the basic robot infrastructure, hardware, component tuning and robotic software functionality. The kit includes a tracked reference mobile robotic platform, a 3D vision system, a modular robotic arm and Robot Operating System (ROS) framework-based software with many proprietary robotics algorithms fully implemented. Full source code, detailed descriptions of the robotics algorithms, hardware documentation and programming samples are also included. See SentiBotics in action.


"This new version of the SentiBotics robotics kit contains not only substantial improvements to existing features but additional functionality as well," said Dr. Povilas Daniusis, Neurotechnology robotics team lead. "These new capabilities not only can save time and effort for developers of mobile manipulation control systems, they also enable SentiBotics to serve as an educational platform for use at universities."

The new SentiBotics Development Kit 2.0 includes motion planning software and accurate 3D models of the robot, enabling the robot to grasp and manipulate objects while avoiding obstacles. The 3D object recognition and object grasping system also allows the robot to grasp arbitrarily oriented objects. In addition, Neurotechnology has added support for a simulation engine that enables robotics developers to work in virtual environments.
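
To illustrate how such a pipeline might be driven from application code, the following minimal Python (rospy) sketch publishes a recognized object's 6-DoF pose as a grasp target. The topic name and the use of a pose message to trigger grasping are illustrative assumptions, not the published SentiBotics API; collision-aware motion planning is assumed to happen inside the grasping pipeline itself.

#!/usr/bin/env python
# Illustrative only: the topic name and the idea of triggering grasping by
# publishing a 6-DoF target pose are assumptions, not the SentiBotics API.
import rospy
from geometry_msgs.msg import PoseStamped

def request_grasp():
    rospy.init_node('grasp_request_example')
    # Hypothetical topic; the real SentiBotics interface may differ.
    pub = rospy.Publisher('/sentibotics/grasp_target', PoseStamped, queue_size=1)
    rospy.sleep(1.0)  # give the publisher time to register with the ROS master

    target = PoseStamped()
    target.header.frame_id = 'base_link'   # pose expressed in the robot base frame
    target.header.stamp = rospy.Time.now()
    target.pose.position.x = 0.45          # object roughly 45 cm in front of the base
    target.pose.position.y = 0.10
    target.pose.position.z = 0.20
    target.pose.orientation.w = 1.0        # identity orientation; any 6-DoF pose is allowed

    pub.publish(target)
    rospy.loginfo('Published grasp target; planning and obstacle avoidance '
                  'are left to the grasping pipeline.')

if __name__ == '__main__':
    request_grasp()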

SentiBotics software includes source code for bio-inspired simultaneous localization and mapping (SLAM), autonomous navigation, 3D object recognition and object grasping systems that are tuned to work with the SentiBotics hardware platform. New features and upgraded components include:

* Object delivery - The robot navigates through its previously mapped locations until it reaches a location where the assigned object was previously recognized. It then attempts to recognize the assigned object directly, repositioning itself until recognition succeeds and grasping is possible. The object is grasped with the robotic arm, placed into the attached box and delivered to the place where the delivery command was given.
* Object grasping in occluded scenes - The SentiBotics robot can perform path planning for its manipulator, avoiding obstacles between the recognized object and the manipulator itself. If necessary, the robot automatically repositions itself to perform the grasping task; for example, it can drive closer or change its heading so that it is optimally placed to pick the object up. The robot can also determine an object's orientation automatically and arrange its manipulator in the way best suited for grasping that particular object, according to the object's position in space.
* Support for simulation engine - Enables the development and testing of robotics algorithms in simulated environments, which can reduce development time.
* 3D models of the robot - SentiBotics includes 3D models of the mobile platform and robotic arm, which are useful for path planning, visualization and simulation.
* Higher-level behavior module - Enables easily programmable, higher-level behavior such as the aforementioned object delivery task, which combines autonomous navigation, object recognition and object grasping (a brief scripting sketch follows this list).
* Additional upgrades - Includes more accurate SLAM and 3D object recognition, improved mobile platform controllers and improved calibration algorithms.
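
As an illustration of the higher-level behavior module, the following Python sketch scripts the delivery sequence described above. The move_base-style navigation action and the Trigger services standing in for the recognition and grasping pipelines are assumptions; the release does not document the SentiBotics interfaces, which are proprietary and may differ.

#!/usr/bin/env python
# Sketch of the object-delivery sequence, assuming a move_base-compatible
# navigation action and hypothetical Trigger services wrapping recognition
# and grasping; the actual SentiBotics interfaces may differ.
import rospy
import actionlib
from actionlib_msgs.msg import GoalStatus
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from std_srvs.srv import Trigger

def goto(nav_client, x, y):
    # Send the mobile platform to (x, y) in the map frame and wait for the result.
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0
    nav_client.send_goal(goal)
    nav_client.wait_for_result()
    return nav_client.get_state() == GoalStatus.SUCCEEDED

def deliver_object():
    rospy.init_node('delivery_example')

    nav = actionlib.SimpleActionClient('move_base', MoveBaseAction)  # assumed action name
    nav.wait_for_server()

    rospy.wait_for_service('/recognize_object')   # hypothetical service names
    rospy.wait_for_service('/grasp_object')
    recognize = rospy.ServiceProxy('/recognize_object', Trigger)
    grasp = rospy.ServiceProxy('/grasp_object', Trigger)

    pickup = (2.0, 1.5)    # mapped location where the object was last seen
    dropoff = (0.0, 0.0)   # place where the delivery command was issued

    if not goto(nav, *pickup):
        rospy.logwarn('Navigation to the pickup location failed')
        return
    if not recognize().success:
        rospy.logwarn('Object not recognized; the real system repositions and retries')
        return
    if not grasp().success:
        rospy.logwarn('Grasping failed')
        return
    goto(nav, *dropoff)    # carry the object back in the attached box

if __name__ == '__main__':
    deliver_object()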

SentiBotics robot hardware includes the following components:

* Tracked mobile platform - Includes motor encoders and an inertial measurement unit (IMU); capable of carrying a payload of up to 10 kg.
* Modular robotic arm with seven degrees of freedom - Based on Dynamixel servo motors, capable of lifting objects up to 0.5 kg. Each motor provides feedback on position, speed and force.
* 3D vision system - Allows the robot to measure distances in a range of 0.15 to 3.5 meters.
* Powerful onboard computer - Intel NUC i5 computer with 8 GB of RAM, 64 GB SSD and 802.11n wireless network interface; comes with pre-installed SentiBotics software.
* Durable 20 Ah LiFePO4 battery with charger.
* Control pad.

All platform components can be easily obtained from manufacturers and suppliers worldwide, so robotics developers and researchers in private industry, universities and other academic institutions can use SentiBotics as reference hardware to build their own units or to incorporate different platforms and materials.

The SentiBotics Development Kit also includes:

* Details of all algorithms used, including descriptions and code documentation.
* ROS-based infrastructure - Allows users to rapidly integrate third-party robotics algorithms and migrate to (or modify) hardware, and provides a unified framework for robotics algorithm development. SentiBotics 2.0 is based on ROS Indigo.
* Step-by-step tutorial - Describes how to set up the robot, connect to it and test its capabilities.
* Hardware documentation and schematic.
* Demonstration videos and code samples (C++ and Python) - Can be used for testing or demonstration of the robot's capabilities, including how to:

** Drive the robot platform and control the robotic arm with the control pad (a minimal sketch of driving the platform from code follows this list).
** Build a map of the environment by simply driving the robot around and use this map for autonomous robot navigation.
** Calibrate the robot.
** Teach the robot to recognize objects.
** Grasp a recognized object with the robotic arm, including cases where the grasping scene contains obstacles.
** Deliver an object that is located in a previously visited place.
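
For orientation, the following minimal Python sample shows how the mobile platform could be driven from code rather than the control pad. The cmd_vel velocity topic follows the common ROS convention and is an assumption about the SentiBotics configuration; the bundled samples may use a different topic name.

#!/usr/bin/env python
# Minimal sketch of driving the tracked platform from code. The 'cmd_vel'
# topic name is the usual ROS convention and is assumed here, not confirmed
# against the SentiBotics samples.
import rospy
from geometry_msgs.msg import Twist

def drive_forward(distance_m=0.5, speed_mps=0.1):
    rospy.init_node('drive_example')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)            # publish velocity commands at 10 Hz
    cmd = Twist()
    cmd.linear.x = speed_mps         # forward velocity for the tracked base

    ticks = int((distance_m / speed_mps) * 10)
    for _ in range(ticks):
        if rospy.is_shutdown():
            break
        pub.publish(cmd)
        rate.sleep()

    pub.publish(Twist())             # zero velocity to stop the platform

if __name__ == '__main__':
    drive_forward()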

SentiBotics and the entire line of Neurotechnology products for AI, robotics, object recognition and biometric identification are available through Neurotechnology or from distributors worldwide. For more information, go to: www.neurotechnology.com.

About Neurotechnology
Neurotechnology is a provider of high-precision software development products for AI, robotics, object recognition and biometric fingerprint, face, iris, palmprint and voice identification. More than 3000 system integrators, security companies and hardware providers integrate Neurotechnology's algorithms into their products, with millions of customer installations worldwide.

Drawing from years of academic research in the fields of neuroinformatics, image processing and pattern recognition, Neurotechnology was founded in 1990 in Vilnius, Lithuania and released its first fingerprint identification system in 1991. Since that time the company has released more than 130 products and version upgrades for identification and verification of objects and personal identity.
