Aptima Develops Sense-Making System for Robots

By Caroline Rees / 29 Apr 2013


Aptima, which applies expertise in human-inspired machine systems, has developed “Cognitive Patterns”, a knowledge-based, collaborative sense-making system that enables robots to better recognize, adapt to, and intelligently work with their human counterparts in novel situations. The Cognitive Patterns prototype was developed for DARPA’s Defense Sciences Office and the US Army Research Laboratory’s Cognitive Robotics team. The ROS-compliant technology is expected to advance a new class of robots with higher-level decision-making, in turn lowering pre-mission preparation costs, minimizing the need for human intervention, and increasing mission flexibility.

Cognitive Patterns’ architecture is inspired by the neuroscience of human perception and sense-making. First, the high-level knowledge on board the robot is combined with lower level sensor data so the robot can recognize a situation as much as possible on its own, just as humans do. Second, when confronted with ambiguous information or scenarios that don’t match its current knowledge, the system blends existing concepts to generate new knowledge for the robot, akin to the sense-making mind. Networked with the robot, the human operator can adjust how it categorizes objects, people, and environments, boosting the robot’s high-level knowledge and ability to draw conclusions from its sensory data.
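The two mechanisms described above can be illustrated with a minimal sketch. This is not Aptima's implementation; it is a hypothetical Bayesian-style example in which high-level category priors (top-down knowledge) are combined with sensor likelihoods (bottom-up data), and two known concepts are blended into a provisional new one when neither explains the observation on its own. All names, categories, and the threshold are illustrative assumptions.

```python
# Illustrative sketch only -- not Aptima's actual algorithm.
# Top-down knowledge = priors over known categories.
# Bottom-up data = per-category likelihoods from the sensors.

def fuse(priors, likelihoods):
    """Combine high-level knowledge (priors) with sensor evidence
    (likelihoods) into a normalized posterior over known categories."""
    post = {c: priors[c] * likelihoods.get(c, 0.0) for c in priors}
    total = sum(post.values()) or 1.0
    return {c: p / total for c, p in post.items()}

def interpret(priors, likelihoods, threshold=0.6):
    """Recognize the situation if one category clearly dominates;
    otherwise blend the top two concepts into a provisional new one,
    akin to the concept blending the article describes."""
    post = fuse(priors, likelihoods)
    best, second = sorted(post, key=post.get, reverse=True)[:2]
    if post[best] >= threshold:
        return best                # recognized autonomously
    return f"{best}+{second}"      # blended concept for a novel case

priors = {"door": 0.5, "window": 0.3, "hatch": 0.2}
# Unambiguous sensor data: recognized outright.
print(interpret(priors, {"door": 0.9, "window": 0.1, "hatch": 0.1}))  # door
# Ambiguous sensor data: a blended concept is generated instead.
print(interpret(priors, {"door": 0.5, "window": 0.6, "hatch": 0.1}))  # door+window
```

The key property mirrored here is that the robot commits on its own when top-down and bottom-up signals agree, and falls back to generating new knowledge only when they do not.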


“Even with their state-of-the-art sensors, robots aren’t capable of recognizing what they haven’t seen before, which severely limits their usefulness,” said Webb Stacy, Aptima’s Principal Investigator for the Cognitive Patterns contract. “They’re designed to operate from the bottom up. If the images hitting their cameras don’t match what’s in their brains, they’re unable to understand what would be clear to us, which requires lots of ‘hand-holding’.

“Humans, on the other hand, make sense of the world from the top down. We blend concepts in our visual memory or mind’s eye, which allows us to recognize a friend regardless of the clothing they wear, or identify an object as a coffee cup despite the innumerable colors, shapes, and sizes they come in,” Stacy added.

As an automated system, Cognitive Patterns combines both top-down and bottom-up processing, allowing the robot and human to each do what they do best. It matches sensory input to abstract patterns in a manner similar to the mechanisms used for visual perception in the human brain. The result is a rich situation model shared between robot and human that could not have been created by either alone. The robot needs to interact with the operator only when something unusual or unexpected occurs and to receive mission orders.
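The division of labor described here can be sketched as a simple interaction loop. This is a hypothetical example, not the actual Cognitive Patterns protocol: the robot handles confident recognitions of known categories on its own, escalates only unusual observations to the operator, and retains the operator's answer so the same situation is handled autonomously next time. All labels, confidences, and the threshold are invented for illustration.

```python
# Hypothetical sketch of the human-in-the-loop pattern the article
# describes -- not Aptima's actual interface or protocol.

def run_mission(observations, knowledge, ask_operator, threshold=0.7):
    """Classify each (label, confidence) observation.

    Confident matches against known categories are handled
    autonomously; anything else is escalated to the operator, whose
    correction is added to the robot's knowledge for future use."""
    log = []
    for label, confidence in observations:
        if confidence >= threshold and label in knowledge:
            log.append((label, "autonomous"))
        else:
            corrected = ask_operator(label)  # human adjusts categorization
            knowledge.add(corrected)         # boosts high-level knowledge
            log.append((corrected, "operator-assisted"))
    return log

knowledge = {"vehicle", "person"}
observations = [
    ("vehicle", 0.9),     # known and confident: no human needed
    ("drill-rig", 0.3),   # novel and ambiguous: operator consulted once
    ("drill-rig", 0.8),   # now known: handled autonomously
]
result = run_mission(observations, knowledge,
                     ask_operator=lambda seen: "drill-rig")
```

Only the second observation triggers operator contact; after the correction is absorbed into the robot's knowledge, the third is handled without intervention, which is the cost-saving behavior the article attributes to the system.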

