
Aptima's Cognitive Patterns system helps robots make sense of the world

An iRobot PackBot navigates a school, in a demonstration of the Cognitive Patterns system

It’s sometimes easy to forget that for all their human-like qualities, robots are in fact machines. While some systems allow them to recognize basic objects, they still don’t necessarily make sense of what they’re looking at – they might see and recognize a box, for instance, but what does the presence of a box suggest to them? Now, researchers at Massachusetts-based engineering firm Aptima are developing a system known as Cognitive Patterns, which lets robots and humans collaborate on building the robots’ understanding of the world so that the machines can operate more effectively on their own.

At the base of the system is a rich onboard database of objects that the robot can cross-reference with its relatively limited sensory input. In this way, even when little visual information is available, what information there is can go a long way towards identifying an object. In some cases, even if an object as a whole isn’t recognizable, its basic components may still be – by running that combination of components through the database, a match may yet result.
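
Aptima hasn't published how the matching actually works, but the basic idea can be sketched in a few lines of Python. The object names, component lists and scoring threshold below are all invented for illustration: the robot scores each database entry by how many of its known components show up in the partial sensory input.

# Illustrative sketch only: object names, components and threshold are invented.
OBJECT_DATABASE = {
    "refrigerator": {"door", "handle", "compressor_hum", "rectangular_body"},
    "filing_cabinet": {"drawer", "handle", "rectangular_body"},
    "cardboard_box": {"flaps", "tape_seam", "rectangular_body"},
}

def identify(observed_components, min_overlap=0.5):
    """Best database match for a partial set of observed components, or None."""
    best_label, best_score = None, 0.0
    for label, components in OBJECT_DATABASE.items():
        score = len(observed_components & components) / len(components)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= min_overlap else None

# Even a sparse observation can be enough for a match:
print(identify({"door", "compressor_hum"}))  # -> refrigerator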

When the robot is stymied by something that’s truly unidentifiable, it can combine different concepts from its database to create a unique new identity for that item. This closely resembles the process that takes place in the human brain, which allows us to interpret things as the sum of their recognizable parts, instead of just as mysteries.
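
That fallback can be sketched in the same spirit, again with invented object and component names: when nothing in the database matches, the item is described as a composite of whichever known objects share parts with it, and the composite is stored for later reuse.

# Illustrative sketch; the composite label and tiny database are invented.
COMPONENT_DATABASE = {
    "filing_cabinet": {"drawer", "handle", "rectangular_body"},
    "cardboard_box": {"flaps", "tape_seam", "rectangular_body"},
}

def describe_unknown(observed_components):
    # Every known object that shares at least one part with the mystery item.
    related = sorted(label for label, parts in COMPONENT_DATABASE.items()
                     if observed_components & parts)
    if not related:
        return "unknown_object"
    # The composite becomes a new database entry the robot can reuse later.
    new_label = "composite:" + "+".join(related)
    COMPONENT_DATABASE[new_label] = set(observed_components)
    return new_label

print(describe_unknown({"drawer", "flaps"}))  # -> composite:cardboard_box+filing_cabinet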

In such a situation, however, the Cognitive Patterns remote user interface also allows the mostly-autonomous robot to query its human “partner” about things that it doesn’t understand. That person can respond by assigning an identity to the item, which is added to the robot’s database for future reference. This lets the robot actually learn, with some help from its teacher.
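
The remote interface itself isn't public, so in the sketch below a plain callback stands in for the human partner's reply, but the loop is the same: attempt identification, ask when stuck, and save the answer in the database for next time.

# Illustrative sketch; the object names and the ask_human callback are stand-ins
# for the actual remote user interface.
KNOWN_OBJECTS = {"refrigerator": {"door", "handle", "compressor_hum"}}

def resolve(observed_components, ask_human):
    # First try to match the observation against the onboard database.
    for label, parts in KNOWN_OBJECTS.items():
        if observed_components <= parts:
            return label
    # Otherwise query the human partner and remember the answer for next time.
    label = ask_human(observed_components)
    KNOWN_OBJECTS[label] = set(observed_components)
    return label

reply = lambda parts: "space heater"                 # the partner's answer via the interface
print(resolve({"coiled_element", "grille"}, reply))  # -> space heater
print("space heater" in KNOWN_OBJECTS)               # -> True; the robot has "learned" it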

At the base of the system is a rich onboard database of objects that the robot can cross-reference with its relatively limited sensory input

The interface also allows the user to query the robot about what it’s perceiving, lets the robot post alerts to the user, and displays live feeds from the robot’s video and mapping systems.
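
Aptima hasn't described the message format, but the traffic implied by that description boils down to two directions, sketched here with invented field names:

# Hypothetical message types; the classes and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class RobotAlert:
    """Robot-initiated notification pushed to the human partner's display."""
    text: str            # e.g. "Unrecognized object near the doorway"
    video_frame: int     # index into the live video feed shown alongside the alert

@dataclass
class OperatorQuery:
    """Human-initiated question about what the robot is currently perceiving."""
    question: str        # e.g. "What do you see to your left?"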

However it goes about recognizing or assigning new identities to objects, the system also provides the robot with some context. If it identifies a refrigerator, for instance, the robot is able to ascertain that it’s probably in a kitchen or lunch room. Should it be trying to locate a furnace, the robot will be able to independently figure out that the presence of a fridge strongly suggests that it’s not in the right room.
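
That kind of contextual reasoning can be sketched as a simple lookup (the room associations below are invented): each identified object votes for the rooms it usually belongs to, and the robot checks whether its search target could plausibly be in any of them.

# Illustrative sketch; the object-to-room associations are invented.
ROOM_CONTEXT = {
    "refrigerator": {"kitchen", "lunch_room"},
    "furnace": {"boiler_room", "basement"},
    "whiteboard": {"classroom", "office"},
}

def likely_rooms(identified_objects):
    """Intersect the room hints contributed by every object seen so far."""
    rooms = None
    for obj in identified_objects:
        hints = ROOM_CONTEXT.get(obj)
        if hints:
            rooms = hints if rooms is None else rooms & hints
    return rooms or set()

# Seeing a refrigerator while hunting for a furnace tells the robot to move on:
print(likely_rooms({"refrigerator"}) & ROOM_CONTEXT["furnace"])  # -> set()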

Aptima is developing Cognitive Patterns with industry partner iRobot, for DARPA's Defense Sciences Office and the US Army Research Laboratory’s Cognitive Robotics team. More information on the system is available in the video below.

Source: Aptima
