
How Intel's bleeding-edge Loihi 2 chips help robots perceive the world

Loihi 2 is Intel's second-generation neuromorphic research chip. It supports new classes of neuro-inspired algorithms and applications, while providing faster processing, greater resource density and improved energy efficiency.

Released in October 2021, Loihi 2 is a bleeding-edge, research-grade chip

Computers destroy humans at chess, but there's not a robot in the world you could send into an unfamiliar house and tell it to feed the dog; the general intelligence and adaptability of the human brain remain unrivalled. Intel's research-grade Loihi 2 neuromorphic chips are designed to help close the gap, drawing inspiration from nature's greatest necktop supercomputer.

We spoke to Queensland University of Technology researcher Dr. Tobias Fischer about his work integrating these cutting-edge chips into autonomous robots, where they're outperforming resource-draining supercomputers on certain tasks. Fischer's team is working specifically on localization and navigation – helping robots work out where they are in unfamiliar situations.

"The Center for Robotics is a huge lab," said Dr. Fischer. "Over 100 people. We do everything from manipulation – grasping objects and picking them up – to space robotics, a bit of human interaction and the social elements needed when you talk with humans. We do a lot of research on vision techniques, using cameras and sensors to help robots perceive the world similarly to how we do with our eyes. Taking a series of pixel intensities and giving it a higher level meaning to say that's a car, that's a chair. Super simple, even for a five-year-old, but incredibly hard for a computer.

"Where I come into play is localization and navigation. So if you tell a robot to unpack your dishwasher, it needs to know how to find your kitchen. It needs to perceive objects and decide whether they're of interest, or to be ignored. Whether it can go over top of them or not."

Dr. Tobias Fischer, right, with Professor Michael Milford and PhD student Somayeh Hussaini in the robotics lab

Deep neural networks have been useful in this area, showing an impressive ability to learn over time and apply high-level labels to objects based solely on visual information. But training them can consume an enormous amount of energy.

"Most universities have high-performance supercomputers with huge, air conditioned storage rooms," said Dr. Fischer. "Those supercomputers consume as much energy as a medium-sized city would in a year, just to train some of these bigger networks. This is crazy, it's a huge use of resources. Intel's Loihi 2 chips take a different approach, that gives you a super power-efficient way to run a particular class of networks. You can only run a subset of networks or optimization algorithms, but you can run them very, very efficiently."

The efficiency gap comes down to how the "neurons" are activated in a deep neural network versus a spiking neural network of the kind that runs on Loihi. "You input an image, and let's say you want to classify whether there's a chair or a person in that image," said Dr. Fischer. "In a deep neural network, all the hundreds of millions of neurons are activated during each processing step. That uses quite a bit of resources. In a spiking neural network, only a very small subset of neurons are activated at a time. They have an internal state that accumulates some evidence that something might look like a chair, but they don't give out a 'spike' until that evidence reaches a certain threshold."
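To make that mechanism concrete, here is a minimal leaky integrate-and-fire sketch in plain NumPy. It is only an illustration of the spiking principle Fischer describes, not Intel's Lava toolchain or anything that actually runs on Loihi 2: each neuron's internal state leaks and accumulates weighted input over time, and it only emits a spike once that state crosses a threshold, so most neurons stay silent at any given step.

```python
import numpy as np

def lif_step(v, x, w, decay=0.9, threshold=1.0):
    """One timestep of a leaky integrate-and-fire layer (illustrative only).

    v: membrane potentials carried over from the previous step, shape (n_neurons,)
    x: binary input spikes for this step, shape (n_inputs,)
    w: synaptic weights, shape (n_neurons, n_inputs)
    """
    v = decay * v + w @ x                    # leak, then accumulate weighted input spikes
    spikes = (v >= threshold).astype(float)  # fire only where accumulated evidence crosses the threshold
    v = np.where(spikes > 0, 0.0, v)         # reset the neurons that fired
    return v, spikes

# Toy run: 4 neurons, 3 inputs, 5 timesteps of sparse input activity
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(4, 3))
v = np.zeros(4)
for t in range(5):
    x = (rng.random(3) < 0.3).astype(float)  # sparse input spikes
    v, spikes = lif_step(v, x, w)
    print(f"t={t} spikes={spikes}")
```

Because only the neurons that actually spike at a given step need to pass anything downstream, hardware built around this model can skip most of the work a dense network would do, which is where the efficiency Fischer describes comes from.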

These chips are compact enough to run directly on small robots, and even drones. "Basically, we plug a little USB stick into the existing computer that runs on the robot," said Dr. Fischer. "It works like a hardware accelerator. We're hoping they help us develop adaptive localization techniques that run extremely efficiently and at high speeds, with low latency; that's critical when you're trying to distinguish and track things in a video feed. Also, efficiency itself is great for robots and drones that have to carry their own power source."

Loihi 2 enables ultra-efficient spiking neural networks to replace resource-hungry deep neural networks in certain applications

"When we say adaptive localization, we'd really like these things to be able to adjust to changing conditions," he continued. "Nighttime to daytime, obviously, makes a huge difference to what your environment looks like. But also adapting from sunny conditions to the middle of a thunderstorm or a snowstorm, that sort of thing. If you can adapt to changes in your environment, that'll improve the positioning and localization techniques and the accuracy we can obtain."

Loihi has some drawbacks, he admits, that could keep it from breaking out of research labs and into commercial products shipping on drones, for example. "The problem that Intel and neuromorphic communities have is that we often lag behind in robustness and accuracy when compared to conventional algorithms that run more on the CPU or GPU," he explained. "And it's much harder to come up with these algorithms in the first place, as compared with conventional deep neural networks that are being researched by God knows how many tens of thousands of researchers every day. We haven't made the breakthrough yet that'll make them generally applicable in a wide enough range of scenarios that there'll be interest in commercialization. Intel is obviously collaborating with a number of universities like ourselves to make the steps toward this happening, but we're not there yet."

Source: QUT/Intel
