Researchers at Cornell University have developed an electronic chip that they describe as a "microwave brain." The simplified chip is analog rather than digital, yet can process ultrafast data and wireless communication signals simultaneously.
We are so used to thinking of computers as digital machines running on binary code that it's easy to forget that these are only one type of computer. In fact, both historically and today, many devices that we can classify as computers are analog in function.
As school lessons and popular science articles keep telling us, modern computers are digital. That is, they are built from on/off switches strung together in logic circuits that process data as strings of binary ones and zeros. Analog computers, on the other hand, represent quantities with continuous physical variables, such as a voltage, a rotation or a flow, and calculate by letting that physical model of something real or abstract evolve.
One example of an analog computer is a mechanical clock. It calculates the passage of time by means of springs, gears and an escapement that model the real world. Other examples include slide rules (yes, I'm that old), speedometers, spring or liquid thermometers, and more.
There were even advanced analog computers that solved complex equations using rods and cams and others that simulated national economies by the flow of liquid through tubes between reservoirs. One from 1947 was even designed to be built from a Meccano set by budding computer engineers. And it wasn't that long ago that many electronic computers used analog circuits with potentiometers and voltmeters to crunch numbers.
But now that digital computers are king, why any interest in analog versions? The reason is that analog circuits have several advantages. They're much simpler than digital circuits and can skip many of the steps digital computers need to solve a problem. They're also faster because they run tasks in parallel, and they use far less power. Because they rely on physical behavior, they're well suited to problems involving continuous change and complex systems, and because they don't work in discrete numbers, they can handle data over a near-infinite range of values.
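To make that last point concrete, here is a minimal, purely illustrative Python sketch (not anything from the Cornell work) showing the rounding error a digital system introduces when it forces a continuous quantity onto a fixed grid of discrete levels; an analog quantity has no such grid. The signal value and bit widths are arbitrary placeholders.

```python
# Illustrative only: the error introduced when a continuous quantity is
# snapped onto a discrete (digital) grid. Not the Cornell chip's method.

def quantize(value, bits, vmin=0.0, vmax=1.0):
    """Snap a continuous value in [vmin, vmax] onto a 2**bits-level grid."""
    levels = 2 ** bits - 1
    step = (vmax - vmin) / levels
    return vmin + round((value - vmin) / step) * step

signal = 0.637281  # an arbitrary continuous quantity, e.g. a normalized voltage
for bits in (4, 8, 16):
    q = quantize(signal, bits)
    print(f"{bits:2d}-bit encoding: {q:.6f}  (error {abs(q - signal):.2e})")
```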
Now, Cornell is working on its microwave brain, which is billed as the first fully integrated silicon microchip to function as a true microwave neural network. In other words, by forsaking digital for the analog physics of microwaves, it can mimic how the human brain uses neurons to recognize patterns and learn, in a simplified way that cuts out many of the signal-processing steps digital computers require.
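For readers unfamiliar with the term, a neural network recognizes patterns by combining weighted inputs and passing them through a nonlinearity. The toy Python sketch below shows that idea in its most basic software form; the weights, features and class labels are made-up placeholders, and the Cornell chip realizes something analogous with analog microwave physics on silicon rather than with arithmetic in code.

```python
# A deliberately tiny, conceptual sketch of neural-network pattern
# recognition: weighted sums followed by a nonlinearity. All numbers
# below are hypothetical; this is not how the analog chip computes.
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, squashed by a sigmoid nonlinearity.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features extracted from a wireless signal
# (e.g. bandwidth, symbol rate, peak power), normalized to [0, 1].
features = [0.42, 0.77, 0.13]

# Two candidate signal classes, each scored by one toy neuron.
score_a = neuron(features, weights=[1.8, -0.6, 0.9], bias=-0.4)
score_b = neuron(features, weights=[-1.1, 1.5, 0.2], bias=0.1)
print("class A" if score_a > score_b else "class B", round(score_a, 3), round(score_b, 3))
```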
It can also do this using far less power, with an estimated 200 milliwatts needed to run it at tens of gigahertz. In addition, tests have measured its accuracy at 88% in classifying wireless signal types.
The new chip is also remarkably small, meaning it could be built into smartwatches and phones to give them AI capabilities without a connection to cloud servers. If that wasn't enough, the technology could also be used to strengthen hardware security, detect anomalies in wireless communications, and improve radar target tracking and radio signal decoding.
"In traditional digital systems, as tasks get more complex, you need more circuitry, more power and more error correction to maintain accuracy," said research lead Bal Govind. "But with our probabilistic approach, we’re able to maintain high accuracy on both simple and complex computations, without that added overhead."
The research was published in Nature Electronics.
Source: Cornell University