
Computer memory prototype ditches 1s and 0s for denser data storage

Cambridge scientists have tested a prototype of a new form of computer memory, made using barium bridges between thin films of a disordered material

Cambridge scientists have developed a prototype for a new form of computer memory that could lead to faster chips capable of holding up to 100 times more data. The system is made up of barium bridges between films of a disordered material.

As powerful as current computer technology can be, there are a few hard limits to it. Data is encoded into just two states – one or zero. And this data is stored and processed in different parts of a computer system, so it needs to be shuttled back and forth, which consumes energy and time.

But an emerging form of computer memory, known as resistive switching memory, is designed to be far more efficient. Rather than flipping a bit of information into one of two possible states, this new kind of memory can create a continuous range of states. This is done by applying an electrical current to certain types of materials, which causes their electrical resistance to increase or decrease. A broad spectrum of these slight differences in electrical resistance creates a series of possible states in which to store data.
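To make that idea concrete, here is a minimal sketch of how a multi-level memory cell could be read back, using hypothetical resistance values and level counts rather than figures from the study: the measured resistance is snapped to the nearest of N programmed levels, and each level encodes log2(N) bits instead of a single 1 or 0.

```python
import math

def decode_cell(measured_resistance_ohm, levels):
    """Map a measured resistance to the nearest programmed level and
    return the bits that level encodes (illustrative numbers only)."""
    # Hypothetical resistance window: levels spaced evenly from 1 kOhm to 100 kOhm.
    low, high = 1e3, 1e5
    step = (high - low) / (levels - 1)
    idx = round((measured_resistance_ohm - low) / step)
    idx = max(0, min(levels - 1, idx))          # clamp to a valid level
    bits_per_cell = int(math.log2(levels))      # e.g. 16 levels -> 4 bits
    return format(idx, f"0{bits_per_cell}b")

# With 16 distinguishable resistance levels, one cell stores 4 bits instead of 1.
print(decode_cell(34_500, levels=16))   # -> '0101'
```

The trade-off raised in the comments below is visible here: packing more levels into the same resistance window shrinks the gap between them, so less noise is needed to push a reading into the wrong bin.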

“A typical USB stick based on continuous range would be able to hold between 10 and 100 times more information, for example,” said Dr. Markus Hellenbrand, first author of the study.

For the new study, the team developed a prototype of a resistive switching memory device made with a material called hafnium oxide, which is already in use in the semiconductor industry as an insulator. Normally it’s challenging to use for memory because it has no structure at the atomic level – its hafnium and oxygen atoms are randomly mixed together. But here, the Cambridge researchers found that adding an extra ingredient helped change that.

When barium was added to the mix, it formed vertical “bridges” between stacked thin films of hafnium oxide. Since these barium bridges are highly structured, electrons can travel through them easily. An energy barrier is created at the points where the bridges meet the device contacts, and the height of this barrier can be controlled, which changes the electrical resistance of the overall material. That, in turn, is what encodes the data.
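For a rough sense of why a tunable barrier means a tunable resistance, the sketch below uses the textbook thermionic-emission (Richardson) relation, in which the current over a barrier falls off exponentially with barrier height. The barrier values and constants here are illustrative assumptions, not parameters from the Cambridge device.

```python
import math

K_B = 8.617e-5   # Boltzmann constant in eV/K
A_STAR = 1.2e6   # Richardson constant in A/(m^2 K^2), free-electron value

def barrier_limited_current_density(barrier_ev, temperature_k=300.0):
    """Thermionic-emission current density J = A* T^2 exp(-phi_B / kT).
    A higher barrier gives exponentially less current, i.e. higher resistance."""
    return A_STAR * temperature_k**2 * math.exp(-barrier_ev / (K_B * temperature_k))

# Illustrative barrier heights (eV); each small step changes the current
# by orders of magnitude, leaving room for many distinguishable states.
for phi in (0.20, 0.25, 0.30, 0.35):
    print(f"barrier {phi:.2f} eV -> J ~ {barrier_limited_current_density(phi):.3e} A/m^2")
```

Because the dependence is exponential, even small, controlled shifts in barrier height sweep the current, and therefore the resistance, over a wide range, which is what makes many distinguishable states possible.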

“This allows multiple states to exist in the material, unlike conventional memory which has only two states,” said Hellenbrand. “What’s really exciting about these materials is they can work like a synapse in the brain: they can store and process information in the same place, like our brains can, making them highly promising for the rapidly growing AI and machine learning fields.”

The researchers say that their device, using thin films of hafnium oxide connected by barium bridges, has a few advantages to help it along the path to commercialization. For one, these structures can self-assemble at relatively low temperatures, which is simpler than the high-temperature manufacturing that many other memory technologies need. Plus, the materials are already in wide use in the computer chip industry, so it should be easier to incorporate them into existing manufacturing techniques. Feasibility studies on the materials will allow the scientists to investigate how well they might work at larger scales.

The research was published in the journal Science Advances.

Source: Cambridge University

5 comments
paul314
People have been building memories with bits based on multiple levels of charge for years. If you can distinguish 0, 33%, 66% and 100%, you get 2 bits per cell instead of 1, which doubles the information. And so forth with more distinguishable levels. But each increase means tighter control of manufacturing tolerances, more sensitive amplifiers and comparators, and so forth. It may be worth it, may not. As the number of levels increases, so do the chances of errors that can corrupt all of your data.

These folks might be thinking about storing analog values in their memory cells, which is a different matter, because then small errors in storage or sensing (or even leakage over time) become less important.
TechGazer
Nice in theory, but probably hard to do in practice. The more different states the device is capable of, the more likely errors are to occur. There's probably a mathematical formula to define the expected error rate. Adding in error-correction systems reduces speed and density. At some point there might be a commercial product with 4 states per element that can compete with binary memory, but new binary technologies will be developed too. A device that can store 100 states per element and do it quickly and reliably is fairly unlikely. There would be trade-offs between density, speed and reliability.

I am not expecting 100-state RAM anytime soon.
Expanded Viewpoint
EXACTLY, TG! The more complexity there is in a system, the higher the number of potential failure points there are. A 1965 Ford Mustang had a total of 4 fuses, as I recall. One made today, probably has at least 40! My brother's 1996 Cadillac has three fuse boxes in it, one under the hood, one in the glove box, and a third in the trunk!! And it's sitting right now, waiting for us to put a new fuel pump in it!
Calcfan
It is frustrating to read such writing. No values of what is thought to be practical. Others have assumed 4 states, but . . . ?
Bob Flint
The resistance of a material with impurities, and the quest for cheap, fast storage balanced against the requirements for accuracy and repeatability, factoring in temperature, vibration, frequency, etc. It seems there are no easy answers here.