
HP unleashes "The Machine" memory-centric supercomputer prototype

Hewlett Packard Enterprise has unveiled a computer prototype it calls The Machine, which uses a completely new architecture that puts memory at the center of the system

To keep up with the constant flood of data the digital world is generating, computer manufacturers have thrown more and more processing cores at the problem – currently, the world's most powerful supercomputer boasts over 10 million cores. Forging a different path, Hewlett Packard Enterprise (HPE) has overhauled computing architecture to put memory at the center of the system, showcasing it through a prototype it calls "The Machine".

Current computers are built around a series of processors, each of which handles one task at a time, although they can be further divided into cores and threads to help them multi-task. But with each processor relying largely on its own small pockets of memory, a lot of time and energy is wasted as the processors talk to each other, and even more as data is shuttled between memory and storage. Improvements are constantly being made to speed things up, but this fractured architecture carries an inevitable bottleneck.

The Machine project scrapped the existing system and started again, with an eye towards crunching big data. HPE put what it calls Memory-Driven Computing at the heart of the new architecture, giving all the processors in the system equal access to a single shared pool of memory.

Rather than fragmenting memory between different processors like standard computer systems, HPE's Machine gives all of the processors equal access to a shared pool of 160 TB of memory

The current prototype shares a whopping 160 TB between 40 nodes, making it the largest single-memory system in the world. Using non-volatile memory (NVM), the data is processed and stored in the same place, eliminating the need to send it to different parts of the system. And rather than each component communicating through its own interconnect system, Memory-Driven Computing uses a universal protocol, which makes The Machine more efficient and potentially modular.

Communication between the nodes is also sped up thanks to photonics. Instead of sending information in the form of electrons moving through copper wires, photonics allows the system to transmit light through optic fiber ribbons, making the interconnects smaller, faster, cooler and more energy efficient.

While it's already an impressive system, The Machine is scalable, and HPE says future versions could be capable of an absolutely mind-boggling 4,096 yottabytes (YB) – for reference, 1 YB is more than 1 trillion TB. That's access to an incredible amount of data simultaneously, and HPE believes such a system has a future in data centers and space travel.
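The figures above are easy to sanity-check with a little arithmetic. The short Python sketch below works through them using decimal units (1 TB = 10^12 bytes, 1 YB = 10^24 bytes); the per-node split simply assumes the 160 TB pool is divided evenly across the 40 nodes, which the article does not state explicitly.

```python
# Back-of-the-envelope check of the memory figures quoted in the article.
TB = 10**12          # 1 terabyte in bytes (decimal units)
YB = 10**24          # 1 yottabyte in bytes

# The prototype: 160 TB shared across 40 nodes (even split assumed).
pool = 160 * TB
nodes = 40
per_node = pool // nodes
print(f"Shared pool per node: {per_node // TB} TB")

# "1 YB is more than 1 trillion TB" -- 10^24 / 10^12 = 10^12, i.e. exactly
# one trillion, so the claim holds in decimal units.
print(f"TB per YB: {YB // TB:,}")

# The projected future capacity of 4,096 YB, expressed in TB.
future = 4096 * YB
print(f"4,096 YB in TB: {future // TB:.3e}")
```

Running it shows each node contributing 4 TB to the pool, and the projected 4,096 YB working out to roughly 4.1 quadrillion TB of byte-addressable memory.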

"We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society," says Mark Potter, CTO at HPE. "The architecture we have unveiled can be applied to every computing category — from intelligent edge devices to supercomputers."

The Machine team is still looking into ways to improve the photonics, software and security of the system. The scientists describe the project in the video below.

Source: Hewlett Packard Enterprise

The Computer Built for the Era of Big Data

6 comments
christopher
Time to sell your Ethereum coins before this makes them worthless!
Mzungu_Mkubwa
Interesting that you blithely refer to "non-volatile memory (NVM)" as being used in this system, when in reality this is the core technology that is enabling this to happen. "Memristors" that HP has invented is a tech that is (nearly) as fast as DRAM (current volatile memory) but does not need power to retain its data (much like the memory used in your favorite thumb drive). This transformative tech removes the need to separate system RAM and system storage. They can be one and the same, which when you think about it, changes the traditional paradigm significantly, and is why HPE is having to radically transform the architecture. There's no more "executing" a program or "loading" a file, since these are merely layman's terms for moving code from the system storage (traditionally a hard drive) into the system's RAM. With the "storage" medium being as fast as the RAM, they no longer have to be distinct/duplicated. This is great stuff! I welcome our SkyNet Overlord!
JaySmith
MzunguMkubwa, this version does not use memristors. HP has yet to develop a process to manufacture them in large numbers and at reasonable prices.
zr2s10
"The Machine" ... Love it. Apparently HP Engineers are big "Person of Interest" fans, lol.
Kenlbear2
What address size does it use to cover all that memory? I guess it's the limiting factor. Memory is probably partitioned, which really makes it a series of much smaller individual systems.
Mzungu_Mkubwa
@JaySmith, so you're telling me that the type of memory they're employing (described as "non-volatile" in the article) is some kind of NAND flash memory, like in your SSD? Sorry, but that's not fast enough to support this kind of endeavor, nor does it have the read/write longevity to make this setup run for very long (DRAM memory gets written/read orders of magnitude more frequently than storage media.) Sorry, not buying it. (You could be right, tho - am too lazy to research it... meh.)