
Are AI data centers still needed, or have the Swiss punched holes in them?

EPFL's Anyway Systems software offers a local alternative to remote power-hungry data centers

What’s not to love about giant AI data centers?

I mean, of course, other than electronic waste, massive use of water (especially in arid regions), reliance on destructive and human rights-abusing mining to extract rare elements, and sucking electricity from dirty energy sources?

Yeah, but how else are you going to get all those amazing Studio Ghibli replicant pictures, those awesome AI-only albums, or AI “retro” videos of The Avengers “recast” with AI Paul Newman and AI Robert Redford? They’re not gonna make themselves! At least not without those giant AI data centers.

But more seriously, because AI offers enormous benefits for medical diagnosis and treatment, and for addressing climate change, how do we reap those benefits without paying the terrible price?

Turns out the Swiss may have punched holes in the entire AI data center industry and many of the problems it causes. At Switzerland’s École Polytechnique Fédérale de Lausanne (EPFL), a major technology and public research university, researchers have created software they’re now selling through their own company that takes out the middle-man of “Big Cloud.”

Now, thanks to EPFL researchers Gauthier Voron, Geovani Rizk, and Rachid Guerraoui in the School of Computer and Communication Sciences, we have a much better option for AI. Instead of sending our processing needs to remote servers for “inference” – the production of predictions and responses that currently consumes an estimated 80 to 90% of all AI computing power – you’ll be able to download Anyway Systems to your desktop. There, Anyway downloads open-weight AI models such as OpenAI’s gpt-oss in minutes, so you can ask questions globally but process locally.

“For years,” says Rachid Guerraoui, head of EPFL’s Distributed Computing Laboratory (DCL), “people have believed that it’s not possible to have large language models and AI tools without huge resources, and that data privacy, sovereignty and sustainability were just victims of this, but this is not the case. Smarter, frugal approaches are possible.”

Instead of using warehouse-bound arrays of servers resembling dark, dystopian cities of endless, identical skyscrapers, Anyway Systems distributes processing across a local network – in the case of the 120-billion-parameter gpt-oss model, requiring a maximum of four computers – robustly self-stabilizing to make optimal use of local hardware. Guerraoui says that while Anyway Systems may respond to prompts a bit more slowly, “it’s just as accurate.”
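To get a feel for the basic idea, here is a minimal, hypothetical sketch – not Anyway Systems’ actual code – of how a large model’s transformer layers might be split evenly across a handful of local machines for pipelined inference. The host names and layer count are illustrative assumptions:

```python
# Hypothetical sketch (not Anyway Systems' actual algorithm): split a large
# model's layers into contiguous blocks, one block per local machine.

def partition_layers(num_layers: int, hosts: list[str]) -> dict[str, range]:
    """Assign contiguous blocks of layers to each host, as evenly as possible."""
    base, extra = divmod(num_layers, len(hosts))
    assignment: dict[str, range] = {}
    start = 0
    for i, host in enumerate(hosts):
        count = base + (1 if i < extra else 0)  # spread any remainder over the first hosts
        assignment[host] = range(start, start + count)
        start += count
    return assignment

# A 120B-class model might have ~36 transformer blocks; four desktops share them.
plan = partition_layers(36, ["pc-1", "pc-2", "pc-3", "pc-4"])
for host, layers in plan.items():
    print(f"{host}: layers {layers.start}-{layers.stop - 1}")
```

During inference, a prompt’s activations would flow from one machine to the next in layer order, so no single desktop ever has to hold the whole model.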

Who needs a massive Death Star when all you need is a few small X-wing fighters to compete?

Even better, installation takes as little as 30 minutes, and because processing is local, users keep their private data private, while companies, unions, NGOs, and countries keep their data sovereign – away from the clutches (and “ethics”) of Big Data.

This graphic illustrates the basic principle at work behind Anyway Systems

While home users would need more than a single computer to form the local network needed to operate Anyway Systems, hardware’s history of growing speed and capacity and shrinking size suggests the Swiss option may soon be more widely available. “We will be able to do everything locally in terms of AI,” says Guerraoui. “We could download our open-source AI of choice, contextualize it to our needs, and we, not Big Tech, could be the master of all the pieces.”

But doesn’t Google’s AI Edge already offer such abilities on a single phone?

“Google AI Edge is meant to be run on mobile phones for very specific and small Google-made models with each user running a model constrained by the phone’s capacity,” counters Guerraoui. “There is no distributed computing to enable the deployment of the same large and powerful AI models that are shared by many users of the same organization in a scalable and fault-tolerant manner. The Anyway System can handle hundreds of billions of parameters with just a few GPUs.”

According to Guerraoui, similar logic applies to people running local LLM tools such as Msty and Llama. “Most of these approaches help deploy a model on a single machine, which is a single source of failures,” he says, noting that the most powerful AI models require extremely expensive machines found in data centers.

Furthermore, individual users can’t combine commodity machines efficiently to deploy large models, and even if they could, doing so “would require a team to manage and maintain the system. The Anyway System does this transparently, robustly and automatically.”
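As a hypothetical illustration of what that “transparent, robust” management could mean in principle – again, not Anyway Systems’ actual algorithm – here is a sketch in which the surviving machines simply recompute an even split of the model’s layers when one commodity box drops off the network:

```python
# Hypothetical sketch of self-stabilizing reassignment (not Anyway Systems'
# actual code): when a machine fails, survivors absorb its share of the layers.

def rebalance(num_layers: int, hosts: list[str], failed: set[str]) -> dict[str, range]:
    """Recompute an even, contiguous layer split over the machines still alive."""
    alive = [h for h in hosts if h not in failed]
    if not alive:
        raise RuntimeError("no machines left to serve the model")
    base, extra = divmod(num_layers, len(alive))
    plan: dict[str, range] = {}
    start = 0
    for i, host in enumerate(alive):
        count = base + (1 if i < extra else 0)
        plan[host] = range(start, start + count)
        start += count
    return plan

# Four desktops serve 36 layers; if pc-3 dies, the other three each take 12.
print(rebalance(36, ["pc-1", "pc-2", "pc-3", "pc-4"], failed={"pc-3"}))
```

The point of the sketch is that recovery is just a recomputation over whoever is still reachable – no administrator has to log in and repartition anything by hand.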

So, while malevolent actors using generative AI continue to pose a threat to amusing little luxuries such as, say, democracy, at least scientific researchers and others who are using AI to add value to human life and the planet will be able to do so without inflicting as much damage on the environment, or harming the miners and communities producing the elements and minerals that Big AI demands.

Sources: EPFL, Anyway Systems
