
What's the deal with space-based data centers for AI?

Big Tech believes orbital data centers are the best way to scale up compute infrastructure needed to run AI services

Terrestrial data centers are so 2025. We're taking our large-scale compute infrastructure into orbit, baby! Or at least, that's what Big Tech is yelling from the rooftops at the moment. It's quite a bonkers idea that's hoovering up money and mindspace, so let's unpack what it's all about – and whether it's even grounded in reality.

Let's start with the basics. You might already know that a data center is essentially a large warehouse filled with thousands of servers that run 24/7.

AI companies like Anthropic, OpenAI, and Google use data centers in two main ways:

  • Training AI models – This is incredibly compute-intensive. Training a model like the ones powering OpenAI's ChatGPT or Anthropic's Claude required running calculations across thousands of specialized chips (GPUs) simultaneously for weeks or months.
  • Running AI services – When you converse with those models' chatbots, your messages go to a data center where servers process them and send back the model's response. Multiply that by millions of users having conversations simultaneously, and you need enormous computing power ready on-demand.

AI companies need data centers because they provide the coordinated power of thousands of machines working in tandem on these functions, plus the infrastructure to keep them running reliably around the clock.
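To get a feel for why serving millions of users demands that much coordinated hardware, here's a back-of-envelope sketch in Python. Every number in it (model size, reply length, user count, GPU throughput) is an illustrative assumption, not a figure from any AI vendor:

```python
# Back-of-envelope sketch of the compute needed to serve chat traffic.
# All constants below are illustrative assumptions.

PARAMS = 70e9                 # assumed model size: 70 billion parameters
FLOPS_PER_TOKEN = 2 * PARAMS  # ~2 FLOPs per parameter per generated token
TOKENS_PER_REPLY = 500        # assumed average reply length in tokens
CONCURRENT_USERS = 1_000_000  # assumed users awaiting replies at once
REPLY_SECONDS = 10.0          # assumed time budget to finish each reply

# Sustained floating-point throughput the fleet must deliver
flops_needed = CONCURRENT_USERS * TOKENS_PER_REPLY * FLOPS_PER_TOKEN / REPLY_SECONDS

GPU_FLOPS = 1e15              # assumed ~1 PFLOP/s low-precision throughput per GPU
gpus = flops_needed / GPU_FLOPS
print(f"~{flops_needed:.1e} FLOP/s sustained -> roughly {gpus:,.0f} GPUs")
```

Thousands of GPUs just for one model's inference traffic, before you count training runs – which is why these facilities keep getting bigger.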

To that end, these facilities are always online with ultra-fast internet connections, and they have vast cooling systems to keep those servers running at peak performance levels. All this requires a lot of power, which puts a strain on the grid and squeezes local resources.

So what's this noise about data centers in space? The idea's been bandied about for a while now as a vastly better alternative that can harness infinitely abundant solar energy and radiative cooling hundreds of miles above the ground in low Earth orbit.

Powerful GPU-equipped servers would be contained in satellites, and they'd move through space together in constellations, beaming data back and forth as they travel around the Earth from pole to pole in sun-synchronous orbit.

The thinking behind space data centers is that they'll let operators scale up compute resources far more easily than on Earth. Up there, operators aren't constrained by the availability of power, real estate, or the fresh water needed for cooling.

Starcloud has already begun running and training a large language model in space, so it can speak Shakespearean English

There are a number of firms getting in on the action, including big familiar names and plucky upstarts. You've got Google partnering with Earth monitoring company Planet on Project Suncatcher to launch a couple of prototype satellites by next year. Aetherflux, a startup that was initially all about beaming down solar power from space, now intends to make a data center node in orbit available for commercial use early next year. Nvidia-backed Starcloud, which is focused exclusively on space-based data centers, sent a GPU payload into space last November, and trained and ran a large language model on it.

The latest to join the fold is SpaceX, which is set to merge with Elon Musk's AI company xAI in a purported US$1.25-trillion deal with a view to ushering in the era of orbital data centers.

According to Musk's calculations, it should be possible to scale up both the number of rocket launches per year and the data center satellites each one carries. "There is a path to launching 1 TW/year (1 terawatt of compute power per year) from Earth," he noted in a memo, adding that AI compute resources will be cheaper to generate in space than on the ground within three years from now.
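To put the 1-TW/year figure in perspective, here's some rough Python arithmetic. The specific power (watts of compute per kilogram of satellite) and per-launch payload capacity are assumptions chosen for illustration, not SpaceX's numbers:

```python
# Rough feasibility arithmetic for "1 TW of compute launched per year".
# Specific power and payload capacity are illustrative assumptions.

TARGET_WATTS = 1e12      # 1 terawatt of compute launched per year
WATTS_PER_KG = 100.0     # assumed watts of compute+power+cooling per kg of satellite
PAYLOAD_KG = 100_000.0   # assumed ~100 t to low Earth orbit per heavy-lift launch

mass_kg = TARGET_WATTS / WATTS_PER_KG   # total satellite mass needed per year
launches = mass_kg / PAYLOAD_KG         # heavy-lift launches needed per year
print(f"{mass_kg / 1e9:.0f} million tonnes/year -> {launches:,.0f} launches/year")
```

Tweak the assumptions as you like; even generous ones imply a launch cadence orders of magnitude beyond anything flown to date.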

As Elon Musk's SpaceX and xAI are set to merge, he's keen on increasing the number of satellite-carrying rocket launches per year to serve the need for space data centers

In an excellent article in The Verge from last December, Elissa Welle laid out the numerous challenges these orbital data centers will have to overcome in order to operate as advertised. For starters, they'd have to safely wade through the 6,600 tons of space debris floating around in orbit, as well as the 14,000-plus active satellites already up there. Dodging these will require fuel.
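How much fuel does all that dodging take? The Tsiolkovsky rocket equation gives a feel for it; the yearly delta-v budget and thruster performance below are assumptions for illustration:

```python
import math

# Propellant mass fraction for a yearly collision-avoidance budget,
# via the Tsiolkovsky rocket equation: dv = ve * ln(m0 / m1).
# The delta-v budget and specific impulse are illustrative assumptions.

DV_PER_YEAR = 20.0   # assumed m/s of avoidance maneuvers per year
ISP = 1500.0         # assumed ion-thruster specific impulse, seconds
G0 = 9.80665         # standard gravity, m/s^2

ve = ISP * G0                                    # effective exhaust velocity
prop_fraction = 1 - math.exp(-DV_PER_YEAR / ve)  # fraction of wet mass burned per year
print(f"~{prop_fraction * 100:.2f}% of satellite mass per year as propellant")
```

A small fraction per satellite, but across a constellation of thousands over a multi-year lifetime it adds up – and every kilogram of propellant launched is a kilogram of compute that isn't.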

You've also got to dissipate heat from the space-based data centers, and have astronauts maintain them periodically. And that's to say nothing of how these satellites will affect the work of astronomers or potentially increase light pollution.
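Heat rejection in a vacuum comes down to the Stefan-Boltzmann law: radiated power scales with emissivity, area, and the fourth power of temperature. Here's a quick sketch, with the radiator temperature, emissivity, and heat load all assumed for illustration (and solar heating ignored):

```python
# Radiator area needed to reject server waste heat purely by radiation,
# from the Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.
# Temperature, emissivity, and heat load are illustrative assumptions;
# heating from sunlight is ignored for simplicity.

SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9     # assumed radiator emissivity
T_RADIATOR = 300.0   # assumed radiator temperature in kelvin (~27 degC)
HEAT_W = 1e6         # assumed 1 MW of waste heat from the servers

area_m2 = HEAT_W / (EMISSIVITY * SIGMA * T_RADIATOR**4)
print(f"~{area_m2:,.0f} m^2 of radiator per {HEAT_W / 1e6:.0f} MW at {T_RADIATOR:.0f} K")
```

The T^4 scaling means hotter radiators can be far smaller, but the chips themselves must stay cool – which is why heat pumps and plumbing between the servers and the radiators enter the picture.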

Ultimately, there's a lot of experimentation and learning to be gleaned from these early efforts to build out compute resources in space before any company or national agency can realistically scale them up.

And while it might eventually become possible to do so despite substantial difficulties, it's worth asking ourselves whether AI is actually on track to benefit humanity in all the ways we've been promised, and whether we need to continually build out infrastructure for it – whether on the ground or way up beyond the atmosphere.

10 comments
paul314
So let's figure that each watt's worth of computer (plus power plus cooling plus structure plus fuel) weighs in at one gram (might easily be 100 times that). One terawatt of computing would be a billion kilograms, or a million metric tons, which is a ridiculous multiple of how much has been launched into orbit during all of recorded spaceflight history. At a price of $1000/kg to get into orbit, that would be $1 trillion just for launch costs. And then you get into the interesting signal-latency problems of having stuff far out enough in space so that it doesn't burn up in a few years or go into cascading-collision mode.
Seems to me that this is more about distracting investors from disappointing current results, combined with the techbro fantasy of once and for all escaping all earthly government. I wish them luck.
Neutrino23
There are so many problems with this. They can’t use off-the-shelf NVIDIA GPUs. They need radiation-hardened electronics, which are slower and less compact. These have yet to be designed or built.
They need active cooling. On earth they use chilled water. This would be a good idea in space as well because there is no air or water to absorb the heat and radiative cooling doesn’t work well till you have high temperatures. So a heat pump could chill water, producing very hot water at the other end and maybe this hot water, pumped through radiators, could get rid of the heat. This requires a lot of radiators, heat pumps and plumbing. No leaks allowed.
The GPUs are only good for a few years. Either they break down and need replacing or become obsolete because newer chips are developed.
On top of all the costs for building and launching all this infrastructure they will need a crew of astronauts in space constantly fixing the heat pumps and swapping out GPUs for newer ones or replacing those that burn out.
The real kicker is that so far AI is hardly producing any revenue. So how will they finance all of this?
Tristan P
It seems less a matter of 'if', and more a matter of 'when'.
jsopr
Absolutely no benefits. Solar power is only slightly better, and everything else is much worse. And the heat dissipation is physically impossible. As much of a scam as Musk's spandex robot.
Trylon
"... these facilities are always online with ultra-fast internet connections." And how would orbital data centers fare using radio transmissions? I suspect physical connections using optical fiber will always be orders of magnitude faster. I would put my money on undersea data centers like China is building. They have the advantage in every aspect. Cheaper to build with no exorbitant launch costs. Easier to maintain and to upgrade as technology advances. Fast fiber connections. Good cooling using seawater rather than requiring large radiators. No hazards from space debris or micrometeoroids.
Aermaco
Everyone but Tristan sees the hype, the problems, and the illogic. Remember, Elon wants to live on Mars; he wants to drill through the earth with elevators to get traffic down there just to reduce surface car traffic, claiming traffic reduction in the sky is dumb as "hub caps will fall from flying cars"; he wants to have robots proliferate before they're known to be completely safe. So it's no surprise he now wants HUGE servers in space with all the dumb issues noted above. Why? It's simply stupid profit logic to promote Teslas and rocket factories.
Undersea seems best for many reasons, but what would be best is if the server's chips could be 100x to 1000x faster smaller cooler using vastly less energy.
Steve7734
The spacex proposal is reckless. A million more satellites in orbit, being replaced and deorbited on an ongoing basis. We already seem to be on the edge of Kessler syndrome with 30,000 satellites. Before anything like that is built we should have an orbital immune system constantly cleaning out debris... but that would require global cooperation...
MantisShrimpGiant
Musk apparently said "adding that AI compute resources will be cheaper to generate in space than on the ground within three years from now."
Musk is not very good at the timeline of these predictions & projections. Will this happen eventually? Yes. Will it be 3 years? I don't think so. Way too many complex variables, and launch cost is just the easiest to wrap your head around.
Musk has an internal cost of about $1k/kilo to ship his stuff to outer space right now. Assuming he hits his $100/kilo goal in 3 years, you still need new chip designs, new cooling designs, a maintenance model, ways to modularly expand and decommission, etc. My guess is this is actually about an 11-year timeline, and that's only if some breakthrough isn't developed first that negates the need for this completely.
Garrulinae
Musk always comes out with ridiculous hyperbole. Still, readers shouldn't write the idea off entirely. As the article points out, Google and Nvidia are exploring the possibilities, amongst others. There are apparently some more level heads who consider the idea potentially feasible.
Christian
The biggest advantage of AI is that it'll support the rapidly shrinking human population. It would be impossible to maintain the level of manufacturing we have now with even a 5-10% drop in human numbers. But with AI and increased automation we might be able to sustain things...maybe...