The Economist
Elon Musk thinks it will be feasible “within two years, maybe three at the latest”. Sam Altman of OpenAI says it is “ridiculous … we are not there yet”. Google plans to test the concept next year. Eric Schmidt, its former boss, has bought a rocket-launch company to pursue it.
At issue is whether the best place to build data centres for artificial intelligence is not on Earth but in space.
It is getting harder to build terrestrial data centres. Of the global capacity due to come on stream this year, 30 per cent to 50 per cent could be delayed, according to Sightline Climate, a research outfit, up from 26 per cent in 2025.
There are many reasons for this. Winning construction permits and establishing grid connections takes time; public opposition, of the sort that has led several American states to propose moratoriums on new projects, is high; and demand for electricity is soaring.
That is why putting a constellation of number-crunching satellites into orbit, where solar energy is abundant, strikes some as a good idea. Musk has just merged SpaceX, his rocket company, with xAI, his AI startup, with this aim in mind, and applied for a licence to build an orbital data centre consisting of up to 1 million satellites. But does it make sense?
The most obvious barrier is launch cost. SpaceX delivers payloads to orbit at a price of around $US1500 ($2107) per kilogram with its Falcon Heavy, or $US3400/kg with its Falcon 9. (The actual cost to SpaceX is about 25 per cent of this sum.)
But two other numbers are just as crucial: specific power (how many watts of processing power can be provided per kilogram of satellite) and satellite cost (in dollars per watt of processing power). Those depend, in large part, on the weight and performance of solar panels and heat-emitting radiators. Another unknown is the impact of radiation on the reliability of AI chips running in space.
Estimates of all these numbers are needed to determine the feasibility of orbital data centres.
Andrew McCalip, an engineer who works at Varda, a space startup, has built a web-based calculator that allows the cost of an orbital data centre of a given capacity to be compared with that of a terrestrial one.
It estimates that building a data centre with an extremely high capacity of 1GW and running it for five years on Earth costs $US15.9 billion.
An orbital equivalent, assuming a launch cost of $US500/kg, a specific power of 37W/kg, a satellite cost of $US22/W and special orbits that keep the satellites in daylight 98 per cent of the time, would cost an exorbitant $US51.1 billion. (Those totals exclude the cost of the AI chips, or GPUs, some $US15 billion to $US30 billion, since the same chips are needed either way.)
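The shape of that comparison can be sketched in a few lines. McCalip's calculator is more detailed than this, and its internals are only approximated here; the toy model below (launch mass times launch price, plus hardware cost per watt, scaled by a replacement allowance for failed satellites that is our own assumption) happens to land close to the $US51.1 billion figure.

```python
def orbital_cost(capacity_w, specific_power_w_per_kg, cost_per_watt,
                 launch_price_per_kg, fail_rate=0.09, years=5):
    """Back-of-envelope cost of an orbital data centre, GPUs excluded.

    A crude replacement allowance covers satellites lost to the assumed
    annual failure rate over the operating period.
    """
    mass_kg = capacity_w / specific_power_w_per_kg
    launch = mass_kg * launch_price_per_kg    # getting it all to orbit
    hardware = capacity_w * cost_per_watt     # building the satellites
    replacement = 1 + fail_rate * years       # extra launches to cover failures
    return (launch + hardware) * replacement

# The baseline above: 1GW at 37W/kg, $US22/W, $US500/kg launch price
print(round(orbital_cost(1e9, 37, 22, 500) / 1e9, 1))  # 51.5 ($US billions)
```

With the more optimistic figures discussed later (70W/kg and $US5/W), the same toy model gives about $US17.6 billion, in the same ballpark as the calculator's $US16.7 billion.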
So a slam-dunk for Earth, then? Not quite.
Starcloud, a company founded in 2024 to pursue the idea of orbital data centres, has been crunching the numbers for AI in orbit – literally. In November, the company sent Starcloud-1, a fridge-size satellite containing an ordinary Nvidia H100 GPU, of the type used in AI data centres, into space.
Starcloud used it to train a small AI language model, NanoGPT, on the works of Shakespeare, and to answer some queries while running Gemma, an open-source large language model made by Google.
That provided valuable data on the reliability of AI chips under orbital conditions. The company also has a good handle on the other crucial figures.
Stats of play
Start with specific power. McCalip’s figure of 37W/kg comes from the thousands of satellites used in SpaceX’s Starlink constellation, thought to be state of the art, which provide high-speed internet to users around the world.
But Starlink satellites have to do things that AI satellites do not. They need costly “phased-array” antennas to communicate with the ground, and must have a stable orientation at all times. AI satellites, by contrast, would not need to communicate with the ground—only with their neighbours, using laser links. They would, therefore, be able to devote much more of their mass to delivering processing power.
And without the need for such accurate pointing, they could have solar panels that are slightly flexible, reducing their mass and further boosting specific power.
Musk has said that he thinks a specific power of 100W/kg is feasible for an AI satellite, and some believe that by using more efficient solar cells even 150W/kg may be possible in future.
Philip Johnston, Starcloud’s boss, says his firm is aiming for a specific power of 70W/kg for its forthcoming satellites, based on what it considers to be quite conservative assumptions.
Moving on to satellite cost, Johnston says Starcloud expects its design to cost “less than $US5 per watt” when GPU costs are excluded. McCalip estimates that Starlink’s current satellites cost around $US22/W, down from $US32/W for its original version. Again, an AI satellite should cost less to build, GPUs aside, because it does not require costly communications components.
Move the sliders on McCalip’s calculator to a specific power of 70W/kg and a satellite cost of $US5/W, and the numbers look rather different: now the 1GW orbital data centre costs $US16.7 billion, only 5 per cent more than the terrestrial one.
A number of optimistic assumptions are needed to get there. First, a launch price of $US500/kg, roughly a third of what is available today. But if SpaceX’s new Starship rocket starts working, launch costs could fall fast. Because Starship is designed to be fully reusable, the price of sending a kilogram into orbit could drop to $US100 to $US200, says Johnston. (The actual cost to SpaceX would be much less, possibly as low as $US20/kg.)
Put a launch price of $US200/kg into McCalip’s calculator, and the cost of the 1GW orbital data centre drops to $US12 billion—less than the terrestrial one. The idea, in short, may not be quite as crazy as it looks.
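A quick back-of-envelope check of that scenario, under our own simplified assumptions (launch mass plus satellite hardware, with a crude allowance for replacing the 9 per cent of GPUs assumed to fail each year; the calculator itself is more detailed), lands in the same region:

```python
capacity_w = 1e9            # a 1GW data centre
specific_power = 70.0       # W/kg, Starcloud's target
cost_per_watt = 5.0         # $US/W, GPUs excluded
launch_price = 200.0        # $US/kg, a Starship-era assumption
replacement = 1 + 0.09 * 5  # 9% annual failures over five years

mass_kg = capacity_w / specific_power
total = (mass_kg * launch_price + capacity_w * cost_per_watt) * replacement
print(f"${total / 1e9:.1f}bn")  # $11.4bn, near the calculator's $US12 billion
```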
Another unknown is cooling. Starcloud’s initial satellite could not run its GPU round the clock because (as expected) it got too hot. The firm plans to launch a second test satellite, Starcloud-2, this year to evaluate its design for an unfolding radiator, to provide cooling. Johnston says it will be “the largest commercial deployable radiator in space”, second in size only to the radiator on the International Space Station, but providing 10 times as much heat dissipation per kilogram. Starcloud’s cost estimates assume that this radiator will work as planned.
Other assumptions may be too pessimistic. For one, McCalip’s calculator assumes that as many as 9 per cent of GPUs launched into orbit will fail every year. But one lesson from Starcloud-1, says Johnston, is that “GPUs work better in space than we had expected.” He is reluctant to share the exact figures. But if only 5 per cent of GPUs fail each year, fewer satellites would be needed, and the cost of the orbital data centre would drop to $US11.1 billion.
Of course, the costs of terrestrial data centres can come down too—and such reductions might be easier to achieve than building ones in space. McCalip’s calculator assumes that terrestrial data centres rely on natural gas generators for electricity, but solar would be cheaper, knocking perhaps $US1 billion to $US2 billion off the total cost. Construction might also be much less expensive outside America, particularly in a low-wage economy with abundant sunshine, such as India.
For now, the thing to keep an eye on is whether Starship can be made to work in a reliable and reusable manner. Musk is talking up the possibility of orbital data centres as he prepares to take SpaceX public, sometime in the coming year.
For its part, Starcloud is skating to where it expects the puck to be in a couple of years, assuming that Starship opens up opportunities based on low-cost launch. Starship’s next test flight, its 12th, is expected to take place in March. Many in the AI industry will be watching closely.