In the early hours of Friday, Beijing time, Elon Musk appeared as a guest on the popular tech podcast Dwarkesh Podcast, once again expanding on 'space GPUs,' a hot topic in global capital markets.
“Mark my words, within 36 months, space will become the cheapest place to deploy artificial intelligence.”
Addressing the recent heated discussion of 'space data centers,' Musk approached the topic from several angles.
The world’s richest man pointed out that the core reason for sending data centers into space is that the growth of electricity supply cannot keep up with chip production. Musk remarked, “Chip output is almost growing exponentially, but power generation remains flat. So how do you power all these chips? With magic power sources? Magic electricity elves?”
He also made a rather startling prediction: by the end of this year, operators will hit a point where they cannot power up large clusters. Chips will pile up, but there won't be enough electricity.
He also noted that putting solar panels in space not only yields higher energy efficiency and eliminates the need for battery storage, but also avoids the complicated approval procedures required for building photovoltaic farms on the ground. From this perspective, large-scale expansion will be harder on the ground than in space.
Musk casually outlined a timeline for the economic feasibility of space GPUs: “Any solar panel in space can generate approximately five times the power of one on the ground. At the same time, you don’t need to bear the cost of configuring batteries to get through the night. In fact, deploying in space could be much cheaper. My judgment is that running AI in space will become the lowest-cost option, and overwhelmingly so. This shift will happen within 36 months, and it might even take just 30 months.”
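The "roughly five times" figure can be sanity-checked with a back-of-envelope energy comparison. The irradiance values below are standard reference numbers, not figures from the podcast, and the sun-hour assumptions (an always-lit orbit, a good ground site averaging about five peak-sun-hours per day) are illustrative assumptions; under them, the per-square-meter energy ratio lands in the same ballpark as Musk's claim.

```python
# Back-of-envelope check of the "~5x more power in space" claim.
# Irradiance figures are standard reference values; the sun-hour
# assumptions below are ILLUSTRATIVE, not from the podcast.
SPACE_IRRADIANCE_W = 1361   # W/m^2, solar constant above the atmosphere
GROUND_PEAK_W = 1000        # W/m^2, standard test-condition irradiance
GROUND_SUN_HOURS = 5.0      # assumed peak-sun-hours/day at a good site
SPACE_SUN_HOURS = 24.0      # assumed continuously sunlit orbit (no eclipse)

space_kwh = SPACE_IRRADIANCE_W * SPACE_SUN_HOURS / 1000    # kWh/m^2/day
ground_kwh = GROUND_PEAK_W * GROUND_SUN_HOURS / 1000       # kWh/m^2/day
print(f"space:  {space_kwh:.1f} kWh/m^2/day")
print(f"ground: {ground_kwh:.1f} kWh/m^2/day")
print(f"ratio:  {space_kwh / ground_kwh:.1f}x")
```

Under these assumptions the ratio comes out a bit above five; a less favorable ground site, or accounting for battery losses at night, would push the advantage further toward the space side.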
On the question of maintaining GPUs in space, Musk pointed out that chips tend to suffer early-life failures on delivery, which can be dealt with on the ground: after an initial debugging period, they can be sent up. Once chips survive that stage, they become quite reliable, so maintenance should not be an issue.
Musk reiterated enthusiastically: “You can remember what I’m saying. Within 36 months, or even closer to 30 months, putting AI in space will be the most economically attractive option. And from that point onward, the advantages of placing it in space will become absurdly good.”
What the world's richest man struggles to buy: power generation equipment
Amid a shortage of electricity supply, Musk also explained why data centers cannot be equipped with large-scale colocated power facilities: gas turbines are unavailable, and US tariffs make imported solar panels excessively expensive.
Musk explained: "The bottleneck for gas turbines lies in the guide vanes and blades inside the turbine; casting those blades and vanes is a highly specialized process. Other forms of power generation are difficult to scale up. Solar can in theory be expanded, but the tariffs the U.S. currently imposes on imported photovoltaic products are astonishingly high, while domestic production capacity is extremely limited."
As for solar cells intended for space, Musk said that because there is no weather in orbit, they need less glass and no heavy mounting frames, which actually makes them 5 to 10 times cheaper than their terrestrial counterparts.
Musk also expressed frustration, stating that outsiders have no idea how power-intensive it is to operate data centers.
He explained that, apart from NVIDIA chips, power must also be supplied to all networking hardware and storage devices, and planning must account for peak cooling demands at each location. He mentioned that at xAI’s data center in Memphis, cooling alone increases power consumption by 40%, and power equipment requires offline maintenance, necessitating an additional 20%-25% capacity. Therefore, approximately one gigawatt of power capacity is needed to support 330,000 GB300 units.
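The Memphis figures above can be checked with simple arithmetic. The 40% cooling overhead, the 20-25% maintenance headroom, and the 330,000-unit / roughly one-gigawatt totals come from the article; the per-unit IT draw (each GB300 plus its share of networking and storage) is an assumed illustrative value that Musk did not give, chosen here so the pieces can be multiplied through.

```python
# Back-of-envelope check of the Memphis sizing figures quoted above.
# The cooling overhead, maintenance headroom, and unit count come from
# the article; the ~1.7 kW per-unit IT draw is an ASSUMED value.
UNITS = 330_000            # GB300 units (from the article)
IT_DRAW_PER_UNIT_KW = 1.7  # assumed all-in IT draw per unit, kW
COOLING_OVERHEAD = 0.40    # cooling adds ~40% (from the article)
MAINT_HEADROOM = 0.25      # extra 20-25% capacity for offline maintenance

it_load_mw = UNITS * IT_DRAW_PER_UNIT_KW / 1_000
with_cooling_mw = it_load_mw * (1 + COOLING_OVERHEAD)
total_capacity_mw = with_cooling_mw * (1 + MAINT_HEADROOM)

print(f"IT load:           {it_load_mw:,.0f} MW")
print(f"+ cooling:         {with_cooling_mw:,.0f} MW")
print(f"+ maint. headroom: {total_capacity_mw:,.0f} MW")
```

With a per-unit draw in that range, the total lands just under one gigawatt, consistent with the "approximately one gigawatt for 330,000 GB300 units" figure.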
Incidentally, Musk also expressed concern over the skyrocketing prices of memory chips, stating that the path to manufacturing logic chips appears clearer than securing sufficient memory to support those chips. Jokingly, he remarked, 'If you’re stranded on a deserted island and write “help” in the sand, no one will come. But if you write “DDR memory,” boats will flock to you.'
In his vision, the bottleneck before AI reaches space is energy; once in space, the bottleneck shifts to chips. Hence, in his conception, TeraFab will not only need to produce its own logic chips but may eventually have to handle memory and packaging in-house as well.
Editor/Melody