The future of data centers is taking a dramatic turn as companies explore the potential of operating in low Earth orbit. Industry leaders, including Elon Musk, Jensen Huang, Jeff Bezos, and Sam Altman, are now openly discussing the viability of the concept. A recent white paper from the Nvidia-backed startup Starcloud argues that putting artificial intelligence (AI) data centers in space could address major challenges facing terrestrial facilities, including energy shortages and infrastructure limitations.
As demand for AI workloads continues to escalate, terrestrial data centers are expected to hit capacity limits by the end of the decade, with multi-gigawatt electricity needs compounding already worsening grid bottlenecks. Starcloud's proposal suggests that orbital data centers could bypass these constraints by harnessing nearly continuous solar energy and using passive radiative cooling to maintain optimal operating temperatures in space.
According to Starcloud’s report, “Orbital data centers can leverage lower cooling costs using passive radiative cooling in space to directly achieve low coolant temperatures.” The company points out that these facilities can be scaled rapidly and almost indefinitely due to their modular designs, which allow for efficient deployment via SpaceX rockets. The timing for this opportunity is particularly favorable, as new reusable and cost-effective heavy-lift launch vehicles are becoming available, alongside advancements in in-orbit networking.
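To put the solar-power and cooling claims in rough perspective, here is a simple back-of-envelope sketch in Python. The compute load, panel efficiency, radiator temperature, and emissivity below are illustrative assumptions, not figures from Starcloud's white paper; the sketch only indicates the order of magnitude of solar array and radiator area that a purely solar-powered, radiatively cooled facility would imply.

```python
# Back-of-envelope sizing for an orbital data center (illustrative assumptions only).
SOLAR_CONSTANT = 1361.0   # W/m^2, solar irradiance above the atmosphere
PANEL_EFFICIENCY = 0.30   # assumed solar-cell efficiency
SIGMA = 5.670e-8          # W/(m^2 K^4), Stefan-Boltzmann constant
EMISSIVITY = 0.90         # assumed radiator emissivity
RADIATOR_TEMP_K = 300.0   # assumed radiator surface temperature
LOAD_WATTS = 40e6         # assumed 40 MW IT load (not a Starcloud figure)

# Solar array area needed to supply the load under continuous sunlight.
array_area_m2 = LOAD_WATTS / (SOLAR_CONSTANT * PANEL_EFFICIENCY)

# Radiator area needed to reject the same power purely by thermal radiation
# (single-sided radiator; the ~3 K deep-space background is neglected).
radiator_area_m2 = LOAD_WATTS / (EMISSIVITY * SIGMA * RADIATOR_TEMP_K**4)

print(f"Solar array: ~{array_area_m2:,.0f} m^2")
print(f"Radiator:    ~{radiator_area_m2:,.0f} m^2")
```

Under these assumptions, both areas come out to roughly 100,000 square meters, which illustrates why the white paper leans on modular designs and cheap heavy-lift launch rather than on any single monolithic structure.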
Starcloud has already made significant strides in this direction. On November 13, 2025, the company launched its Starcloud-1 satellite, which carries an Nvidia H100 GPU, the most powerful compute chip ever deployed in space. It then trained NanoGPT, a lightweight language model, on the complete works of Shakespeare, making it the first AI model trained in orbit.
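For a sense of what "training a lightweight language model on Shakespeare" involves, the sketch below trains a tiny character-level model with PyTorch on a local shakespeare.txt file. It is a simplified stand-in for NanoGPT, not Starcloud's flight software; the file path, model size, and hyperparameters are assumptions chosen for brevity.

```python
# Minimal character-level language model on Shakespeare (illustrative sketch only).
import torch
import torch.nn as nn
import torch.nn.functional as F

text = open("shakespeare.txt", encoding="utf-8").read()  # assumed local copy of the corpus
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}
data = torch.tensor([stoi[ch] for ch in text], dtype=torch.long)

block_size, batch_size, vocab = 64, 32, len(chars)

def get_batch():
    # Sample random contiguous chunks; targets are the inputs shifted by one character.
    ix = torch.randint(len(data) - block_size - 1, (batch_size,))
    x = torch.stack([data[i:i + block_size] for i in ix])
    y = torch.stack([data[i + 1:i + block_size + 1] for i in ix])
    return x, y

class BigramLM(nn.Module):
    """Each character directly predicts a distribution over the next character."""
    def __init__(self, vocab):
        super().__init__()
        self.table = nn.Embedding(vocab, vocab)

    def forward(self, idx):
        return self.table(idx)  # (batch, block, vocab) logits

model = BigramLM(vocab)
opt = torch.optim.AdamW(model.parameters(), lr=1e-2)

for step in range(2000):
    xb, yb = get_batch()
    logits = model(xb)
    loss = F.cross_entropy(logits.view(-1, vocab), yb.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(step, loss.item())
```

The real NanoGPT uses a small transformer rather than this bigram table, but the workflow, streaming batches of text through a model and updating its weights by gradient descent, is the same kind of workload Starcloud reports running on the orbiting H100.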
Additionally, Starcloud is running Google's open-source large language model, Gemma, in space. It is the first time a high-powered Nvidia GPU has been used to run a large language model outside Earth's atmosphere.
Transitioning some data center operations to low Earth orbit may serve as a short-term answer to surging demand for power and resources while nuclear power generation on the ground ramps up. The shift could also ignite a new wave of investment reminiscent of the space race. Notably, SpaceX is preparing for a public offering next year at an anticipated valuation of $800 billion, and the company's Starlink constellation is expected to play a crucial role in its space-based data center plans.
As the industry moves forward, orbital data centers could reshape the landscape of AI infrastructure, easing existing terrestrial constraints and paving the way for further advances in the technology.
