Google Unveils ‘Project Suncatcher’ for In-Space Computing
The technology sector is rapidly constructing data centers for artificial intelligence and acquiring land to support that expansion. The enormous energy requirements and logistical complexity of these facilities have prompted interest in space-based infrastructure as an alternative.
Industry leaders such as Jeff Bezos and Elon Musk have floated the idea of deploying GPUs in space, and Google has now confirmed its own initiative in this area. Google’s latest project, known as Project Suncatcher, aims to develop scalable networks of orbiting TPUs.
Project Suncatcher is a new “moonshot” initiative dedicated to advancing in-space computing and solar-powered satellite technology. Scheduled for an orbital demonstration with Planet in early 2027, Project Suncatcher originates from X, Google’s division known as the “moonshot factory.”
The project will investigate whether a network of solar-powered satellites carrying Google’s Tensor Processing Unit (TPU) AI chips can deliver scalable artificial intelligence computing in space. Google published early research findings indicating that space-based machine learning computation is feasible under current physics and economic considerations. “In the future, space may be the best place to scale AI compute,” Google executives wrote in a blog post.
According to the pre-print study released by Google, the approach offers notable benefits. Suncatcher satellites would occupy a dawn-dusk sun-synchronous low-Earth orbit, ensuring near-continuous exposure to sunlight. This arrangement sidesteps the persistent electricity costs that burden terrestrial data centers; moreover, solar panels in such an orbit can generate up to eight times more energy over a year than equivalent panels on Earth’s surface. Sustained, high-output solar power in turn supports continuous, high-density computing.
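To put that figure in perspective, the following back-of-the-envelope sketch compares the annual energy yield of one square meter of panel in continuous sunlight versus on the ground. The irradiance, illumination fraction, and capacity-factor values are illustrative assumptions chosen for this example, not numbers taken from Google’s paper.

```python
# Rough comparison of annual solar energy yield per square meter of panel
# in a dawn-dusk sun-synchronous orbit vs. on the ground.
# All inputs below are illustrative assumptions, not figures from Google.

HOURS_PER_YEAR = 8766  # average year, including leap years

# In orbit: full solar constant, near-continuous illumination
# (no night, no atmosphere, no weather).
solar_constant_w_m2 = 1361          # assumed irradiance above the atmosphere
orbit_illumination_fraction = 0.99  # assumed for a dawn-dusk orbit (rare eclipses)

orbit_kwh_per_m2 = solar_constant_w_m2 * orbit_illumination_fraction * HOURS_PER_YEAR / 1000

# On the ground: irradiance reduced by the atmosphere, and the panel only
# produces power part of the time (night, weather, sun angle).
ground_peak_w_m2 = 1000        # assumed "peak sun" irradiance at the surface
ground_capacity_factor = 0.20  # assumed typical fixed-tilt capacity factor

ground_kwh_per_m2 = ground_peak_w_m2 * ground_capacity_factor * HOURS_PER_YEAR / 1000

print(f"Orbit:  ~{orbit_kwh_per_m2:,.0f} kWh/m^2/year")
print(f"Ground: ~{ground_kwh_per_m2:,.0f} kWh/m^2/year")
print(f"Ratio:  ~{orbit_kwh_per_m2 / ground_kwh_per_m2:.1f}x")
```

Under these assumptions the orbital panel yields roughly seven times more energy per year, in the same ballpark as the “up to eight times” figure cited by Google.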
However, several engineering challenges, such as thermal management, high-bandwidth ground communications, and reliable on-orbit systems, must be addressed.
Google requires that TPUs operate reliably for at least five years, which in the planned orbit corresponds to an expected cumulative radiation dose of about 750 rad. To test this, Google subjected its latest v6e Cloud TPU (Trillium) to a 67 MeV proton beam. The results showed that while the chips’ memory is the component most susceptible to damage, the TPUs withstood nearly 2 krad, roughly three times the anticipated dose, before data integrity was compromised.
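The margin Google cites follows directly from those two figures; the small arithmetic sketch below simply divides the observed tolerance by the expected five-year dose, using the numbers reported above.

```python
# Radiation margin implied by the figures in the article.
expected_5yr_dose_rad = 750     # expected shielded dose over a five-year mission
observed_tolerance_rad = 2000   # approximate dose at which memory errors appeared

margin = observed_tolerance_rad / expected_5yr_dose_rad
print(f"Tolerance margin: ~{margin:.1f}x the expected five-year dose")
# -> ~2.7x, the "approximately three times" cited above
```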
In collaboration with Planet, Google intends to deploy two prototype satellites by early 2027 to assess the functionality of its machine learning models and TPU hardware in space and to validate the application of optical inter-satellite links for distributed machine learning tasks.
According to Planet, this focus on sophisticated, high-performance space computation aligns with its technology development trajectory for the recently announced Owl mission. The Project Suncatcher prototypes will utilize the same satellite bus as the Owl mission.
Looking ahead to the mid-2030s, Google projects that launch costs may fall to roughly $200 per kilogram, at which point space-based data centers could reach cost parity with terrestrial equivalents.
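The parity argument can be sketched as a simple amortization: spread the cost of launching one kilogram of satellite over the energy that kilogram can deliver during its operational life, then compare the result with a terrestrial electricity price. The power-to-mass ratio, lifetime, and electricity price below are illustrative assumptions, not figures from Google’s analysis; only the $200/kg launch price comes from the article.

```python
# Illustrative amortization of launch cost into an equivalent energy price.
# All figures except the launch price are assumptions chosen for this sketch.

launch_cost_per_kg = 200.0       # projected mid-2030s launch price, USD/kg (from the article)
specific_power_w_per_kg = 100.0  # assumed whole-satellite power-to-mass ratio
lifetime_years = 5.0             # assumed operational lifetime
HOURS_PER_YEAR = 8766

# Energy one kilogram of launched mass can deliver over its lifetime (kWh).
lifetime_kwh_per_kg = specific_power_w_per_kg / 1000 * HOURS_PER_YEAR * lifetime_years

# Launch cost expressed as an equivalent energy price (USD/kWh).
launch_cost_per_kwh = launch_cost_per_kg / lifetime_kwh_per_kg

terrestrial_price_per_kwh = 0.08  # assumed industrial electricity price, USD/kWh

print(f"Launch cost, amortized:  ~${launch_cost_per_kwh:.3f}/kWh")
print(f"Terrestrial electricity: ~${terrestrial_price_per_kwh:.3f}/kWh")
```

With these assumptions the amortized launch cost lands near $0.05/kWh, comparable to typical terrestrial electricity prices, which is the intuition behind the cost-parity claim.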
Conventional data centers consume substantial quantities of energy and water, generate noise, and often encounter community resistance due to their impact on local environments. Relocating these facilities to space presents a potential solution to these concerns, although it also introduces new challenges.