The Silent Revolution
You’ve heard the hype: data centers in space, powered by unlimited solar energy, cooled by the vacuum of space. It sounds like science fiction.
It’s not. It’s already running.
In January 2026, Canadian company Kepler Communications quietly launched the largest orbital compute cluster in history. Forty Nvidia Orin edge processors, spread across 10 operational satellites, linked together by laser communications. Today, it has 18 paying customers. And this week, it announced its newest: Sophia Space, a startup that will attempt something never done before - configuring a distributed operating system across multiple GPUs in orbit.
Welcome to the era of edge processing in space. While Elon Musk dreams of million-satellite data centers, Kepler is actually building one. And the implications for India - for our defense, our startups, and our digital future - are immediate.
Read also: Claude vs. ChatGPT vs. Gemini: The Winner Isn't Who You Think
Not a Data Center. An "Infrastructure Layer."
Forget the image of a giant server farm floating in space. That’s decades away. Kepler’s CEO Mina Mitry is careful with his language: “We don’t see ourselves as a data center company. We are infrastructure for applications in space”.
Think of it as a distributed edge network. Each satellite carries modest compute power. They’re connected by laser links that transmit data at the speed of light, in real time, without touching the ground. The U.S. military is already a key customer, using the cluster to process data from missile defense satellites that track threats.
This is the opposite of cloud computing. Instead of sending petabytes of raw sensor data down to Earth for processing, Kepler processes it in situ. By the time a satellite passes over a conflict zone, the insights are already generated.
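To see why in-situ processing matters, here's a rough back-of-envelope sketch. The data volumes and downlink rate are illustrative assumptions for the arithmetic, not Kepler's actual figures:

```python
# Back-of-envelope comparison: downlinking raw sensor data vs. sending
# only onboard-computed insights. All numbers are illustrative
# assumptions, not Kepler specifications.

RAW_PASS_DATA_GB = 50        # assumed raw imagery collected in one pass
INSIGHT_DATA_MB = 5          # assumed size of processed detections/alerts
DOWNLINK_RATE_GBPS = 1.2     # assumed ground-station link rate, Gbit/s

def downlink_seconds(data_gb: float, rate_gbps: float) -> float:
    """Seconds to transmit `data_gb` gigabytes at `rate_gbps` gigabits/s."""
    return data_gb * 8 / rate_gbps

raw_time = downlink_seconds(RAW_PASS_DATA_GB, DOWNLINK_RATE_GBPS)
insight_time = downlink_seconds(INSIGHT_DATA_MB / 1000, DOWNLINK_RATE_GBPS)

print(f"raw downlink: {raw_time:.0f} s, insights only: {insight_time:.3f} s")
```

Under these assumed numbers, shipping insights instead of raw pixels cuts the transmission burden by four orders of magnitude - which is the whole argument for processing at the edge.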
Sophia Space’s $10 Million Bet on “Passive Cooling”
The biggest barrier to orbital compute isn't launch costs - it's heat. In the vacuum of space, there’s no air to carry heat away from processors. Today’s data center chips would melt themselves in minutes.
Sophia Space is solving this with passively cooled computers. No heavy radiators, no circulating fluids. Just clever materials science that radiates heat directly into space. Starting this year, they’ll upload their operating system to Kepler’s satellites and attempt to configure it across six GPUs on two separate spacecraft.
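The physics behind passive cooling is radiative: in vacuum, the only way to shed heat is the Stefan-Boltzmann law, P = ε·σ·A·T⁴. A quick sketch (with an assumed emissivity, surface temperature, and heat load - not Sophia Space's actual design) shows the radiating area such a scheme implies:

```python
# Radiative cooling back-of-envelope. Emissivity, temperature, and heat
# load are illustrative assumptions, not Sophia Space's design figures.

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(heat_watts: float, temp_k: float,
                     emissivity: float = 0.9) -> float:
    """Surface area needed to radiate `heat_watts` at temperature `temp_k`."""
    return heat_watts / (emissivity * SIGMA * temp_k ** 4)

# A 1 kW GPU payload radiating at 330 K (~57 C) needs roughly this area:
print(f"{radiator_area_m2(1000, 330):.2f} m^2")
```

A couple of square meters per kilowatt, under these assumptions - manageable for a small satellite, which is why the materials-science route is plausible at edge scale long before it works for a megawatt-class data center.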
If successful, they’ll de-risk their own satellite launch planned for late 2027. Sophia CEO Rob DeMillo points to an unexpected tailwind: Wisconsin just banned new data center construction. If the U.S. starts restricting on-premise compute, orbital alternatives suddenly look very attractive.
Read also: Inside the $30B Surge: How Anthropic is Quietly Winning the Enterprise War
India Is Already in This Race
While Western startups grab headlines, India is quietly positioning itself.
Agnikul Cosmos, the Chennai-based space startup, plans to launch an AI data center prototype in orbit by the end of 2026. Developed with Bengaluru’s NeevCloud, it will focus on AI inference - deploying trained models to analyze new data - rather than the energy-intensive training.
“In space, there is an availability of unlimited solar energy, and cooling is much more efficient because you are exposed to temperatures close to absolute zero,” Agnikul co-founder Srinath Ravichandran told PTI. “Additionally, it is physically safer because it is not easy to access a data center in orbit”.
The system will use a constellation of rocket upper stages in low Earth orbit, each hosting different data center modules. They’ll run in low-power mode during Earth’s shadow, then switch to high-power mode in sunlight. It’s an elegant solution to the power cycling problem that plagues orbital infrastructure.
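The sunlight/eclipse switching described above can be sketched as a simple duty-cycle rule. The orbital period and eclipse fraction below are typical low-Earth-orbit assumptions, not Agnikul's published figures:

```python
# Toy sketch of the sunlight/eclipse power-mode switch. Orbital period
# and eclipse fraction are generic LEO assumptions, not Agnikul figures.

ORBIT_PERIOD_MIN = 90        # a typical LEO orbital period, minutes
ECLIPSE_FRACTION = 0.38      # assumed fraction of each orbit in shadow

def power_mode(minutes_into_orbit: float) -> str:
    """Return 'high' while sunlit, 'low' in eclipse, for a simplified
    orbit that starts in sunlight and ends in Earth's shadow."""
    t = minutes_into_orbit % ORBIT_PERIOD_MIN
    sunlit_minutes = ORBIT_PERIOD_MIN * (1 - ECLIPSE_FRACTION)
    return "high" if t < sunlit_minutes else "low"
```

The real scheduler would also have to drain batteries gracefully and checkpoint workloads before each eclipse, but the basic rhythm - roughly an hour of full power, then half an hour of throttling, every orbit - is what any orbital compute stack must design around.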
Meanwhile, Starcloud (formerly Lumen Orbit) has raised $170 million at a $1.1 billion valuation to build space-based training clusters. They’ve already launched an Nvidia H100 into orbit and trained an LLM in space. Their pitch: “We should train future large AI models in space to make use of abundant solar energy, cooling, and the ability to freely scale up”.
The Economics: Why Now?
The global space-based edge computing market was valued at $168.91 billion in 2025. It’s projected to hit $345 billion by 2034. The drivers are relentless:
- Energy constraints on Earth: Hyperscale data centers are crashing against power grids and water permits.
- Falling launch costs: SpaceX’s reusable rockets have dropped the price of putting mass into orbit by over 90% in a decade.
- Latency requirements: For military and financial applications, sending data down to Earth and back up is too slow.
Kepler’s Mitry notes that the economics favor inference over training. “If this thing consumes kilowatts of power and you’re only running at 10% of the time, then that’s not super helpful. In our case, our GPUs are running 100% of the time”.
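Mitry's utilization point is just amortization arithmetic: a fixed orbital cost spread over however many GPU-hours you actually deliver. The dollar figure below is invented purely to show the ratio, not real Kepler economics:

```python
# Hedged illustration of the utilization argument: amortizing a fixed
# orbital cost over delivered GPU-hours. The cost figure is made up
# for the arithmetic, not real Kepler economics.

ANNUAL_FIXED_COST = 1_000_000   # assumed cost to keep one GPU in orbit, $/yr
HOURS_PER_YEAR = 8760

def cost_per_gpu_hour(utilization: float) -> float:
    """Effective cost of one delivered GPU-hour at a given duty cycle."""
    return ANNUAL_FIXED_COST / (HOURS_PER_YEAR * utilization)

print(f"10% duty cycle:  ${cost_per_gpu_hour(0.10):,.0f}/GPU-hr")
print(f"100% duty cycle: ${cost_per_gpu_hour(1.00):,.0f}/GPU-hr")
```

Whatever the true fixed cost is, a workload that keeps the hardware busy 10% of the time pays ten times more per useful hour - which is why always-on inference pencils out in orbit before bursty training does.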
The View from India
For Indian tech professionals, this shift has three immediate implications:
1. Defense and surveillance: India’s military has long relied on satellite imagery. But raw imagery is useless without processing. Orbital compute clusters will allow real-time threat detection, without the lag of downlinking to ground stations. The U.S. is already doing it. India will need to follow.
2. Startup opportunity: Agnikul and NeevCloud are just the beginning. The space compute stack - radiation-hardened chips, passive cooling, optical inter-satellite links - is wide open for innovation. Indian engineering talent is perfectly suited for this.
3. Career path shift: If you’re a developer or systems architect, orbital compute will demand new skills. How do you write code for a distributed system where latency between nodes is measured in milliseconds, but the nodes are moving at 7 km/s? How do you handle failover when a satellite passes into Earth’s shadow? These are not theoretical problems. They’re the future.
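As a toy illustration of that failover question, here's a sketch of a scheduler that routes work away from eclipsed nodes. The `Node` fields and the selection rule are invented for the example, not any real orbital scheduler's API:

```python
# Toy failover sketch: route a task to a healthy, sunlit satellite node,
# assuming each node reports its eclipse state and link latency.
# The Node structure and policy are invented for illustration.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    name: str
    in_eclipse: bool        # node is in Earth's shadow (low-power mode)
    link_latency_ms: float  # current laser-link latency to this node

def pick_node(nodes: list[Node]) -> Optional[Node]:
    """Prefer the lowest-latency node that is not in eclipse."""
    candidates = [n for n in nodes if not n.in_eclipse]
    return min(candidates, key=lambda n: n.link_latency_ms, default=None)

cluster = [
    Node("sat-a", in_eclipse=False, link_latency_ms=12.0),
    Node("sat-b", in_eclipse=True, link_latency_ms=4.0),   # shadowed, skip
    Node("sat-c", in_eclipse=False, link_latency_ms=8.0),
]
print(pick_node(cluster).name)  # picks the sunlit node with the best link
```

In a real constellation both fields change continuously as the satellites move, so this selection would have to be recomputed against predicted orbital geometry rather than a static list - exactly the kind of problem terrestrial distributed-systems training doesn't prepare you for.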
Read also: The AI Tool Everyone Trusted Just Became a Backdoor. Mercor Learned the Hard Way.
The Bottom Line
Kepler’s cluster is not the orbital data center of science fiction. It’s 40 GPUs, 10 satellites, and a laser network. It’s modest.
But it’s real. It’s operational. And it’s the first step toward a future where the most sensitive, compute-intensive workloads run not in Virginia or Mumbai, but 500 kilometers above your head.
India’s space agency ISRO has the launch capability. Its startups have the vision. Now it needs the urgency. Because while we’re debating data center permits in Bengaluru, Canada just put a cloud in the sky.
And it’s never coming down.
Share This With Someone Who Still Thinks Space Is Just Rockets
Tag a friend who works in cloud infrastructure. Share this in your tech WhatsApp group. Post it on LinkedIn with the caption: “The first orbital compute cluster is live. India’s space startups are already in the race.”
The next frontier of computing isn’t underwater. It’s in orbit.
Also read: France Just Declared War on Microsoft. Windows Is Out, Linux Is In.
FAQ
Q: How powerful is Kepler’s orbital compute cluster?
A: It currently has 40 Nvidia Orin edge processors across 10 satellites, linked by laser communications. That’s modest compared to terrestrial data centers, but it’s the largest in orbit today.
Q: What’s the difference between Kepler and Starcloud?
A: Kepler focuses on distributed edge inference - processing data where it’s collected. Starcloud aims for large-scale AI training in space, using data-center-grade GPUs. Both are viable approaches.
Q: Is India building orbital data centers?
A: Yes. Agnikul Cosmos (Chennai) and NeevCloud (Bengaluru) plan to launch an AI inference data center prototype by end of 2026, with commercial operations targeted for 2027.
Q: Why put data centers in space?
A: Three reasons: unlimited solar energy, passive cooling (waste heat is radiated directly into deep space), and no land-use permits or NIMBY opposition. Also, for military applications, orbital compute enables real-time processing without downlinking.
Q: When will large-scale orbital data centers arrive?
A: Experts expect we won’t see SpaceX or Blue Origin-scale orbital data centers until the 2030s. The near-term value is in edge processing for defense, surveillance, and scientific sensors.

Have a question about AI or the latest tech trends? We’d love to hear your thoughts!