Series A

ODC announces Series A to scale the sovereign Distributed Compute Grid

The Network
That Thinks.

ODC delivers the Sovereign Distributed Compute Grid for the AI-Native era, unifying communication, sensing, and edge intelligence into a single open-architecture fabric.

Investors

Three pillars
One sovereign fabric

The Odyssey Platform collapses three historically separate domains into a single, open, programmable stack — purpose-built for AI-native networks.

Autonomous Communication

Software-defined 5G/6G with no vendor lock-in. A programmable RAN stack that decouples hardware from software, enabling mission-critical wireless on open, sovereign silicon.

Environmental Sensing

Turning the RAN into a high-resolution spatial sensor. ODC enables passive and active sensing overlays — detecting, locating, and classifying objects through the same spectrum used for communications.

Edge Intelligence

Real-time generative inference and Agentic AI at the forward edge. ODC co-locates AI compute directly within the RAN, enabling sub-millisecond decision loops without round trips to a centralized cloud.
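To make the sub-millisecond budget concrete: 5G NR shortens its scheduling slot as subcarrier spacing grows, so any AI decision loop running inside the RAN must fit well under a millisecond at higher numerologies. A minimal back-of-the-envelope sketch of the standard 3GPP slot-duration arithmetic (plain Python, no ODC APIs assumed):

```python
# 5G NR divides each 1 ms subframe into 2**mu slots, where mu is the
# numerology (subcarrier spacing = 15 kHz * 2**mu, per 3GPP TS 38.211).
def slot_duration_ms(mu: int) -> float:
    return 1.0 / (2 ** mu)

for mu, scs_khz in [(0, 15), (1, 30), (2, 60), (3, 120)]:
    print(f"SCS {scs_khz:>3} kHz -> slot = {slot_duration_ms(mu)} ms")
```

At 30 kHz spacing (common for mid-band 5G) the slot is 0.5 ms, and at 120 kHz it is 0.125 ms, which is why co-locating inference with the RAN, rather than backhauling to a distant cloud, is what keeps per-slot decisions feasible.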

Built in America
Open by design

ODC is a US-headquartered sovereign technology stack, purpose-built for the United States Government, National Security, and allied telecom operators who require full supply-chain assurance and an architecture they control.

US Headquartered

All IP and infrastructure within the US ecosystem

Open Architecture

No single-vendor dependency — full auditability at every layer

Supply Chain Assured

Vetted, resilient supply chain built to national security standards

USG & National Security Ready

Built from the ground up on cloud-native and zero-trust principles

Built on NVIDIA

When we say “The Network That Thinks,” it isn’t a metaphor. It is a direct consequence of running our RAN software stack on the NVIDIA Aerial RAN Computer family — the world’s most powerful AI-RAN hardware platforms.

The Aerial RAN Computer family delivers GPU-accelerated Layer 1 processing and co-located AI inference in a single 1U platform — enabling real-time generative inference, adaptive beamforming, and spectrum analytics simultaneously at the edge.

This is what tethers “the thinking network” to silicon that ships today. Not vaporware. Not a roadmap item.

Built on: Aerial RAN Computer Family (AI-RAN Infrastructure Platform), ODC Certified

Architecture: GPU-accelerated Layer 1 and Layer 2
AI Workloads: Co-located Inference
Form Factor: Edge Server
Latency: < 1 ms
Standard: 3GPP / O-RAN
ODC Mode: AI-for-RAN, AI-on-RAN, and AI-and-RAN