Aria Networks Secures $125M for AI-Optimized Networking Solution
Startup Aria Networks has raised $125 million in funding to launch its Deep Networking platform, which focuses on optimizing networks for artificial intelligence applications. Founded by Mansour Karam in January 2025, Aria Networks aims to revolutionize the way networking supports AI by using advanced telemetry and a structured approach to network management.
Unlike traditional networking solutions that rely on a switch-centric model, Aria's methodology emphasizes a path-centric approach that leverages microsecond telemetry. This innovative strategy allows for real-time data collection and analysis across network components such as switches, transceivers, and host network interface cards (NICs).
Karam, who previously founded the intent-based networking company Apstra, which was acquired by Juniper Networks in 2020, stated, "For AI to be effective, it must be specialized for its domain, which requires building an architecture optimized specifically for AI."
Understanding Deep Networking
The Deep Networking platform is designed to position the network as an active participant in AI cluster performance rather than merely serving as passive infrastructure. By employing fine-grain telemetry at the application-specific integrated circuit (ASIC) level, along with intelligent agents at each layer of the network stack, the platform aims to enhance overall performance and efficiency.
Traditional network monitoring tools, such as NetFlow, often collect data after the fact at a coarse resolution. In contrast, Aria's platform captures telemetry in real-time with microsecond granularity. Karam elaborated, "We have embedded code inside the ASIC, on the ARM processors, that extracts telemetry to enable adaptive tuning of network parameters without the need for manual intervention."
Key Technical Features
One of the primary innovations of Deep Networking is its ability to adjust dynamic load balancing parameters and data center congestion notifications based on real-time data. The architecture is layered: lower layers respond immediately to link-level events, while higher layers make strategic decisions about traffic management across the cluster. A large language model-based agent sits on top, giving operators natural-language answers to intuitive queries about network conditions.
Karam warned that merely integrating a large language model with existing architectures could lead to significant issues, stating, "If you ask it to do anything, it could hallucinate and bring down the network, as it lacks the necessary context and data to operate safely."
New Metrics for Network Evaluation
Aria Networks is shifting the focus of networking evaluation from traditional metrics like bandwidth and latency to two new metrics: Model FLOPS Utilization (MFU) and token efficiency. MFU represents the ratio of achieved floating point operations per second (FLOPS) to the theoretical peak available, with Karam noting that typical MFU values for training workloads hover between 33% and 45%.
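The MFU definition above is a simple ratio. A minimal sketch in Python, using hypothetical throughput numbers (not figures from Aria) chosen to land inside the 33%–45% range cited for typical training workloads:

```python
def model_flops_utilization(achieved_tflops: float, peak_tflops: float) -> float:
    """MFU: ratio of achieved FLOPS to the theoretical peak available."""
    if peak_tflops <= 0:
        raise ValueError("peak FLOPS must be positive")
    return achieved_tflops / peak_tflops

# Illustrative numbers only: an accelerator with ~989 TFLOPS peak
# sustaining ~396 TFLOPS during training gives an MFU of ~40%.
mfu = model_flops_utilization(achieved_tflops=396.0, peak_tflops=989.0)
print(f"MFU: {mfu:.1%}")  # → MFU: 40.0%
```

In practice, "achieved FLOPS" is derived from the model's per-step compute and measured step time, so network stalls that lengthen step time directly depress MFU.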
Token efficiency is defined as tokens consumed per dollar or tokens produced per unit of time, emphasizing the network's crucial role in achieving high efficiency. Karam explained that a small issue, such as a malfunctioning NIC in a large cluster, could significantly impact overall performance and revenue generation.
According to Aria's analysis, even a modest 3% improvement in MFU across a large cluster could translate to approximately $49.8 million in annual revenue gains, underscoring the financial implications of network performance.
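The article does not publish Aria's underlying model, but if cluster output (and therefore revenue) is assumed to scale linearly with MFU, the arithmetic behind a figure like this can be sketched as follows. All inputs here are hypothetical, picked only to land near the cited magnitude:

```python
def annual_revenue_gain(cluster_annual_revenue: float,
                        baseline_mfu: float,
                        mfu_improvement_pts: float) -> float:
    """Estimated gain assuming revenue scales linearly with MFU.

    A simplifying illustration, not Aria's published methodology.
    """
    return cluster_annual_revenue * (mfu_improvement_pts / baseline_mfu)

# Hypothetical: a cluster earning $600M/year at a 36% baseline MFU,
# improved by 3 percentage points.
gain = annual_revenue_gain(600e6, baseline_mfu=0.36, mfu_improvement_pts=0.03)
print(f"${gain / 1e6:.1f}M per year")  # → $50.0M per year

# Token efficiency, as defined above, is a similarly simple ratio:
tokens_per_dollar = 2_000_000 / 50.0  # e.g. 2M tokens produced for $50
print(f"{tokens_per_dollar:,.0f} tokens/$")  # → 40,000 tokens/$
```

The linear-scaling assumption is the key caveat: a 3-point MFU gain on a 36% baseline is roughly an 8% increase in useful compute, which is what drives the large dollar figure on a big cluster.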
Hardware Offerings
To support its innovative approach, Aria Networks has developed a hardware portfolio leveraging Broadcom ASICs and a hardened implementation of SONiC. This portfolio includes three switch models:
- Aria Switch 800G: Features 64 x 800G OSFP ports based on the 51.2T Broadcom Tomahawk 5 ASIC.
- Aria Switch 1.6T High Radix: A 4RU air-cooled unit with 128 x 800G OSFP ports powered by the 102.4T Broadcom Tomahawk 6 ASIC.
- Aria Switch 1.6T: A 2RU switch supporting both air and liquid cooling, equipped with 64 x 1.6T OSFP ports.
Forward Deployed Engineers and Future Directions
Aria is implementing a unique model by embedding forward deployed engineers (FDEs) with customers from the outset. This approach is distinct from traditional professional services, as the FDEs work directly with clients to ensure that their insights and data feed back into the product development cycle.
Karam emphasized the significance of this model: "Everything the forward deployed engineers do ultimately gets engineered back into the products. They are totally aligned directionally with the product and not a separate business."
With a focus on continuous improvement and weekly software updates, Aria aims to enhance its platform's capabilities while maintaining user safety and network reliability. Karam concluded, "Job number one is to ensure your network is always up."
Source: Network World News