F5 & NVIDIA collaborate to enhance AI at network edge

F5 has announced BIG-IP Next Cloud-Native Network Functions (CNFs) deployed on NVIDIA BlueField-3 DPUs, an offering designed to enhance AI capabilities at the network edge for service providers.

This offering integrates F5's established network infrastructure features like edge firewall, DNS, and DDoS protection as cloud-native functions accelerated by NVIDIA's BlueField-3 DPUs. The solution aims to deliver improved performance in Kubernetes environments and cater to emerging edge AI use cases.
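Because the CNFs run as containerised workloads in Kubernetes, a platform team would typically declare them like any other deployment and steer them onto DPU-equipped nodes through the cluster's resource model. The sketch below, written with the standard Python Kubernetes client, shows the general shape of such a deployment; the namespace, the image name, and the nvidia.com/bf3-dpu resource label are illustrative assumptions, not F5 or NVIDIA specifics.

    from kubernetes import client, config

    def deploy_edge_firewall_cnf():
        # Load cluster credentials from the local kubeconfig.
        config.load_kube_config()

        # Container spec: the image and the DPU resource name are hypothetical
        # placeholders, not actual F5 or NVIDIA artefact names.
        container = client.V1Container(
            name="edge-firewall-cnf",
            image="registry.example.com/edge-firewall-cnf:latest",
            resources=client.V1ResourceRequirements(
                limits={"nvidia.com/bf3-dpu": "1"},  # assumed device-plugin resource
            ),
        )

        pod_template = client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "edge-firewall-cnf"}),
            spec=client.V1PodSpec(containers=[container]),
        )

        deployment = client.V1Deployment(
            api_version="apps/v1",
            kind="Deployment",
            metadata=client.V1ObjectMeta(name="edge-firewall-cnf"),
            spec=client.V1DeploymentSpec(
                replicas=2,
                selector=client.V1LabelSelector(match_labels={"app": "edge-firewall-cnf"}),
                template=pod_template,
            ),
        )

        # Create the deployment in an assumed "edge" namespace.
        client.AppsV1Api().create_namespaced_deployment(namespace="edge", body=deployment)

    if __name__ == "__main__":
        deploy_edge_firewall_cnf()

In practice, the DPU-backed resource would be exposed by whatever device plugin the platform ships, and node selectors or taints would keep CNF pods on BlueField-equipped hosts.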

The collaboration between F5 and NVIDIA targets the demands of AI inferencing at the edge, which requires a robust distributed infrastructure as telecom providers seek to add value. Ash Bhalgat, Senior Director of AI Networking and Security Solutions, Ecosystem and Marketing at NVIDIA, explained the significance: "As demand for AI inferencing at the edge takes centre stage, building an AI-ready distributed infrastructure is a key opportunity for telecom providers to create value for their customers. F5's cloud-native functions, accelerated with NVIDIA's BlueField-3 DPUs, create a powerful solution for bringing AI closer to users while offering unparalleled performance, security, and efficiency for service providers. We're not just meeting edge AI demands; we're empowering businesses to leverage AI to maintain a competitive edge in our connected world."

The offering is aimed at service providers struggling to scale AI applications across distributed networks, particularly where existing systems lack the necessary processing capacity. Integrating F5 CNFs with NVIDIA BlueField-3 DPUs into edge infrastructure can improve computing efficiency, reduce power consumption, and lower operational costs.

The edge deployment strategy is designed to place applications nearer to users and their data, improving data sovereignty and user experience while reducing associated costs. It also targets the low-latency requirements of AI applications such as autonomous vehicles, fraud detection, real-time user interactions including NLP tools, AR/VR experiences, and continuous monitoring in sectors like healthcare and manufacturing.
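The latency argument can be made concrete: end-to-end response time is roughly one network round trip plus inference time, so moving inference closer to users shrinks the portion the model itself cannot. The figures in the short Python sketch below are illustrative assumptions, not measured values from F5 or NVIDIA.

    # Rough end-to-end latency comparison for a single inference request.
    # All numbers are illustrative assumptions, not vendor benchmarks.

    def end_to_end_latency_ms(network_rtt_ms: float, inference_ms: float) -> float:
        # Simplified model: one network round trip plus model execution time.
        return network_rtt_ms + inference_ms

    edge_rtt_ms = 5.0            # assumed round trip to a regional edge site
    central_cloud_rtt_ms = 60.0  # assumed round trip to a distant central cloud
    inference_ms = 20.0          # assumed model execution time, identical in both cases

    print(f"Edge:    {end_to_end_latency_ms(edge_rtt_ms, inference_ms):.0f} ms")
    print(f"Central: {end_to_end_latency_ms(central_cloud_rtt_ms, inference_ms):.0f} ms")

Under these assumptions the edge path completes in roughly 25 ms against about 80 ms for the centralised path, which is the kind of margin that matters for interactive and safety-critical workloads.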

Building on prior developments, F5 continues to leverage the NVIDIA DOCA software framework, facilitating the integration of its solutions with BlueField DPUs. This framework supports high performance and seamless integration across different networking and security tasks while maintaining compatibility across evolving generations of DPUs.

AI-RAN, or Artificial Intelligence Radio Access Network, is seen as a transformative approach to mobile network infrastructure. The collaboration between NVIDIA and F5 seeks to accelerate AI-RAN deployments, enabling enhanced traffic management and supporting emerging AI services alongside traditional RAN workloads.

Ahmed Guetari, VP and GM, Service Provider at F5, remarked on the growing interest in edge infrastructures: "Customers are seeking cost-effective ways to bring the benefits of unified application delivery and security to emerging AI infrastructures, driving continued collaboration between F5 and NVIDIA. In particular, service providers see the edge as an area of rising interest, in that data ingest and inferencing no longer must take place at a centralised location or cloud environment, opening up myriad options to add intelligence and automation capabilities to networks while enhancing performance for users."

F5's BIG-IP Next CNFs, enhanced by NVIDIA BlueField-3 DPUs, are expected to be available in June 2025. This collaboration underscores a broader trend of integrating advanced computing capabilities at the network edge to meet the increasing demands for low latency and high-efficiency AI applications.
