Technology
F5 Accelerates AI at the Edge with NVIDIA BlueField-3 DPUs to Boost Network Efficiency
F5 (NASDAQ: FFIV) has announced the deployment of BIG-IP Next Cloud-Native Network Functions (CNFs) on NVIDIA BlueField-3 DPUs (Data Processing Units), further enhancing its capability to deliver optimized network performance, advanced security, and edge AI innovations. This latest development strengthens F5's strategic collaboration with NVIDIA, providing service providers with a cost-effective and scalable solution to accelerate AI deployment at the network edge.
The collaboration pairs F5’s network infrastructure capabilities, such as edge firewalls, DDoS protection, and DNS management, with the high-performance computing power of NVIDIA BlueField-3 DPUs. The CNFs give service providers a transformative approach to optimizing computing resources, reducing operating costs, and enhancing AI inferencing performance in edge environments.
Driving AI Capabilities at the Edge
With the growing demand for AI-powered applications such as autonomous vehicles, real-time user interactions, and continuous device monitoring, service providers face increasing challenges in managing data processing across distributed networks. Traditional infrastructure often struggles to provide sufficient processing power for AI inferencing, creating a gap in delivering low-latency services.
By deploying F5 CNFs on NVIDIA BlueField-3 DPUs, service providers can now leverage hardware-accelerated performance, reducing power consumption per Gbps and lowering operational costs. This approach moves data processing, AI workloads, and network traffic management closer to end users, improving user experiences and driving operational efficiency.
Ahmed Guetari, Vice President and General Manager of Service Provider at F5, emphasized the importance of this collaboration, stating:
“Customers are seeking cost-effective ways to bring the benefits of unified application delivery and security to emerging AI infrastructures, driving continued collaboration between F5 and NVIDIA. Service providers now have an opportunity to transform their networks by shifting AI inferencing to the edge, ensuring seamless user experiences and driving higher revenue potential.”
Transforming Network Efficiency with Edge Computing
The deployment of F5’s BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs will empower service providers to optimize their RAN (Radio Access Network) infrastructure through AI-RAN (Artificial Intelligence Radio Access Network) capabilities. This shift allows providers to use existing RAN infrastructure to power both AI services and traditional network functions, reducing infrastructure costs while maximizing resource utilization.
Some of the key benefits for service providers include:
- Real-time decision-making for autonomous vehicles and fraud detection.
- Enhanced user experiences in AR/VR applications and NLP (Natural Language Processing) tools.
- Continuous monitoring for healthcare devices, manufacturing robotics, and other IoT devices.
Leveraging NVIDIA DOCA for Seamless Integration
F5 has further strengthened its solution by utilizing NVIDIA DOCA (Data Center Infrastructure-on-a-Chip Architecture), a software framework that provides hardware acceleration for networking, security, and AI inference workloads. Through DOCA, F5 ensures that its CNFs work across multiple generations of BlueField DPUs, offering both forward and backward compatibility. The result is reduced CPU usage, freeing more of the infrastructure’s compute capacity for AI applications.
Ash Bhalgat, Senior Director of AI Networking and Security Solutions at NVIDIA, highlighted the transformative potential of this collaboration, stating:
“As demand for AI inferencing at the edge takes center stage, building an AI-ready distributed infrastructure is a key opportunity for telecom providers. By integrating F5’s cloud-native functions with NVIDIA BlueField-3 DPUs, we are enabling service providers to deliver ultra-low-latency AI services with maximum efficiency, ensuring competitive advantage in today’s connected world.”
Accelerating AI-RAN Deployments
The collaboration also enables F5 and NVIDIA to drive the growth of AI-RAN, in which traditional RAN infrastructure supports AI workloads alongside network services. This innovation allows mobile operators to:
- Reduce infrastructure costs by consolidating AI processing and RAN management.
- Unlock new revenue streams through AI-powered services.
- Ensure network security with built-in DDoS protection, firewalls, and traffic management.
As a result, service providers can maximize resource utilization and extend their infrastructure’s value by deploying AI capabilities at the edge.
Availability and Market Impact
F5 confirmed that BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs will become generally available in June 2025. Additionally, F5 will showcase its AI and edge network capabilities at NVIDIA GTC, scheduled for March 17–21 in San Jose, California. This development is expected to accelerate AI adoption and bring stronger network security and improved operational efficiency to the telecom industry.
For more information about F5’s AI-driven network functions, visit the official website or connect with F5 representatives at NVIDIA GTC.