F5 has introduced a solution that integrates its proven network infrastructure capabilities with NVIDIA BlueField-3 DPUs to optimize performance in Kubernetes environments and support emerging edge AI applications. This innovation delivers key networking and security functions, such as edge firewall, DNS, and DDoS protection, as lightweight cloud-native functions, significantly enhancing the scalability and efficiency of AI-driven networks.
F5’s Application Delivery and Security Platform is already a critical component in many Tier-1 5G, mobile, and fixed-line telecom networks worldwide. Service providers are increasingly facing challenges in scaling AI applications across distributed infrastructures, as legacy network cores often lack the processing power needed for real-time AI inferencing. By leveraging F5’s Cloud-Native Network Functions (CNFs) on NVIDIA DPUs, service providers can overcome these limitations, embedding networking and security capabilities into edge and far-edge environments while optimizing computing resources, reducing power consumption per gigabit per second, and lowering overall operating expenses.
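To illustrate the power-per-gigabit framing used above, the sketch below compares watts per Gbps for CPU-only versus DPU-offloaded packet processing. All figures are hypothetical placeholders chosen only to show how the metric works; they are not F5 or NVIDIA benchmarks.

```python
# Hypothetical illustration of the "power per Gbps" comparison.
# None of these figures are vendor benchmarks; they only show how the metric is computed.

def watts_per_gbps(total_watts: float, throughput_gbps: float) -> float:
    """Power efficiency expressed as watts consumed per Gbps of traffic processed."""
    return total_watts / throughput_gbps

# CPU-only packet processing: assume 8 host cores at ~15 W each sustain 40 Gbps.
cpu_only = watts_per_gbps(total_watts=8 * 15, throughput_gbps=40)

# DPU-offloaded processing: assume a ~75 W BlueField-3-class card sustains 200 Gbps,
# plus one host core (~15 W) left for control-plane work.
dpu_offload = watts_per_gbps(total_watts=75 + 15, throughput_gbps=200)

print(f"CPU-only:      {cpu_only:.2f} W/Gbps")
print(f"DPU-offloaded: {dpu_offload:.2f} W/Gbps")
```

Under these illustrative assumptions, the CPU-only path works out to 3.0 W/Gbps versus 0.45 W/Gbps with the offload, which is the kind of per-gigabit efficiency gain the vendors are targeting.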
The growing adoption of AI at the edge also introduces heightened security requirements, which F5 and NVIDIA BlueField technologies address through advanced traffic management and low-latency processing. Deploying CNFs at the network edge enables applications to operate closer to users and their data, enhancing data sovereignty, improving user experiences, and cutting costs associated with power, space, and cooling. Low-latency performance is especially critical for AI-driven applications such as autonomous vehicles, fraud detection, real-time natural language processing (NLP), augmented and virtual reality (AR/VR) experiences, continuous monitoring in healthcare, and industrial automation in manufacturing.
This new capability builds upon F5’s existing BIG-IP Next for Kubernetes deployment on NVIDIA DPUs. The collaboration continues to leverage the NVIDIA DOCA software framework, which provides F5 with a robust set of APIs, libraries, and tools to integrate its networking and security solutions with NVIDIA’s hardware acceleration technologies. The use of DOCA ensures high performance across networking and security offloads while maintaining compatibility across different generations of BlueField DPUs. Accelerating F5 CNFs with NVIDIA BlueField-3 frees host CPU resources to run additional applications, improving overall system efficiency.
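As a rough sketch of how a DPU-accelerated CNF could be scheduled in a Kubernetes cluster, the example below pins a Deployment to BlueField-equipped nodes and requests the DPU as an extended resource using the official Kubernetes Python client. The node label, extended resource name, container image, and namespace are hypothetical placeholders, not F5 or NVIDIA identifiers.

```python
# Minimal sketch using the official Kubernetes Python client.
# The label "example.com/dpu=bluefield-3", the extended resource "example.com/bf3-dpu",
# and the image name are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() when running inside the cluster

container = client.V1Container(
    name="edge-firewall-cnf",
    image="registry.example.com/cnf/edge-firewall:latest",  # placeholder image
    resources=client.V1ResourceRequirements(
        # Request the DPU through a device-plugin-advertised extended resource.
        limits={"example.com/bf3-dpu": "1", "cpu": "2", "memory": "2Gi"},
    ),
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="edge-firewall-cnf"),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "edge-firewall-cnf"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "edge-firewall-cnf"}),
            spec=client.V1PodSpec(
                # Pin the CNF to nodes that carry a BlueField-3 DPU.
                node_selector={"example.com/dpu": "bluefield-3"},
                containers=[container],
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="cnf", body=deployment)
```

In practice, an extended resource like this would be advertised by a device plugin running on the DPU-equipped nodes, so the scheduler only places the CNF where the hardware is actually present.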
The expansion of edge computing also presents significant opportunities for service providers, including distributed N6-LAN services for User Plane Functions (UPFs) and enhanced edge security supporting Distributed Access Architecture (DAA) and Private 5G networks. AI-RAN, a rapidly advancing concept in the telecommunications industry, aims to transform mobile networks into multi-purpose infrastructures that maximize resource utilization, create new revenue streams through hosted AI services, and improve cost efficiency.
By integrating F5’s BIG-IP Next CNFs with NVIDIA BlueField-3 DPUs, mobile providers can accelerate AI-RAN deployments with streamlined traffic management for both AI and Radio Access Network (RAN) workloads. This solution provides advanced firewall and DDoS protection, ensures multi-tenancy and workload isolation, and allows mobile operators to leverage existing RAN infrastructure for AI applications. By repurposing network resources to power AI workloads alongside traditional RAN services, service providers can achieve significant cost savings while introducing new AI-driven offerings for their customers.
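As one illustration of how tenant isolation between AI and RAN workloads might be expressed at the Kubernetes layer, the sketch below applies a namespace-scoped ingress policy to a hypothetical AI tenant namespace. The namespace name and labels are placeholders, and any DPU-enforced isolation described above would operate beneath this layer.

```python
# Minimal sketch of namespace-level isolation for an AI tenant, using the
# Kubernetes Python client. Namespace names and labels are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()

# Restrict ingress for every pod in the "ai-workloads" namespace so that only
# traffic originating from pods in the same namespace is allowed.
policy = client.V1NetworkPolicy(
    metadata=client.V1ObjectMeta(name="ai-tenant-isolation", namespace="ai-workloads"),
    spec=client.V1NetworkPolicySpec(
        pod_selector=client.V1LabelSelector(),  # empty selector: applies to all pods
        policy_types=["Ingress"],
        ingress=[
            client.V1NetworkPolicyIngressRule(
                _from=[
                    client.V1NetworkPolicyPeer(
                        namespace_selector=client.V1LabelSelector(
                            match_labels={"kubernetes.io/metadata.name": "ai-workloads"}
                        )
                    )
                ]
            )
        ],
    ),
)

client.NetworkingV1Api().create_namespaced_network_policy(
    namespace="ai-workloads", body=policy
)
```

A matching policy in the RAN namespace would keep the two tenants from reaching each other over the cluster network, complementing whatever isolation the DPU enforces in the data path.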
Ahmed Guetari, Vice President and General Manager of Service Provider at F5, emphasized that enterprises and service providers are increasingly looking for cost-effective ways to integrate application delivery and security within AI infrastructures. He highlighted that the edge is becoming a focal point for innovation, as AI inferencing and data processing no longer need to be centralized, opening up vast possibilities for enhancing intelligence and automation within networks.
The general availability of F5 BIG-IP Next Cloud-Native Network Functions on NVIDIA BlueField-3 DPUs is expected in June 2025. For further details, interested parties can visit F5 at the NVIDIA GTC event, taking place from March 17–21 in San Jose, California, or explore additional insights through F5’s companion blog and direct consultations.