F5 has introduced a cutting-edge solution that integrates its well-established network infrastructure capabilities—such as edge firewall, DNS, and DDoS protection—into lightweight, cloud-native functions. These functions are accelerated by NVIDIA BlueField-3 Data Processing Units (DPUs) to enhance performance in Kubernetes environments and support emerging edge AI use cases. This integration marks a significant step forward in optimizing computing resources, reducing operational costs, and enhancing security for modern network infrastructures.

The F5 Application Delivery and Security Platform plays a critical role in global telecommunications, powering a majority of the world’s Tier-1 5G, mobile, and fixed-line telco networks. Service providers face increasing challenges in scaling AI applications across distributed environments, as legacy network infrastructures often lack the processing power required to make AI inferencing practical. Deploying F5 cloud-native functions (CNFs) on NVIDIA DPUs addresses these limitations, allowing edge and far-edge infrastructure to be leveraged for optimized computing performance. This approach significantly reduces power consumption per gigabit per second while also lowering overall operating expenses.
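As a rough illustration of the watts-per-Gbps metric referenced above, the comparison below sketches how offloading network functions to a DPU changes the energy cost of throughput. All figures are hypothetical assumptions for illustration, not F5 or NVIDIA benchmarks:

```python
# Hypothetical comparison of power efficiency (watts per Gbps) for a
# CPU-only CNF deployment vs. a DPU-accelerated one. All numbers are
# illustrative assumptions, not vendor benchmarks.

def watts_per_gbps(power_watts: float, throughput_gbps: float) -> float:
    """Energy cost of pushing one gigabit per second through the function."""
    return power_watts / throughput_gbps

# Assumed figures for a single edge node running firewall/DNS/DDoS functions.
cpu_only = watts_per_gbps(power_watts=400.0, throughput_gbps=40.0)
dpu_offload = watts_per_gbps(power_watts=150.0, throughput_gbps=75.0)

reduction = 1 - dpu_offload / cpu_only
print(f"CPU-only:      {cpu_only:.1f} W/Gbps")
print(f"DPU-offloaded: {dpu_offload:.1f} W/Gbps")
print(f"Reduction:     {reduction:.0%}")
```

Under these assumed numbers the DPU path lands at 2.0 W/Gbps versus 10.0 W/Gbps, an 80% reduction; the real ratio depends entirely on the workload and hardware.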

With the rise of AI-driven applications and functionalities, security has become a crucial concern for service providers. The integration of F5 and NVIDIA BlueField-3 technologies provides robust security measures alongside advanced traffic management, ensuring minimal latency while delivering efficient AI-powered services. Deploying CNFs at the edge brings applications closer to users and their data, reinforcing data sovereignty, improving user experience, and reducing costs associated with power, space, and cooling. Low latency is a key requirement for AI applications, particularly for use cases that demand real-time decision-making. These include autonomous vehicles and fraud detection, interactive AI-driven user experiences such as natural language processing (NLP) tools and augmented reality/virtual reality (AR/VR) applications, as well as continuous monitoring and response mechanisms necessary for healthcare devices and industrial automation.
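To see why moving inference to the edge matters for the real-time use cases above, a back-of-the-envelope propagation estimate is useful. The distances and fiber speed below are illustrative assumptions, and the calculation ignores queuing and processing time:

```python
# Back-of-the-envelope round-trip propagation delay: user to a nearby edge
# site vs. user to a distant cloud region. Distances are assumed for
# illustration only.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # light travels roughly 200 km per ms in fiber

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time; excludes queuing and processing."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

edge_rtt = round_trip_ms(50.0)     # assumed nearby far-edge site
cloud_rtt = round_trip_ms(2000.0)  # assumed distant cloud region

print(f"Edge site RTT:    {edge_rtt:.1f} ms")
print(f"Cloud region RTT: {cloud_rtt:.1f} ms")
```

Even before any compute time, the assumed distant region costs tens of milliseconds per round trip, which is why latency-sensitive workloads such as AR/VR or industrial control favor inference close to the user.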

The inclusion of CNFs on BlueField-3 DPUs builds on F5’s earlier innovations, such as BIG-IP Next for Kubernetes, which was previously deployed on NVIDIA DPUs. F5 continues to leverage the NVIDIA DOCA software framework to integrate its networking and security solutions with BlueField DPUs. DOCA provides a robust set of APIs, libraries, and development tools that enable F5 to maximize hardware acceleration while maintaining compatibility across multiple generations of BlueField DPUs. This acceleration frees up CPU resources, allowing businesses to allocate processing power more efficiently to other critical applications.

Edge deployments present numerous opportunities for service providers. These include distributed N6-LAN capabilities for User Plane Functions (UPFs) and enhanced edge security services that support Distributed Access Architecture (DAA) and Private 5G networks. AI-powered Radio Access Networks (AI-RAN) are also gaining momentum, as demonstrated by SoftBank’s recent showcase of its production environment using NVIDIA’s technology.

The collaboration between NVIDIA and F5 aims to unlock the full potential of AI-RAN, a transformative approach that turns mobile networks into multi-purpose infrastructures. AI-RAN maximizes resource utilization, creates new revenue streams through hosted AI services, and improves cost efficiency by enabling mobile providers to support distributed AI computing. By deploying BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs, AI-RAN implementations gain enhanced traffic management capabilities for both AI and RAN workloads while benefiting from robust firewall and DDoS protection. Multi-tenancy and tenant isolation are natively integrated, ensuring secure and efficient workload distribution across various AI and RAN functions. With this solution, mobile providers can intelligently repurpose their existing RAN compute infrastructure to power AI-driven offerings alongside traditional network services. This approach significantly reduces costs while unlocking new revenue opportunities through enhanced AI-powered user experiences.

As demand for unified application delivery and security in AI-driven infrastructures continues to grow, F5 and NVIDIA remain at the forefront of innovation. According to Ahmed Guetari, Vice President and General Manager of Service Provider at F5, customers are increasingly seeking cost-effective ways to integrate AI into their networks. The edge is emerging as a crucial area of focus, as AI inferencing no longer needs to be centralized in data centers or cloud environments. Instead, intelligent and automated capabilities can now be embedded directly into the network, enhancing performance while reducing latency.

The general availability of F5 BIG-IP Next Cloud-Native Network Functions on NVIDIA BlueField-3 DPUs is expected in June 2025. Organizations interested in exploring this advanced solution can learn more by visiting F5 at the NVIDIA GTC event, taking place from March 17–21 in San Jose, California. Additional insights are available in F5’s companion blog and through direct inquiries with F5 representatives.