KServe Providers Offering NIM Inference in Clouds and Data Centers
From NVIDIA: 2024-06-02 09:14:59
The integration of NVIDIA NIM with KServe makes deploying generative AI in the enterprise easier than ever. NIM inference microservices are available on platforms from companies including Canonical, Nutanix, and Red Hat. KServe, an open-source extension of Kubernetes, runs AI inference efficiently and supports popular AI frameworks, while NIM exposes models through simple API calls with optimized performance.
KServe, which began as part of the Kubeflow project, automates AI model deployment on Kubernetes and has been adopted by companies such as AWS, Cisco, and IBM. It runs AI inference with the resilience and elasticity of a cloud application, and features such as canary rollouts and GPU autoscaling make deployments both safe and efficient.
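To make the deployment model concrete, a KServe InferenceService is declared as a Kubernetes custom resource. The sketch below builds a minimal manifest as a plain Python dict; the service name, container image, and traffic split are hypothetical placeholders, and applying the manifest would require a cluster with KServe installed.

```python
# Minimal sketch of a KServe InferenceService manifest, built as a Python dict.
# The name, image, and canary percentage are illustrative placeholders; in
# practice the manifest would be applied with kubectl or a Kubernetes client.

def make_inference_service(name: str, image: str, canary_percent: int) -> dict:
    """Return an InferenceService custom resource as a plain dict."""
    return {
        "apiVersion": "serving.kserve.io/v1beta1",
        "kind": "InferenceService",
        "metadata": {"name": name},
        "spec": {
            "predictor": {
                # canaryTrafficPercent routes this share of traffic to the
                # newest revision, which is how KServe performs canary rollouts.
                "canaryTrafficPercent": canary_percent,
                "containers": [
                    {"name": "kserve-container", "image": image},
                ],
            }
        },
    }

svc = make_inference_service("nim-llm-demo", "nvcr.io/nim/example:latest", 10)
print(svc["spec"]["predictor"]["canaryTrafficPercent"])  # 10
```

Here `canaryTrafficPercent` is the KServe predictor field that shifts a fraction of traffic to the latest model revision, so a new model version can be validated on a small slice of requests before a full rollout.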
Integrating NIM with KServe gives users access to it on enterprise platforms such as Canonical's Charmed Kubeflow and Red Hat OpenShift AI. Red Hat, Nutanix, and Canonical have all announced support for NIM on their platforms, and many other software providers benefit from including KServe in their offerings.
Continuing its track record of open-source contribution, NVIDIA plans to remain an active contributor to KServe and the surrounding community. With KServe, a single AI model can be scaled across many GPUs, and the NIM API is available in the NVIDIA API Catalog for deploying generative AI models.
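Once a NIM microservice is running behind KServe, applications reach it through standard HTTP calls; NIM exposes an OpenAI-compatible chat completions route. The sketch below assembles such a request using only the standard library; the base URL and model name are hypothetical placeholders for a deployed endpoint.

```python
# Sketch of preparing a call to a deployed NIM endpoint, assuming it exposes
# the OpenAI-compatible /v1/chat/completions route that NIM microservices
# provide. The base URL and model name below are illustrative placeholders.
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Assemble the HTTP request for a chat completion without sending it."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("http://nim-llm-demo.example.local", "example/llm", "Hello")
print(req.full_url)  # http://nim-llm-demo.example.local/v1/chat/completions
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) would return the model's completion; because the route follows the OpenAI API convention, existing client libraries can point at the endpoint with only a base-URL change.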
Read more at NVIDIA: KServe Providers Offering NIM Inference in Clouds and Data Centers