NVIDIA’s First SLM Helps Bring Digital Humans to Life
From NVIDIA: 2024-08-21 09:00:57
At Gamescom, NVIDIA announced that ACE now includes Nemotron-4 4B Instruct, an on-device small language model (SLM) that improves digital human interactions for game characters on GeForce RTX PCs and NVIDIA RTX workstations. The SLM is optimized for a lower memory footprint and faster response times, making conversations with digital humans feel more natural.
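For a sense of what driving an NPC with an on-device instruct SLM looks like in practice, here is a minimal sketch using the Hugging Face Transformers library. The model ID is an assumption; substitute whichever checkpoint NVIDIA publishes for Nemotron-4 4B Instruct:

```python
# Minimal sketch: run a small instruct model locally and ask it to
# role-play an NPC. The model ID below is an assumed placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "nvidia/Nemotron-Mini-4B-Instruct"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
)

# Role-play prompt for an in-game character, using the model's chat template.
messages = [
    {"role": "system", "content": "You are Jin, a ramen-shop owner in a cyberpunk city."},
    {"role": "user", "content": "What's good here?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```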
ACE NIM microservices let developers deploy generative AI models through the cloud or on RTX AI PCs and workstations to power real-time NPC interactions in games. NVIDIA Riva ASR transcribes spoken language accurately, and together with Riva's multilingual speech and translation microservices it supports full conversational AI pipelines.
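As a sketch of the speech-recognition step, the snippet below transcribes a recorded clip with the nvidia-riva-client Python package. It assumes a Riva ASR server is already running at localhost:50051 and that the input is a 16 kHz mono WAV file; adjust both for your deployment:

```python
# Minimal sketch: offline transcription against a running Riva ASR server.
import riva.client

auth = riva.client.Auth(uri="localhost:50051")  # assumed local Riva server
asr = riva.client.ASRService(auth)

config = riva.client.RecognitionConfig(
    encoding=riva.client.AudioEncoding.LINEAR_PCM,
    sample_rate_hertz=16000,       # assumed 16 kHz mono input
    language_code="en-US",
    max_alternatives=1,
    enable_automatic_punctuation=True,
    audio_channel_count=1,
)

with open("player_speech.wav", "rb") as f:
    audio_bytes = f.read()

response = asr.offline_recognize(audio_bytes, config)
for result in response.results:
    print(result.alternatives[0].transcript)
```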
NVIDIA Audio2Face (A2F) generates facial expressions synced to dialogue in multiple languages, giving digital avatars dynamic, realistic emotion. The AI network animates the face, eyes, mouth, and head based on emotional range and intensity, inferring the emotion automatically from the audio clip. A2F can run live or be baked into animation during post-processing.
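Since the actual A2F interfaces are not shown in the announcement, the sketch below only mimics the shape of that pipeline: audio in, inferred emotion, per-frame blendshape weights out. Both helper functions are hypothetical stand-ins for the trained networks, not the real A2F API:

```python
# Purely illustrative sketch of an Audio2Face-style data flow.
# infer_emotion() and audio_to_blendshapes() are hypothetical placeholders.
import numpy as np

FPS = 30  # animation frame rate

def infer_emotion(audio: np.ndarray, sample_rate: int) -> dict:
    # Hypothetical: A2F infers emotion from the audio itself; here we just
    # map loudness to an intensity value as a placeholder.
    intensity = min(1.0, float(np.sqrt(np.mean(audio ** 2))) * 10)
    return {"joy": intensity, "neutral": 1.0 - intensity}

def audio_to_blendshapes(audio: np.ndarray, sample_rate: int,
                         emotion: dict) -> np.ndarray:
    # Hypothetical: one blendshape-weight vector per animation frame,
    # covering jaw, lip, eye, and head-pose channels (52 assumed here).
    n_frames = int(len(audio) / sample_rate * FPS)
    return np.zeros((n_frames, 52))

# Live use streams chunks through this pipeline; baked use runs the whole
# clip once and exports the resulting animation curves.
sample_rate = 16_000
audio = np.random.randn(sample_rate * 2).astype(np.float32) * 0.05  # 2 s stand-in
emotion = infer_emotion(audio, sample_rate)
frames = audio_to_blendshapes(audio, sample_rate, emotion)
print(f"emotion={emotion}, animation frames={frames.shape[0]}")
```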
ACE NIM microservices run AI model inference locally on RTX AI PCs and workstations or in the cloud, with hybrid inference handled by the NVIDIA AI Inference Manager (AIM) SDK. The SDK streamlines AI model deployment in PC applications, orchestrating inference seamlessly across device and cloud.
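The core decision in hybrid inference can be pictured as a device-first routing policy: run locally when the GPU has headroom, otherwise fall back to a cloud endpoint. The sketch below illustrates that pattern; the function names, VRAM threshold, and endpoint URL are assumptions, not the AIM SDK's actual interface:

```python
# Hypothetical sketch of a hybrid-inference routing decision.
# Threshold, names, and URL are illustrative assumptions.
import torch

LOCAL_VRAM_REQUIRED_GB = 4.0  # rough on-device footprint of a 4B SLM (assumed)
CLOUD_ENDPOINT = "https://example.invalid/v1/chat"  # placeholder URL

def free_vram_gb(device: int = 0) -> float:
    # Report free VRAM on the given GPU, or 0 if no CUDA device is present.
    if not torch.cuda.is_available():
        return 0.0
    free_bytes, _total = torch.cuda.mem_get_info(device)
    return free_bytes / 1e9

def choose_backend() -> str:
    # Device-first policy: prefer local inference for latency, fall back
    # to the cloud when the model will not fit alongside the game.
    return "local" if free_vram_gb() >= LOCAL_VRAM_REQUIRED_GB else "cloud"

backend = choose_backend()
print(f"Routing inference to: {backend}"
      + ("" if backend == "local" else f" ({CLOUD_ENDPOINT})"))
```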
Digital humans like “James” go beyond in-game NPCs, connecting with people through emotion and humor. As the ways people communicate with computers have evolved, digital humans have emerged to transform human-computer interaction in industries well beyond gaming. Users can interact with James in real time at ai.nvidia.com for a glimpse of the future.
Read more at NVIDIA: NVIDIA’s First SLM Helps Bring Digital Humans to Life