Nvidia enhances digital human realism with new Unreal Engine 5 plugins: Nvidia has released a suite of tools and plugins aimed at improving the creation and deployment of AI-powered digital human characters within Unreal Engine 5, marking a significant advancement in game development and interactive applications.
Key features and improvements:
- Nvidia introduced on-device plugins for Nvidia ACE, its suite of digital human technologies, to make it easier to develop AI-powered MetaHuman characters on Windows PCs.
- The company unveiled an Audio2Face-3D plugin for Autodesk Maya that generates AI-driven facial animation, synchronizing lip and facial movement with speech audio.
- A new Unreal Engine 5 renderer microservice built on Epic’s Unreal Pixel Streaming technology was announced; it supports the Nvidia ACE Animation Graph microservice and Linux in early access.
Expanded capabilities for MetaHuman creation:
- The Nvidia ACE Unreal Engine 5 sample project now includes additional on-device ACE plugins:
  - Audio2Face-3D for lip sync and facial animation
  - Nemotron-Mini 4B Instruct for response generation
  - Retrieval-augmented generation (RAG) for contextual information
- These tools allow developers to build databases with contextual lore, generate relevant responses with low latency, and drive corresponding MetaHuman facial animations seamlessly within Unreal Engine 5.
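To make the flow between these pieces concrete, the sketch below mirrors the pipeline the sample project describes: retrieve lore context (the RAG step), generate a reply (the role Nemotron-Mini 4B Instruct plays), then hand the reply to a lip-sync stage (the role of Audio2Face-3D). It is a minimal conceptual sketch, not the ACE plugin API: the retriever is a toy keyword-overlap ranker, and `generate_response` and `drive_facial_animation` are hypothetical placeholders.

```python
# Conceptual sketch of the on-device pipeline the ACE sample project describes:
# retrieve lore context (RAG), generate a reply (Nemotron-Mini 4B Instruct's role),
# then drive lip-synced facial animation (Audio2Face-3D's role).
# The retrieval step is a toy keyword-overlap ranker; the two "placeholder"
# functions are hypothetical stand-ins, not the actual ACE plugin APIs.

LORE_DB = [
    "The blacksmith Jin forged the city's gates a century ago.",
    "The night market opens only when both moons are full.",
    "Travelers must register with the harbor master before docking.",
]

def retrieve_context(query: str, top_k: int = 2) -> list[str]:
    """Rank lore entries by naive keyword overlap with the player's query."""
    query_words = set(query.lower().split())
    scored = [(len(query_words & set(doc.lower().split())), doc) for doc in LORE_DB]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for score, doc in scored[:top_k] if score > 0]

def generate_response(query: str, context: list[str]) -> str:
    """Placeholder for the on-device language model response step."""
    lore = " ".join(context) if context else "I don't recall anything about that."
    return f"As the townsfolk tell it: {lore}"

def drive_facial_animation(text: str) -> None:
    """Placeholder for text-to-speech plus lip-synced MetaHuman facial animation."""
    print(f"[animating MetaHuman while speaking] {text}")

if __name__ == "__main__":
    player_query = "When does the night market open?"
    context = retrieve_context(player_query)
    reply = generate_response(player_query, context)
    drive_facial_animation(reply)
```

In the actual sample project these stages run as on-device plugins inside Unreal Engine 5 rather than as standalone Python functions; the sketch only shows how the retrieval, generation, and animation steps hand data to one another.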
Autodesk Maya integration:
- The new Audio2Face-3D plugin for Autodesk Maya offers game developers and technical artists high-performance, audio-driven facial animation tools.
- It simplifies the process of generating high-quality, audio-driven facial animations for any character.
- The plugin provides a streamlined user interface and seamless transition to the Unreal Engine 5 environment.
- The plugin's source code and scripts are highly customizable and can be adapted for use in other digital content creation tools.
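Because the plugin's scripts are meant to be adapted, one natural customization point is how generated animation data gets applied to a rig. The snippet below is an illustrative sketch only, not the plugin's own code: it keys hypothetical per-frame blendshape weights (the kind of data an audio-driven facial animation tool produces) onto a character using the standard `maya.cmds` API. The blendShape node name `faceShapes`, the target names, and the sample weights are all placeholders.

```python
# Illustrative only: keys per-frame blendshape weights (such as those an
# Audio2Face-style tool derives from audio) onto a Maya character.
# "faceShapes" and the target names below are hypothetical; substitute the
# blendShape node and targets from your own rig.
import maya.cmds as cmds

# frame -> {blendshape target: weight}, e.g. exported alongside the audio clip
animation_frames = {
    1:  {"jawOpen": 0.00, "mouthSmile": 0.10},
    5:  {"jawOpen": 0.65, "mouthSmile": 0.05},
    10: {"jawOpen": 0.20, "mouthSmile": 0.30},
}

def apply_face_animation(blendshape_node: str, frames: dict) -> None:
    """Set a keyframe on each blendshape target weight at each frame."""
    for frame, weights in sorted(frames.items()):
        for target, weight in weights.items():
            cmds.setKeyframe(blendshape_node, attribute=target, value=weight, time=frame)

apply_face_animation("faceShapes", animation_frames)
```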
Cloud deployment and streaming advancements:
- The latest Unreal Engine 5 renderer microservice in Nvidia ACE now supports the ACE Animation Graph microservice and Linux in early access.
- This addition enables developers to run MetaHuman characters on cloud servers and stream rendered frames and audio to any browser or edge device using Web Real-Time Communication (WebRTC).
- The Animation Graph microservice facilitates the creation of animation state machines and blend trees, offering a flexible node-based system for animation blending, playback, and control.
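As a rough illustration of what a node-based animation graph involves, the sketch below implements a pose-blending node and a small state machine in plain Python. It is a framework-agnostic toy, not the Animation Graph microservice's actual interface; poses are reduced to simple joint-value dictionaries so the blending and state-switching logic stays visible.

```python
# Minimal, framework-agnostic illustration of the two building blocks the
# Animation Graph microservice is described as providing: blend nodes that mix
# poses, and a state machine that decides which subtree drives the character.
# Poses are reduced to {joint: value} dicts; none of this is Nvidia's actual API.

def blend_poses(pose_a: dict, pose_b: dict, weight: float) -> dict:
    """Linear blend of two poses; weight=0 returns pose_a, weight=1 returns pose_b."""
    return {joint: (1.0 - weight) * pose_a[joint] + weight * pose_b[joint]
            for joint in pose_a}

class AnimationStateMachine:
    """Switches between named states, each producing a pose for a given time."""
    def __init__(self, states: dict, initial: str):
        self.states = states          # name -> callable(time) -> pose
        self.current = initial

    def transition(self, new_state: str) -> None:
        if new_state in self.states:
            self.current = new_state

    def evaluate(self, time: float) -> dict:
        return self.states[self.current](time)

# Two trivial "clips" plus a half-listening state built from a blend node.
idle = lambda t: {"jaw": 0.0, "head_yaw": 0.0}
talk = lambda t: {"jaw": 0.8, "head_yaw": 0.1}
machine = AnimationStateMachine(
    {"idle": idle, "talk": talk,
     "listen": lambda t: blend_poses(idle(t), talk(t), 0.25)},
    initial="idle",
)
machine.transition("talk")
print(machine.evaluate(time=0.5))   # -> {'jaw': 0.8, 'head_yaw': 0.1}
```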
Availability and access:
- Developers can access tutorials for setting up and using the Unreal Engine 5 plugin.
- The Maya ACE plugin is available for download on GitHub.
- Early access applications are open for the Unreal Engine 5 renderer microservice with support for the Animation Graph microservice and Linux OS.
Broader implications: These advancements by Nvidia represent a significant step forward in the creation and deployment of realistic digital humans in games and interactive applications. By streamlining the development process and improving the quality of AI-powered characters, Nvidia is enabling developers to create more immersive and engaging experiences across a wide range of platforms and devices. This technology has the potential to transform not only gaming but also virtual assistants, educational tools, and other interactive digital experiences.