At Unreal Fest Seattle 2024, Nvidia announced the release of new plugins designed to allow developers to more easily and efficiently create and deploy AI-powered MetaHuman characters directly in Unreal Engine 5 for game development on Windows PCs.
The graphics card maker has launched on-device plugins for Nvidia ACE in Unreal Engine 5. There’s also a new Audio2Face-3D Plugin for Autodesk Maya, plus a new Unreal Engine 5 Renderer Microservice with Pixel Streaming.
Getting Started with the Unreal Sample Project for NVIDIA ACE – YouTube
There are three new Unreal Engine 5 ACE plugins: the Audio2Face-3D Plugin, which enables AI-powered lip sync and facial animation directly in Unreal Engine 5; the Nemotron-Mini 4B Instruct Plugin, which generates responses for interactive character dialogue; and the Retrieval-Augmented Generation (RAG) Plugin, which supplies contextual information to enhance character interactions.
There is also the Unreal Engine 5 Renderer Microservice with Pixel Streaming, which uses Epic's Unreal Pixel Streaming technology and, in early access, supports the ACE Animation Graph microservice and Linux, allowing developers to stream high-fidelity MetaHuman characters to any device via Web Real-Time Communication (WebRTC).
The new tool aims to simplify the integration of AI-powered digital humans into games and applications and to optimise microservices for low latency and minimal memory usage on Windows PCs. It also enhances scalability for cloud-based deployment of digital humans and provides customisable tools and source code for tailored development needs. Developers will be able to apply for early access to download it.
For Autodesk Maya, the new Audio2Face-3D Plugin facilitates high-quality, audio-driven facial animation, provides source code for customisation and integration with other digital content creation tools, and streamlines the workflow between Maya and Unreal Engine 5. The Maya ACE plugin is available to download on GitHub.