Hands-On Review: NVIDIA GeForce RTX ACE and AI Tools Shaping the Future of Gaming

Since launching DLSS in 2018, Nvidia has steadily expanded its AI portfolio with tools like Reflex and Broadcast. DLSS initially struggled with artifacting, but successive updates have turned it into an essential feature, balancing ray-traced visual fidelity with the high frame rates gamers expect.

Looking ahead to the future of AI in video games, Nvidia appears to have numerous innovative features in development beyond DLSS and Reflex. Recently, I had the opportunity to explore and test some of these new capabilities at the GeForce Nvidia RTX AI PC Showcase.

The event, led by John Gillooly, Technical Product Marketing Manager for Asia Pacific South at Nvidia, showcased the company’s vision for not only gaming but the broader entertainment landscape, complemented by hands-on experiences with these technologies.

NVIDIA ACE: The Future of NPC Interaction

Nvidia revealed ACE (Avatar Cloud Engine) in January 2024. This suite leverages generative AI to create interactive digital characters or NPCs. Essentially, it replaces conventional dialogue trees with more fluid conversations, enabling players to ask questions of in-game characters who respond based on their lore.
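
To make that shift concrete, here is a minimal, purely illustrative Python sketch of the two approaches. None of it reflects Nvidia's actual ACE API; the character, lore text, and `local_llm` stand-in are hypothetical placeholders for whatever on-device model a game would ship.

```python
# Purely illustrative sketch -- not Nvidia's ACE API.
# A traditional dialogue tree maps a fixed set of player choices to fixed replies:
DIALOGUE_TREE = {
    "ask_about_mech": "The Falcon is a light mech built for hit-and-run strikes.",
    "ask_about_hangar": "The hangar vendor opens after your first sortie.",
}

# A generative NPC instead composes a prompt from the character's lore and the
# player's free-form question, then hands it to a language model.
LORE = (
    "You are Aria, a hangar engineer. You only know about mechs, loadouts, "
    "and hangar life. Stay in character at all times."
)

def local_llm(prompt: str) -> str:
    """Stand-in for an on-device model running on the GPU's tensor cores."""
    return "The Falcon trades armor for speed; swap its autocannon if you need range."

def npc_reply(player_question: str) -> str:
    prompt = f"{LORE}\n\nPlayer: {player_question}\nAria:"
    return local_llm(prompt)

print(npc_reply("Which mech should I take for a fast flanking run?"))
```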

During the demonstration, Nvidia presented two variations of ACE: one using the upcoming Mecha BREAK game and the other utilizing the Legends Tech demo from Perfect World.

The first demo, built on Mecha BREAK, relied on the GPU’s tensor cores to handle AI processing locally, allowing quick interactions with minimal latency, albeit with limited conversation options. I could ask the character to change my equipped mech or explain its stats, but questions beyond the game’s context usually drew the reply, “Sorry, I couldn’t understand that.”

Additionally, the demo experienced multiple crashes, raising concerns about performance, especially since it was run on RTX 4080 and 4090 systems.

The second demonstration featured NVIDIA ACE within the Legends Tech demo by Perfect World. This one employed cloud computing and ChatGPT to engage with players, ensuring the character remained consistent with their lore while still offering responses to player queries.

In this medieval fantasy setting, for instance, the character could guide players on weapon choice or usage, though they were oblivious to modern technology like cars. The responses were slightly delayed but vastly more natural compared to the previous demo.
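
Nvidia has not detailed exactly how the Legends demo wires ChatGPT in, but conceptually a cloud-backed, lore-constrained NPC looks something like the sketch below, written against the public OpenAI Python client. The model name and persona text are my assumptions for illustration, not anything confirmed by Nvidia or Perfect World.

```python
# Rough sketch of a cloud-backed, lore-constrained NPC.
# Not Perfect World's or Nvidia's implementation; model name and persona
# are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PERSONA = (
    "You are Yun Ni, a warrior in a medieval fantasy world. You know swords, "
    "bows, and local legends. You have never heard of modern technology such "
    "as cars or graphics cards; react with confusion or mild annoyance if "
    "asked about them, and always stay in character."
)

def npc_reply(player_question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; the demo's actual backend is unspecified
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": player_question},
        ],
    )
    return response.choices[0].message.content

print(npc_reply("What weapon should I bring against armored foes?"))
```

The round trip to a remote server is where the slight delay I noticed comes from, while the persona-style constraint is what keeps the character consistent with their lore.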

Yun Ni in Perfect World’s Legends got annoyed when I asked her about RTX 5090 leaks (Image via Nvidia)

After experiencing the demos, I feel cautiously optimistic. The technology is a novelty today, and questions remain about what it will cost in GPU headroom or internet bandwidth. It’s still early in its development, reminiscent of the initial reaction to ray tracing, a feature many gamers, myself included, preferred to switch off in the handful of supported titles, choosing higher FPS over better lighting.

Yet, in the past five years, ray tracing has transitioned into a vital feature, with DLSS significantly alleviating performance dips. I am hopeful for a similarly bright trajectory for Nvidia ACE, paving the way for its widespread adoption without substantial performance drawbacks.

Hands-On Impressions of NVIDIA Audio2Face, ComfyUI, and ChatRTX AI Tools

While ACE took center stage, other noteworthy AI tools were also showcased, such as Audio2Face within Nvidia Omniverse, along with ComfyUI and ChatRTX.

The Audio2Face tool is developer-focused, utilizing AI to autonomously generate facial animations that align with dialogue. This tool goes beyond lip-syncing by simulating expressions across facial muscles to produce highly realistic animations.

Although the demo didn’t illustrate specific use cases, it holds significant promise for multilingual games, enabling adaptive facial expressions tailored to various audio languages.

John showcased Star Wars Outlaws running with DLSS, which reminded me I need to finish the game (Image via Nvidia || Sportskeeda)

ComfyUI and Nvidia’s ChatRTX deliver experiences akin to other AI tools, but their standout trait is that they run entirely on a local Nvidia GPU, which eliminates server-related latency and makes responses noticeably faster.
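
ChatRTX and ComfyUI ship as packaged applications, so there is nothing of theirs to show here, but as a rough illustration of what “runs entirely on a local GPU” means in practice, the sketch below uses the Hugging Face Transformers library (a different stack from Nvidia’s) to run a small open model on-device; the model choice is an assumption.

```python
# Illustration of fully local GPU inference -- not ChatRTX's internals.
# Uses the Hugging Face Transformers library; the model choice is an assumption.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # any small local model would do
    device=0,  # 0 = first local Nvidia GPU; no cloud round trip involved
)

result = generator("Explain in one sentence what DLSS does:", max_new_tokens=60)
print(result[0]["generated_text"])
```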

Overall, Nvidia’s upcoming suite of AI tools appears highly promising, with the potential to greatly enhance the gaming industry. Personally, I am eager to see Nvidia ACE implemented in upcoming RPG titles, allowing me to immerse myself for hours in richly interactive game worlds.
