While we can interact with some NPCs in games, the communication is mostly canned: players choose from scripted questions, and the NPCs reply according to the script. But what if you could talk to AI-controlled NPCs with your voice and have them process your request and act accordingly?
At CES earlier this month, TiGames unveiled ZooPunk, a futuristic game world built with NVIDIA ACE and GeForce RTX GPUs. The game lets players communicate with NPCs and get them to perform tasks like customizing their ship. ZooPunk gives us a glimpse into the future of generative AI in games.
In the demo posted on January 8, TiGames’ CEO and founder, Zhang Tao, was seen communicating with an NPC in ZooPunk using the microphone in his DualSense controller. He used the feature to customize his landing ship in the game, but it could find many more use cases.
“We call our game ZooPunk,” Tao said. “It combines animal characters and a dieselpunk aesthetic. A key location in ZooPunk is Tron’s Kitchen, a secret base that floats in the sky.”
ZooPunk follows the adventure of a heroic rabbit called Rayton. When Rayton met a monkey-like NPC in charge of ship customization in Tron’s Kitchen, instead of the usual prompt to select from a list of questions, Tao simply spoke into the microphone of his DualSense controller.
“Using a combination of several AI technologies, we’ve crafted an NPC that can communicate naturally with the player,” Tao said and went on to explain how the technology works.
“A speech model listens to the player’s voice and translates their words into text. Then a Language Model parses the text to understand the player’s intention and generate the NPC response.
“We also use AI to give a voice to the NPC and animate their facial expressions with lip sync using NVIDIA ACE. With the power of GeForce RTX GPUs, all these models run on the user’s PC, not in the cloud.”
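Tao’s description maps to a simple pipeline: speech-to-text, intent parsing, response generation, then voice and lip-sync output. Here is a minimal sketch of that flow in Python, with hypothetical stand-in functions in place of the real on-device models; none of the function names or logic below come from NVIDIA ACE’s actual API.

```python
# Sketch of the NPC dialogue loop Tao describes. Every function body here
# is a hypothetical placeholder; a real build would call local ASR, LLM,
# and TTS models running on the GPU.

def transcribe(audio: bytes) -> str:
    """Speech-to-text stage (stand-in: treats the bytes as UTF-8 text)."""
    return audio.decode("utf-8")

def parse_intent(text: str) -> dict:
    """Language-model stage: extract the player's intention from the text."""
    lowered = text.lower()
    if "paint" in lowered or "color" in lowered:
        return {"intent": "customize_ship", "detail": text}
    return {"intent": "chat", "detail": text}

def npc_reply(intent: dict) -> str:
    """Generate the NPC's line; a real build would prompt an LLM here."""
    if intent["intent"] == "customize_ship":
        return "Sure, let's repaint your landing ship."
    return "What can I do for you?"

def handle_player_voice(audio: bytes) -> str:
    """Full loop: player's voice in, NPC line out (TTS/lip sync omitted)."""
    return npc_reply(parse_intent(transcribe(audio)))
```

For example, `handle_player_voice(b"Paint my ship red")` would route through the customization branch, while any other phrase falls back to small talk. The point of the sketch is the staged structure, which is what lets each model be swapped independently, not the toy keyword matching.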
Tao then customized his landing ship using just his voice. The customization process works like any other AI image generator: you describe what you have in mind and the AI brings it to life, except that this time you describe it with your voice rather than with blocks of text.
Poking holes in the generative AI of TiGames’ ZooPunk
Several people pushed back against the technology, as the nasty comments left under the video show. The reasons were largely the same ones raised against most AI use cases in games: AI taking the jobs of real creators, and AI making NPCs feel more lifeless.
Some petty comments focused on Tao’s mispronunciation of certain words he spoke into the controller’s microphone and on his imperfect English. We can ignore the low-effort comments and focus on the real issues.
The issue of AI taking creators’ jobs is genuinely unfortunate, especially because those creators’ works helped train the AI models, directly or indirectly. Sadder still, many of them were never compensated or acknowledged for it.
In 2023 and 2024, the video game industry witnessed unprecedented waves of layoffs and studio closures, which were partly blamed on the incursion of generative AI into the game development process. Suddenly, many corporations, especially those focused on maximizing profits, no longer see the need for large payrolls.
Those pushing back against the way TiGames has used AI in ZooPunk’s NPCs also have a strong argument. The traditional process of crafting a character or customizing an object, although sometimes tedious, makes the player feel involved. It gives the player a sense of ownership over the character or whatever they have customized.
However, when AI takes the player’s idea and runs with it, it detaches the player from the customization process. That said, players who have always seen character or item customization as a waste of time probably wouldn’t mind handing it off to gen-AI to get it done faster.
Is there a place for gen-AI in the making of video game NPCs?
Those in favor of using gen-AI in video game NPCs argue that it can make these characters more lifelike in their responses. In some games, a gun battle can be raging while NPCs stroll around as if nothing is happening.
“Thanks to the power of AI and NVIDIA ACE, players can now customize items in the game to express their individuality,” Tao explained. “This changes everything for user-generated content in games.”
“The player doesn’t need to be a great artist. The only limit is their imagination!”
Several other developers, including Ubisoft and Rockstar, have previously declared interest in using AI in games, particularly to bring NPCs to life. In 2023, Xbox announced a multi-year partnership with Inworld AI.
In 2024, Sony hinted that it could use generative AI to speed up the game development process. Several other partnerships have been announced in the past two years. And it is highly unlikely that big developers will cut back on their spending on AI.
So, hate it or love it, generative AI has become part of video game development. Interestingly, the use of AI in games dates back decades, to 16-bit games with randomized levels. The scale of its use will only expand as the technology evolves, and bashing companies in their comment sections will hardly be enough to stop that evolution.
Are you for or against the use of generative AI in all its forms in video games? Share your thoughts with us in the comment box below.