Could you imagine a game with 5 million recorded lines of dialogue? What would such a game look and feel like? How many characters and NPCs would be required to populate a world so alive? How would teams keep track of so many lines, stories, and characters, and how would they record that many lines within the time it takes to ship a title?
Starfield has over 250,000 lines of dialogue, more than twice the 111,000 lines of Fallout 4. GTA V has over 160,000 lines. Red Dead Redemption 2 features 1,200 actors, 700 of whom share its 500,000 lines of voiced dialogue. These are impressive numbers, and the teams behind these games deserve massive respect!
The eventual Metaverse will birth new genres of games and experiences.
Today we are familiar with fast-paced multiplayer shooters like Fortnite, PUBG, or Apex Legends, and with Minecraft-like worlds. We believe a new category of virtual experience is just around the corner, one that will rise to immense popularity and be known for its depth of narrative and endless storytelling, for having hundreds of characters, and for ever-expanding quests and stories.
Games like Red Dead Redemption 2, The Witcher 3, Grand Theft Auto and the Assassin’s Creed titles have given us a glimpse into what this might look like, but these games have thus far been confined to the lofty realms of AAA studios that can afford massive teams and lengthy production cycles.
We see a future where nimble independent studios can iteratively build live experiences, leveraging the power of AI, to bring this kind of storytelling to life with just as much immersion as the big studios.
To realise this dream, we need a new, sustainable system for populating live games with living characters that add to the immersion. The current approach of filming and recording live actors does not scale.
We think one way to ship games with 10x or 100x more voice acting is to combine AI voices with generative AI models like ChatGPT, so that some percentage of the ‘voice acting’ is performed autonomously by NPCs responding directly to player actions and commands in the game.
For narrative and game design teams to trust the responses of these autonomous AI Smart NPCs, the NPC ‘brains’ would need to be fine-tuned on the lore of the game: the characters, backstories, and motivations. Teams would also need to set conditions that decide when an NPC delivers a pre-scripted line versus an AI-generated one. We see the role of writers and narrative designers eventually encompassing the management and fine-tuning of the memory and context conditions that power AI Smart NPCs, in addition to the writing and narrative design work they do today.
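To make the scripted-versus-generated decision concrete, here is a minimal sketch of how such a condition system might look. Everything here is hypothetical: the `NPCBrain` class, the trigger-matching rule, and the `generate_fn` callback are illustrative assumptions, not the actual Smart NPC plugin API.

```python
from dataclasses import dataclass

@dataclass
class NPCBrain:
    """Hypothetical sketch of an AI Smart NPC 'brain' that chooses between
    designer-authored dialogue and AI-generated dialogue."""
    scripted_lines: dict  # trigger phrase -> pre-scripted, authored line
    lore_context: str     # lore / backstory used to ground generated replies

    def respond(self, player_input: str, generate_fn) -> str:
        # Designer-set conditions win first: critical story beats must
        # always use the pre-scripted line, never a generated one.
        for trigger, line in self.scripted_lines.items():
            if trigger in player_input.lower():
                return line
        # Otherwise fall back to a generated response grounded in the lore.
        prompt = f"{self.lore_context}\nPlayer: {player_input}\nNPC:"
        return generate_fn(prompt)

# Usage: a guard NPC with one scripted beat; generate_fn is stubbed here,
# but would call a language model in a real pipeline.
guard = NPCBrain(
    scripted_lines={"password": "I cannot reveal that, citizen."},
    lore_context="You are a weary gate guard in a walled city.",
)
```

In a shipping game the trigger matching would likely be richer (intent classification, quest state, memory of past conversations), but the shape of the decision stays the same: conditions authored by the narrative team gate when generation is allowed to run.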
Replica’s vision is to offer Unreal Engine 5-like capabilities for voice acting.
Our team was inspired by The Matrix Awakens experience when Epic Games released it last year. The impact of UE5 still rings clear: now even a tiny game studio can match the visual fidelity of a well-funded AAA team. At Replica, we are excited to be working towards a future where even small studios can match the storytelling fidelity and character depth of well-funded AAA teams.
So today, we are thrilled to share the work we’ve been doing. Inspired by The Matrix Awakens demo, we are releasing an early Matrix demo powered by Replica Studios AI Smart NPCs.
You can experience the demo for yourself on a modest Windows PC.
Sign up to the waitlist to be notified when we release the Smart NPC Plugin.
Sign up here
Our playable demo builds on Unreal Engine’s Matrix Awakens sample project and shows off some of the features that will be available in the Smart NPC plugin, which we will release later this year. Features of the playable demo include the following:
Talk to any NPC and they respond in real time
Using your microphone, talk to NPCs and they will respond with dialogue in real-time
AI voices with emotion
NPC responses utilise Replica’s range of emotions and adapt in real-time.
Scale unique NPC characters for your world
NPC context and backstories can be customised for a unique tailored experience (not available in the playable demo).
NPCs can talk to each other
Ambient NPC interaction is in-built, so NPCs will converse with each other intelligently as well.
Automated lip-sync and body gestures
The demo includes a customised blend-shape setup driven by phoneme timing output, producing accurate real-time lip sync. Animation blueprints also provide suitable body gestures.
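To illustrate the idea of driving blend shapes from phoneme timing output, here is a minimal sketch. The phoneme-to-viseme table, the blend window, and the function itself are illustrative assumptions, not the demo's actual implementation (which lives in Unreal animation blueprints).

```python
# Simplified, assumed mapping from phonemes to viseme blend shapes.
PHONEME_TO_VISEME = {
    "AA": "jaw_open", "IY": "smile_wide", "UW": "lips_round",
    "M": "lips_closed", "F": "lip_bite",
}

def viseme_at(timings, t):
    """Return the (blend_shape, weight) active at time t in seconds.

    timings: list of (phoneme, start, end) tuples, as produced by a
    TTS engine's phoneme timing output. The weight ramps in and out
    linearly over a short window so shapes don't pop between phonemes.
    """
    BLEND = 0.04  # 40 ms ease in/out window, an assumed smoothing value
    for phoneme, start, end in timings:
        if start <= t < end:
            shape = PHONEME_TO_VISEME.get(phoneme, "jaw_open")
            ease_in = min(1.0, (t - start) / BLEND)
            ease_out = min(1.0, (end - t) / BLEND)
            return shape, min(ease_in, ease_out)
    return None, 0.0  # silence: no viseme active
```

Each rendered frame, the engine would sample `viseme_at` at the current audio playback time and feed the weight into the corresponding blend-shape curve; the same timing data can also gate when body-gesture animations fire.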