For a recovering gamer like me, one of the most exciting applications of generative AI is dynamic dialogue. I’m not suggesting AI replace writers — goodness forbid. But as anyone who’s sunk hundreds of hours into an RPG can tell you, scripted NPC interactions get old fast.
There are a few startups prototyping AI tech to dynamically generate dialogue. But one of the more promising is Inworld, launched in 2021 by the founding team of API.AI, which developed tools for speech recognition and natural language understanding until its acquisition by Google in 2016. (API.AI later became Dialogflow, Google’s flagship conversational AI design platform.)
Inworld claims to use “multiple” machine learning models to “mimic the full range of human communication.” That’s promising a lot in the context of games, but the startup makes the case that, by allowing developers to link its dialogue- and voice-generating tools to animation and rigging systems within popular game engines and 3D environments, it can help deliver more lifelike and immersive gaming experiences.

“The world’s interest in creative applications of AI is rapidly expanding, and Inworld stands in a unique place to be the powerhouse behind the next generation of interactive entertainment,” Kylan Gibbs, Inworld’s chief product officer and a co-founder alongside Ilya Gelfenbeyn and Michael Ermolenko, told TechCrunch via email.
NPCs powered by Inworld’s tech can learn and adapt to new situations, Gibbs adds, navigating chats with memory and recall. (Think an NPC that remembers a player likes soccer, for example, or has expressed a strong dislike for another character.) They can also autonomously initiate goals and perform actions, adding an element of the unexpected to game experiences.
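To get a feel for what that memory-and-recall loop might look like conceptually, here’s a minimal TypeScript sketch. The `NpcMemory` class and its methods are purely illustrative assumptions, not Inworld’s actual API.

```typescript
// Illustrative only: a toy per-player memory store for an NPC, not Inworld's API.
// The names (NpcMemory, remember, recall) are hypothetical.
interface MemoryEntry {
  topic: string;                              // e.g. "soccer", "rival_character"
  sentiment: "likes" | "dislikes" | "neutral";
  notedAt: Date;
}

class NpcMemory {
  private entries = new Map<string, MemoryEntry>();

  // Record something the player revealed during conversation.
  remember(topic: string, sentiment: MemoryEntry["sentiment"]): void {
    this.entries.set(topic, { topic, sentiment, notedAt: new Date() });
  }

  // Surface a remembered fact so it can be fed into the next dialogue prompt.
  recall(topic: string): MemoryEntry | undefined {
    return this.entries.get(topic);
  }
}

// Usage: the NPC later brings up that the player likes soccer.
const memory = new NpcMemory();
memory.remember("soccer", "likes");
console.log(memory.recall("soccer")?.sentiment); // "likes"
```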
Customers can create personalities for Inworld’s NPCs by describing them in natural language, and map each NPC’s “emotions” to goals and custom-defined triggers. Beyond this, users can input “personal knowledge” to control the information an NPC should or shouldn’t know (e.g., shared lore, world contexts and backgrounds).

Inworld NPCs can optionally be set to gather information like a player’s name, role or gender, plus game-specific elements like level or faction. And, depending on how their “relationship fluidity” setting is configured, they can be encouraged to act in an outwardly friendly or hostile way.
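As a rough illustration of the kind of character definition described above, here’s a hedged TypeScript sketch. The `CharacterConfig` fields are hypothetical stand-ins for the concepts mentioned, not Inworld’s actual schema.

```typescript
// Illustrative only: the rough shape of a character definition as described above.
// Field names are hypothetical and don't reflect Inworld's actual configuration format.
interface CharacterConfig {
  // Free-form, natural-language description of the personality.
  description: string;
  // Map "emotions" to goals and custom-defined triggers.
  emotionTriggers: { emotion: string; trigger: string; goal: string }[];
  // "Personal knowledge": lore and world context the NPC is allowed to draw on.
  personalKnowledge: string[];
  // Player attributes the NPC may gather during play.
  playerFields: ("name" | "role" | "gender" | "level" | "faction")[];
  // How quickly the NPC's attitude toward the player can shift.
  relationshipFluidity: "low" | "medium" | "high";
}

const innkeeper: CharacterConfig = {
  description: "A weary but warm innkeeper who distrusts the city guard.",
  emotionTriggers: [
    { emotion: "suspicion", trigger: "player mentions the guard", goal: "change the subject" },
  ],
  personalKnowledge: [
    "The inn was built after the Great Fire.",
    "Rooms cost five silver a night.",
  ],
  playerFields: ["name", "faction"],
  relationshipFluidity: "medium",
};
```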
Generative AI’s tendency to go off the rails might give some developers and companies pause. But Inworld claims that its safety tech, including controls for profanity, bias and toxicity, keeps characters “on brand.” (Inworld allows flexibility around topics like profanity, violence, adult content, alcohol, substance use, politics and religion, but doesn’t permit things like hate speech or encouraging self-harm.) Inworld also offers a tool, called 4th Wall, that attempts to preserve immersion by preventing NPCs from talking about locations, people, social constructs, professions and time periods outside a game’s lore.
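For a sense of what a 4th Wall-style guard might do conceptually, here’s a deliberately naive TypeScript sketch. The `breaksImmersion` function and its keyword list are illustrative assumptions, not Inworld’s implementation, which presumably relies on far more sophisticated classification.

```typescript
// Illustrative only: a naive out-of-lore topic check in the spirit of the
// "4th Wall" feature described above; not Inworld's implementation.
const outOfLoreTopics = ["smartphone", "new york", "programmer", "21st century"];

// Flag candidate dialogue lines that mention places, professions or time
// periods outside the game's fiction so they can be regenerated or rewritten.
function breaksImmersion(line: string): boolean {
  const lower = line.toLowerCase();
  return outOfLoreTopics.some((topic) => lower.includes(topic));
}

console.log(breaksImmersion("Have you seen my smartphone?"));                  // true
console.log(breaksImmersion("The dragon was last seen near the old mill."));   // false
```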
The promise of what Inworld’s creating led Lightspeed Venture Partners to invest more than $50 million in the startup as part of Inworld’s most recent funding tranche, announced today. Stanford University, Samsung Next, Microsoft’s M12 fund and Eric Schmidt’s First Spark Ventures also participated, bringing Inworld’s total raised to more than $100 million at a $500 million post-money valuation.
That’s on top of investments from Disney as part of the 2022 Disney Accelerator and a grant from Epic to integrate Inworld’s platform with Epic’s Unreal Engine.