How the Creator of AI Dungeon 2 Used GPT-2 To Create Neverending Adventure Games
Imagining Infinite Video Games
As a kid, I played text-based adventures on my family’s TI-82 with a cassette tape data drive. I have vivid memories of Scott Adams’ classic Pirate Adventure game, an early text-based video game.
The gameplay was simple. You start by standing in a London flat, and must navigate your way to a mysterious island by typing simple commands like “west,” “climb stairs,” or “help.” When you typed “get rum,” the game responded: “There’s a strange sound… I think it’s me. Hee Hee.”
Back in those good old days, I constantly crashed into the narrative constraints of early text-based games. The phrases “I don’t know what that is” and “I can’t go in that direction” still haunt me to this day, tracing the hard limits of an imaginary world created and coded by human writers.
Meet the Creator of AI Dungeon 2
Game developer Nick Walton released AI Dungeon 2 last week, using the full 1.5-billion-parameter version of OpenAI’s powerful GPT-2 language model to build an infinite text-based fantasy game.
In AI Dungeon 2, I played a wizard exploring a dangerous spell library and then my daughter played a noblewoman defending her castle from hordes of invading orcs.
The game never told us “I don’t know what that is” or “I can’t go in that direction.” Thanks to the powerful GPT-2 story engine, AI Dungeon 2 always tried to generate a new scene or a new dialogue, no matter how strange our request.
Personally, I am a sucker for spell books and magical libraries. So I spent a long time holed up reading books inside a ruined castle.
At the moment, you can play the game at this Google Colab link.
As Nick explains on the site, the game is in transition as he copes with “insane download fees” that forced him to seek a new distribution model:
“We are using bittorrent as a temporary solution to host game files and keep this game alive. It’s not fast, but it’s the best we’ve got right now. If you want to help, best thing you can do is to download this torrent file with game files and seed it indefinitely to the best of your ability. This will help new players download this game faster, and discover the vast worlds of AIDungeon2!”
I spoke with Nick about the hardware, dataset, and programming behind the endless AI-generated game world of AI Dungeon 2.
Since our interview, Nick updated his Patreon page with news about an upcoming app version with a paid subscription model.
“The alpha version of the app is working and we have the infrastructure to run the model on the cloud. We need another day or two of testing before we start beta, but we hope to start by the end of the week.”
Here’s the complete text of our interview about the game…
How long did it take to create your training dataset?
The dataset was about 30 MB of text adventure stories that I web scraped from chooseyourstory.com. It took me maybe 20 hours or so to build the web scraper, run it and curate the dataset. I wanted to make sure all the data that I used was in the right format.
Advice that I would give, especially in text, is that (to a degree) a smaller amount of high-quality data is more valuable than a larger amount of low-quality data.
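A curation pass of the kind Nick describes might look like the sketch below. The thresholds, the story format check, and the function name are my illustrative assumptions, not his actual code; the only detail taken from the interview is that the data needed to be in the right format, with action lines marked by a “>” symbol.

```python
def curate(stories, min_chars=500):
    """Keep only scraped stories that look like well-formed text adventures.

    A story is kept if it is long enough and contains at least one action
    line (a line starting with ">"), the format used for fine-tuning.
    Both criteria here are illustrative guesses, not AI Dungeon 2's code.
    """
    kept = []
    for story in stories:
        text = story.strip()
        has_actions = any(line.startswith(">") for line in text.splitlines())
        if len(text) >= min_chars and has_actions:
            kept.append(text)
    return kept
```

The point of a filter like this is exactly the advice above: dropping short or malformed stories shrinks the dataset but raises its average quality.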
What coding steps did you take to maintain context throughout a GPT-2 generated story?
I used a DGX-1, which contains 8 high-powered GPUs, to train. The model that I ended up using took around 12–16 hours to train.
I’ve experimented with this for a really long time and tried a lot of things.
Finetuning on the right text adventure data was probably the most important thing I did to keep the narrative straight.
Increasing the memory has also helped. In the past, when the memory was closer to 2 (the last 2 action-result pairs), the context sentence was much more important; now, with the memory set to 10, I’m not sure it actually matters.
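The memory Nick describes can be sketched as a sliding window over action-result pairs. The function and variable names here are my assumptions; only the window size of 10 and the “>” action-line format come from the interview.

```python
def build_prompt(intro, history, memory=10):
    """Assemble the model's context from the story intro plus the last
    `memory` action-result pairs.

    history: list of (action, result) tuples, oldest first.
    """
    recent = history[-memory:]  # keep only the most recent pairs
    lines = [intro]
    for action, result in recent:
        lines.append("> " + action)
        lines.append(result)
    return "\n".join(lines)
```

With a window this small, older events simply fall out of the prompt, which is why the hand-written context sentence mattered more when the memory was only 2 pairs.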
I also do a decent amount of modifying the player’s input to be in the right format, and cutting off the model’s output so that it doesn’t contain action lines (denoted by the “>” symbol).
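That pre- and post-processing can be sketched as follows. The helper names and the exact normalization are my assumptions; the source only confirms that player input is reformatted and that generated text is cut off before any “>” action lines.

```python
def format_action(player_input):
    """Normalize the player's raw input into the '> ...' action format
    the fine-tuned model expects (normalization details are assumed)."""
    return "> " + player_input.strip().lower()

def trim_output(generated):
    """Cut the model's continuation at the first action line it tries to
    write itself, so the model never speaks for the player."""
    kept = []
    for line in generated.splitlines():
        if line.lstrip().startswith(">"):
            break
        kept.append(line)
    return "\n".join(kept).strip()
```

Trimming at the first “>” is the simplest way to stop GPT-2 from continuing the transcript on the player’s behalf, since the fine-tuning data taught it that stories alternate between action lines and narration.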
You also modified your model to cut down on repetition in GPT-2 output. Could you explain that?
In the Salesforce CTRL model, they add a penalty to the probability of generating a word that has already been generated, dividing its log odds by something like 1.2.
This helps keep the model from getting stuck saying the same word or set of words. Because of that, the CTRL model was able to use a much lower temperature than GPT-2 (which needed a high temperature to prevent repetition). By adding this penalty to GPT-2, I was able to reduce the temperature down to 0.4, which helps coherence while still avoiding repetition (for the most part).
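The penalty described above can be sketched over raw logits like this. The θ = 1.2 divisor and the 0.4 temperature come from the interview; the sign handling for negative logits is my addition (dividing a negative logit would raise its probability rather than lower it), so treat this as an illustrative scheme rather than AI Dungeon 2’s exact code.

```python
import numpy as np

def penalized_probs(logits, generated_ids, theta=1.2, temperature=0.4):
    """Downweight tokens that were already generated, then convert the
    penalized logits into sampling probabilities at a low temperature.

    A positive logit is divided by theta; a negative one is multiplied,
    so the penalty always reduces the token's score.
    """
    logits = logits.astype(float).copy()
    for i in set(generated_ids):
        logits[i] = logits[i] / theta if logits[i] > 0 else logits[i] * theta
    z = logits / temperature
    z -= z.max()              # subtract max for numerical stability
    probs = np.exp(z)
    return probs / probs.sum()
```

Because repeated tokens are actively suppressed, the temperature can be dropped to 0.4 for coherence without the sampler collapsing into loops of the same phrase.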
How do you feel about copyright and intellectual property with respect to the stories generated by your game and your fine-tuned model?
Just as humans learn much about how to write from other authors, I’m fine with models being trained on other people’s work to learn how to write better. In terms of what’s been output by a model, I have no problem with people using it to write fantasy novels or posting their adventures online.
AI Dungeon 2 isn’t a solo storyteller. The stories people post are funny not just because of the AI, but also because of how humans interact with it to create interesting and funny stories.
What genres or other formats do you hope to add to your dataset for future iterations?
I’d certainly like to add more genres to it, but the next thing I’ll be working on is getting a better hosting solution and finishing my reverse AI Dungeon mode, where the AI is the player and the human is the Dungeon Master.