PlotMachines AI system writes long-form stories from outlines

In a preprint paper published this week on arXiv.org, scientists at Microsoft, the Allen Institute for Artificial Intelligence, and the University of Washington describe PlotMachines, an AI system that learns to transform outlines into stories by tracking plot threads. PlotMachines — whose code is available on GitHub — could bolster the development of systems capable of writing case studies, news articles, and scripts from nothing but phrases describing characters and events, saving companies time and capital.

While story-, article-, and even lyric-generating AI systems exist, they’re mostly tailored to specific domains and adapt poorly to new tasks. Moreover, they’re not particularly skilled at long-form writing; even the most sophisticated models forget plot elements and repeat themselves.

Composing a story requires keeping track of a plot that weaves through characters and events in a coherent narrative, as the researchers explain. This isn’t easy for machines. Because the input provides only rough elements of the plot, it’s incumbent on a model to flesh out how the elements intertwine across different parts of the story.

In the course of developing PlotMachines, the team built new training corpora on top of existing story data sets, pairing each target narrative with an automatically constructed input outline:

  • Wikiplots, a corpus consisting of movie, TV, and book plots scraped from Wikipedia.
  • WritingPrompts, a story generation data set collected from the subreddit r/WritingPrompts.
  • NYTimes, a data set containing news articles.
  • Outline Extraction, lists of plot points drawn from the Wikiplots, WritingPrompts, and NYTimes texts with an automatic extraction algorithm (a generic sketch of the idea follows this list).
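
The article does not name the extraction algorithm, so the following sketch only illustrates the general idea of distilling a plot summary into outline points by picking frequent content-word phrases; the function, stopword list, and sample plot are all hypothetical.

    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "with", "but", "from", "his", "her"}

    def extract_outline(plot_text, num_points=5):
        # Pick the most frequent two-word phrases that avoid stopwords.
        words = re.findall(r"[a-z']+", plot_text.lower())
        bigrams = [
            f"{first} {second}"
            for first, second in zip(words, words[1:])
            if first not in STOPWORDS and second not in STOPWORDS
        ]
        return [phrase for phrase, _ in Counter(bigrams).most_common(num_points)]

    plot = ("Westley returns as the Dread Pirate Roberts to rescue Buttercup. "
            "Prince Humperdinck plots against Buttercup, but Westley and his "
            "companions storm the castle and rescue Buttercup from Humperdinck.")
    print(extract_outline(plot))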

The researchers next designed PlotMachines, which they describe as a Transformer built on top of OpenAI’s GPT model. Like all neural networks, Transformers contain functions (neurons) arranged in layers that transmit signals from input data and adjust the strength (weights) of the connections between them during training. But Transformers also have attention, meaning every output element is connected to every input element and the weightings between them are calculated dynamically.
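
To make the attention idea concrete, here is a minimal sketch of scaled dot-product attention in NumPy. It illustrates only the mechanism described above; it is not code from PlotMachines or GPT, and the function and variable names are invented for the example.

    import numpy as np

    def scaled_dot_product_attention(queries, keys, values):
        """Each output is a dynamically weighted mix of all inputs."""
        d_model = queries.shape[-1]
        # Similarity between every output (query) position and every input (key) position.
        scores = queries @ keys.T / np.sqrt(d_model)
        # Softmax turns the similarities into attention weights that sum to 1 per row.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        # Weighted combination of the value vectors yields the attended output.
        return weights @ values

    # Example: self-attention over 4 tokens with 8-dimensional representations.
    tokens = np.random.randn(4, 8)
    print(scaled_dot_product_attention(tokens, tokens, tokens).shape)  # (4, 8)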

Above: A story generated by PlotMachines.

Given an outline as input, PlotMachines writes five paragraphs — an introduction, three body paragraphs, and a conclusion — and updates a memory matrix that keeps track of plot elements from the outline. Per-paragraph discourse information helps maintain stylistic differences at the beginning, middle, and end of stories (as does memory that observes what’s been written so far), while context representation ensures previous elements are used in the creation of new paragraphs.
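
The loop below is a rough sketch of that write-then-update cycle as described in this paragraph, with a trivial stand-in for the actual GPT-based generator; the toy encoder, the gated memory update, and the discourse labels are illustrative assumptions, not the authors’ implementation.

    import numpy as np

    DISCOURSE_TAGS = ["intro", "body", "body", "body", "conclusion"]  # illustrative labels

    def encode(text, dim=16):
        # Toy bag-of-words encoder standing in for the model's hidden states.
        vec = np.zeros(dim)
        for word in text.split():
            vec[hash(word) % dim] += 1.0
        return vec

    def update_memory(memory, paragraph, gate=0.5):
        # Gated blend of the old memory with the newly written paragraph.
        return gate * memory + (1.0 - gate) * encode(paragraph, memory.shape[-1])

    def generate_paragraph(outline, tag, memory, context):
        # Stand-in generator; the real model conditions on all four inputs.
        return f"[{tag}] paragraph conditioned on outline: {', '.join(outline)}"

    def generate_story(outline, dim=16):
        memory = encode(" ".join(outline), dim)   # memory seeded from the outline
        context = np.zeros(dim)                   # running record of what has been written
        story = []
        for tag in DISCOURSE_TAGS:
            paragraph = generate_paragraph(outline, tag, memory, context)
            memory = update_memory(memory, paragraph)
            context = context + encode(paragraph, dim)
            story.append(paragraph)
        return story

    for paragraph in generate_story(["nuclear weapons test", "desert town", "young reporter"]):
        print(paragraph)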

Qualitatively, the researchers say PlotMachines learned after training to start stories by setting the scene (e.g. “In the early 1950s, a nuclear weapons testing continues …”) and to end with a definitive closing action (e.g. “… the film ends with humperdinck and buttercup riding off into the sunset”). In fact, they found a news-generating PlotMachines model trained on the NYTimes corpus so capable that they plan to share it only selectively with the research community, to prevent malicious actors from creating and spreading misleading stories.

In experiments, a variation of the PlotMachines model built atop OpenAI’s GPT-2 architecture, which contained 460 million parameters (variables) in total, achieved better Recall-Oriented Understudy for Gisting Evaluation (ROUGE) and BLEU scores than several baselines. Both metrics measure n-gram overlap with reference texts and served here as automatic proxies for how closely the generated stories tracked the target narratives. In two separate evaluations involving human teams tasked with reading and reviewing PlotMachines-generated stories, the model also outranked the baselines in categories like “narrative flow” and “outline usage.”
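
For readers unfamiliar with these metrics, the snippet below shows the kind of n-gram overlap scoring BLEU performs (ROUGE works on a similar, recall-oriented principle); it is a generic illustration using NLTK, not the paper’s evaluation code.

    from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

    reference = "the film ends with the couple riding off into the sunset".split()
    generated = "the film ends with them riding into the sunset".split()

    # BLEU counts n-gram matches between the generated text and the reference;
    # smoothing avoids zero scores when longer n-grams have no matches at all.
    score = sentence_bleu([reference], generated,
                          smoothing_function=SmoothingFunction().method1)
    print(f"BLEU: {score:.3f}")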

“We propose the task of outline-conditioned story generation: Given an outline as a set of phrases that describe key characters and events to appear in a story, the task is to generate a coherent narrative that is consistent with the provided outline … This requires the model to keep track of the dynamic states of the latent plot, conditioning on the input outline while generating the full story,” wrote the coauthors. “Analysis shows that PlotMachines is effective in composing tighter narratives based on outlines.”


Author: Kyle Wiggers.
Source: VentureBeat
