Yes, it’s easy. You just type a short story into a word processor, press the print button and out it comes. The French publisher Short Édition has installed story vending machines in cities and universities around the world.
At the touch of a button the machine prints a short story, free of charge, on eco-friendly paper the width of a toilet roll. Readers can choose from stories that take one minute, three minutes, or five minutes to read.
It’s a great idea, but these stories are written by humans, not computers. Could a computer create new, original stories?
Well, you could chop already-written stories into smaller pieces – single events, dialogues, descriptions – then code a computer program to select some at random and string them together, slotting in consistent characters throughout. Here’s an example we prepared earlier:
I had called upon my friend, Mr Quentin Hall, one day in the autumn of last year and found him deep in conversation with a very stout florid-faced elderly gentleman, with fiery red hair. My friend rose lazily from his arm-chair and stood with his hands in the pockets of his dressing gown. I was surprised. I looked at the clock. Both Hall and I had a weakness for the Turkish Bath. As Hall turned up the lamp a light fell upon a card on the table. In his right hand he had a slip of litmus paper. Then he stood before the fire, and looked me over in his singular introspective fashion.
“You have a case, Hall?”, I remarked.
“Very sorry to knock you up, Wilberforce,” said he, “but it’s the common lot this morning.”
“My dear fellow, I wouldn’t miss it for anything.”
We chose, at random, one opening sentence from the collection of Sherlock Holmes short stories, followed by random sentences from the first paragraphs of other stories in the collection, then some random pieces of opening dialogue, with the names of the characters altered. Although the passage makes little overall sense, the style is still clearly that of Sir Arthur Conan Doyle.
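This cut-up technique can be sketched in a few lines of code. The fragment pools below are invented stand-ins, not real Conan Doyle text, and the slot names {D} and {N} are our own convention for the detective and narrator:

```python
import random

# Toy fragment pools (invented placeholders, not real story text),
# with {D} and {N} marking the detective and narrator slots.
openings = [
    "I had called upon my friend, {D}, one day in the autumn of last year.",
    "It was on a bitterly cold morning that {D}'s telegram reached me.",
]
descriptions = [
    "{D} rose lazily from his arm-chair, hands deep in his dressing-gown pockets.",
    "A light fell upon a card that lay face down on the table.",
    "He stood before the fire and looked me over in his singular introspective fashion.",
]
dialogue = [
    '"You have a case, {D}?" I remarked.',
    '"Very sorry to knock you up, {N}," said he, "but it\'s the common lot this morning."',
]

def cut_up_story(detective, narrator):
    """One random opening, two random descriptions, then dialogue,
    with the same character names slotted in throughout."""
    parts = [random.choice(openings)]
    parts += random.sample(descriptions, k=2)
    parts += dialogue  # keep dialogue in order for a question-and-answer feel
    return " ".join(parts).format(D=detective, N=narrator)

print(cut_up_story("Hall", "Wilberforce"))
```

Each run produces a different pastiche, but the same character names thread through it, which is what gives the output its surface coherence.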
The sense of harmony that comes from writing in an identifiable style was one of the ploys used in Sheldon Klein’s Automatic Novel Writer program created in the 1970s. Klein claimed that his program could produce 2100-word murder mystery stories in less than 20 seconds. Here are the first 80 or so words from one of its offerings.
Wonderful smart Lady Buxley was rich. Ugly oversexed Lady Buxley was single. John was Lady Buxley’s nephew. Impoverished irritable John was evil. Handsome oversexed John Buxley was single. John hated Edward. John Buxley hated Dr. Bartholomew Hume. Brilliant Hume was evil. Hume was oversexed. Handsome Dr. Bartholomew was single. Kind easygoing Edward was rich. Oversexed Lord Edward was ugly. Lord Edward was married to Lady Jane. Edward liked Lady Jane. Edward was not jealous. Lord Edward disliked John. Pretty jealous Jane liked Lord Edward.
The program followed the flow of a stereotypical mystery story, introducing some characters at an English country manor then progressing through a flirtation between two characters, love making, threats, and a murder. For each scene it chose from a stock of pre-prepared sentences, giving consistent names for the characters. The Automatic Novel Writer undoubtedly produced prose in the style of murder mysteries, but the stories it told were rambling and tedious. The extract certainly doesn’t entice you to read the remaining 2,000 words of the story.
That’s not surprising. Authors don’t just pluck out phrases at random. They form them into a logical order to make an interesting and coherent plot.
Klein’s program had a notion of plot, in its “murder flow-chart”, but that was very rigid. Its language was stilted and repetitive. What’s needed is a way to describe the structure of a whole set of plots that can then be used as a source for varied story structures.
The picture shows a plot generator for science fiction stories. The Science Fiction Horror Movie Pocket Computer was first published in 1971 in National Lampoon, an American humour magazine and a somewhat unlikely place for an article on automated creativity. By following the arrows from top to bottom of the diagram you can produce variations on a sci-fi theme. Here’s one of many:
Earth scientists discover giant reptiles which are friendly but misunderstood and are radioactive and cannot be killed by the Army, Navy, Marine Corps and/or coastguard so scientists invent a weapon which fails but they die from catching chicken pox (The End).
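The chart works like a branching grammar: at each stage you pick one of several arrows and move down. Here is a sketch of the idea, with invented options standing in for the published chart:

```python
import random

# A hedged sketch of a branching plot chart in the spirit of the
# Science Fiction Horror Movie Pocket Computer; the options below are
# invented stand-ins, not the published chart.
stages = [
    ["Earth scientists", "Children on a picnic", "Technicians at a radar station"],
    ["discover giant reptiles", "discover a crashed saucer", "discover a glowing meteorite"],
    ["which are friendly but misunderstood", "which crave human flesh"],
    ["and cannot be killed by the Army", "and multiply when attacked"],
    ["so scientists invent a weapon which fails", "so a child befriends them"],
    ["but they die from catching chicken pox (The End).",
     "but they depart for their home planet (The End)."],
]

def generate_plot():
    """Walk the chart top to bottom, choosing one arrow at each stage."""
    return " ".join(random.choice(options) for options in stages)

print(generate_plot())
```

With three or so options at each of six stages, a pocket-sized chart yields hundreds of distinct plots, which is the whole joke of the original.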
If you pay a large sum of money for a course on How to Write a Great Novel, you’ll probably get, along with much good advice from professional writers, some variant of this as a way to come up with outlines for your novel.
But not all stories fit into the same structure, and what about the characters?
TAILOR is a computer program that generates stories in the style of Aesop’s Fables. It is based on the principle that stories arise from a character’s lack or need, but instead of following a story grammar it sets up a need for one of the characters, places the character in a location, and gives it a plan to satisfy that need. For example, the need to find warmth would cause the character to travel in search of a fire. To complicate matters, and provide some interest in the story, the program introduces other characters who try to impede the plan, by concealing objects or offering misleading advice. In effect, the program winds up the clockwork of a needy character, puts the character into a fictitious world, and records what happens. Here’s an example from TAILOR of a story about an arctic tern named Truman who sets off on a quest to build a home, but is thwarted by Horace, the devious polar bear:
Once upon a time there was an arctictern named Truman. Truman was homeless. Truman needed a nest. He flew to the shore. Truman looked for some twigs. Truman found no twigs. He flew to the tundra. He met a polarbear named Horace. Truman asked Horace where there were some twigs. Horace concealed the twigs. Horace told Truman there were some twigs on the iceberg. Truman flew to the iceberg. He found no twigs. Horace walked to the shore. He swam to the iceberg. Horace looked for some meat. He found some meat. He ate Truman. Truman died.
This is a story that makes some sense. There’s a main character, Truman the tern, with a believable goal, to build a nest. Truman carries out actions that match his character and the setting. But the task is complicated by an adversary, Horace, who hides the twigs. Horace also has a need, for meat. He tells Truman there are twigs on the iceberg and, in a final showdown, confronts the hapless bird and eats him. It’s a good basis for a somewhat gruesome tale.
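TAILOR’s need-driven approach can be caricatured in code: give a character a need and a plan, let an adversary interfere with the world, and treat the event log as the story. Everything below (names, locations, rules) is invented for illustration, not TAILOR’s actual machinery:

```python
# A minimal sketch of need-driven storytelling in the spirit of TAILOR:
# wind up a needy character, let an adversary meddle, record what happens.
world = {"twigs": "shore"}   # where objects really are
events = []                  # the story is just the event log

def tell(sentence):
    events.append(sentence)

def seek(character, obj, location):
    """One plan step: travel somewhere and look for an object."""
    tell(f"{character} went to the {location}.")
    if world.get(obj) == location:
        tell(f"{character} found some {obj}.")
        return True
    tell(f"{character} found no {obj}.")
    return False

# Truman's need sets his plan in motion.
tell("Truman needed a nest.")
if not seek("Truman", "twigs", "tundra"):
    # The adversary interferes: Horace hides the twigs and lies.
    world["twigs"] = "hidden"
    tell("Horace told Truman there were twigs on the iceberg.")
    seek("Truman", "twigs", "iceberg")

print(" ".join(events))
```

Even this toy version shows where the drama comes from: the gap between what the character believes and what is true of the simulated world.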
Impressive! How many duff stories did TAILOR have to generate before it came up with that good one?
TAILOR had no means to distinguish an interesting story from a tedious one. That had to be done by the program’s creators, so naturally they only presented the best ones. They don’t say how many bad stories they rejected to find one good one. However, this raises the issue of tellability.
A tellable story should not merely be interesting but it should give the reader some reward for having finished it: an insight into the human condition, a moral, or just a final twist to the plot. The story about Truman and Horace is, arguably, tellable. It has a neat twist at the end where Horace satisfies his need for food by eating the main character.
But the program doesn’t know that. The TAILOR program has no way of evaluating its own tales. It just generates story after story, some interesting, some pointless, with no insight into how they will be received by the reader. TALE-SPIN is an earlier program based on principles similar to TAILOR’s; here is an example of a pointless story it generated.
Once upon a time there was a dishonest fox and a vain crow. One day the crow was sitting in his tree holding a piece of cheese. He became hungry and swallowed the cheese. The fox walked over to the crow. The end.
The story may have some curiosity value, but it’s not tellable. It doesn’t capture the reader’s imagination. It starts well by introducing two characters and their traits, then sets the scene by describing the crow holding a piece of cheese in his tree. Then, instead of using the second character to build tension (perhaps by trying to steal the cheese or offering better food), the story generator simply resolves the crow’s need by having him eat the cheese. The cunning fox arrives too late!
Scott Turner’s MINSTREL program generates short stories of what he called “King Arthur and his knights” type. The program stores a set of templates, each with a well-known moral such as “pride goes before a fall”. It then fills in the detail from a stock of interesting story fragments. If it can’t find an existing fragment to fit a specific slot in the structure, it calls up a general strategy to adapt the fragments until they slot into place. MINSTREL also introduces explicit elements of suspense, tragedy, and characterisation to liven up the plot. Its stories have a moral and some suspense.
MINSTREL, developed in the early 1990s, was a landmark in computer-based story generation. Turner designed it to mimic human creativity, based on an author recalling interesting snippets from past stories and adapting them to fit into a new story that has both a moral and a structure.
THE MISTAKEN KNIGHT
It was the spring of 1089, and a knight named Lancelot returned to Camelot from elsewhere. Lancelot was hot tempered. Once, Lancelot had lost a joust. Because he was hot tempered, Lancelot wanted to destroy his sword. Lancelot struck his sword. His sword was destroyed.
One day, a lady of the court named Andrea wanted to have some berries. Andrea wanted to be near the woods. Andrea moved to the woods. Andrea was at the woods. Andrea had some berries because Andrea picked some berries. Lancelot’s horse moved Lancelot to the woods. This unexpectedly caused him to be near Andrea. Because Lancelot was near Andrea, Lancelot loved Andrea. Some time later, Lancelot’s horse moved Lancelot to the woods unintentionally, again causing him to be near Andrea. Lancelot knew that Andrea kissed with a knight named Frederick because Lancelot saw that Andrea kissed with Frederick. Lancelot believed that Andrea loved Frederick. Lancelot loved Andrea. Because Lancelot loved Andrea, Lancelot wanted to be the love of Andrea. But he could not because Andrea loved Frederick. Lancelot hated Frederick. Andrea loved Frederick. Because Lancelot was hot tempered, Lancelot wanted to kill Frederick. Lancelot wanted to be near Frederick. Lancelot moved to Frederick. Lancelot was near Frederick. Lancelot fought with Frederick. Frederick was dead.
Andrea wanted to be near Frederick. Andrea moved to Frederick. Andrea was near Frederick. Andrea told Lancelot that Andrea was siblings with Frederick. Lancelot believed that Andrea was siblings with Frederick. Lancelot wanted to take back that he wanted to kill Frederick. But he could not because Frederick was dead. Lancelot hated himself. Lancelot became a hermit. Frederick was buried in the woods. Andrea became a nun.
MORAL: Done in haste is done forever.
You can see the plot generator of MINSTREL churning away, moving the story forward in slow ponderous steps, such as “Lancelot knew that Andrea kissed with a knight named Frederick because Lancelot saw that Andrea kissed with Frederick.” The story has tellability, but it doesn’t flow. As a reader, you don’t feel that the narrative carries you along.
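MINSTREL’s recall-and-adapt strategy might be sketched like this, with an invented moral template, a small stock of story fragments, and a crude fallback that “adapts” when no fragment fits a slot (all names and templates here are our own, not Turner’s):

```python
import random

# A hedged sketch of MINSTREL-style slot filling: a moral template with
# typed slots, a stock of fragments, and a fallback when no fragment fits.
template = ["flaw", "desire", "rash_act", "revelation", "remorse"]

fragments = {
    "flaw": ["{hero} was hot tempered."],
    "desire": ["{hero} loved {lady}."],
    "rash_act": ["Because he was hot tempered, {hero} killed {rival}."],
    "revelation": ["{lady} told {hero} that {rival} was her brother."],
}

def adapt(slot, **names):
    """Fallback: fabricate a generic sentence for an unfilled slot."""
    return f"{names['hero']} was filled with {slot.replace('_', ' ')}."

def tell_tale(**names):
    sentences = []
    for slot in template:
        pool = fragments.get(slot)
        if pool:
            sentences.append(random.choice(pool).format(**names))
        else:
            sentences.append(adapt(slot, **names))
    sentences.append("MORAL: Done in haste is done forever.")
    return " ".join(sentences)

print(tell_tale(hero="Lancelot", lady="Andrea", rival="Frederick"))
```

The template guarantees a moral arc, but, as with MINSTREL itself, the joins between recalled fragments and adapted filler are exactly where the prose turns ponderous.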
That’s it? Story generating programs can write simple moral tales.
That’s the end of the first period of computers as storytellers; then it gets more peculiar and complicated. By the early 1990s, three books had been published that were claimed to be written by computers. First off the press, in 1980, was Bagabone, Hem ’I Die Now, a book-length story of infidelity, kidnapping and cultural discovery on a South Pacific island. The book reads like a parody of romance fiction, with a meandering plot and characters drowning in treacle (“Gazing at the canopy of stars beyond the balcony, he retreated into the void of his own loneliness”). It would have gone entirely unnoticed, had the book cover not stated:
And yet – as astounding, as unbelievable as it may seem – the Melpomene, identified as the author of the novel, is a computer. (Yes, a computer and for some details, see “about the author” on the back flap.)
Next came The Policeman’s Beard is Half Constructed, a book of whimsical musings from Racter, a program coded on a hobbyist computer. These brief ramblings led a review in Scientific American to opine that Racter “speaks convincingly of its own mental world” and The New York Times to suggest that computers might take over the task of writing soap operas. In response, some computer experts pointed out that there is a huge gap to be filled between generating catchy sentences and creating soap operas.
Then, in 1993 came the first novel that was a genuine collaboration between human and machine. Just This Once is a 350-page romance in the style of the best-selling author Jacqueline Susann, written by a program named Hal and its programmer Scott French. French had succeeded in capturing the style of Susann, embedding it in code and calling on that to generate a complete novel in the same style.
Does that mean any novelist can capture their style in a computer program, then get it to write new books in the same style?
It took Scott French eight years to design a program that would imitate the style of one author, Jacqueline Susann. For the next quarter century a few researchers in computers and creativity looked for ways to design more powerful programs that would work like human authors, planning, drafting and revising well-formed stories.
Here’s the twist. In 2019, the OpenAI company formed by entrepreneurs including Elon Musk released the complete code for a computer neural network, GPT-2, that can imitate any author’s style without the need for expert programming. Give GPT-2 the first sentence of George Orwell’s novel Nineteen Eighty-Four and the computer carries on in a similar dystopian vein (but set in an American city rather than a futuristic London):
It was a bright cold day in April, and the clocks were striking thirteen. We were sitting at a table in a diner just outside of Providence, Rhode Island. The television was on and we were talking about how the end of the world would come, and our possible places of exile. The burgers were thin, the mustard was dry, and the Brussels sprouts were burnt. But it was clear something was wrong. Shortly thereafter, Sam (my roommate) called in sick to work at a construction site where a road was being constructed and returned to his car, to find it running
GPT-2 has not been told the principles of creative writing, yet it seems to know how to start a story. It establishes the setting (“a diner just outside of Providence”), has a consistent narrative voice (“we were sitting”, “we were talking”), introduces characters (“my roommate”), creates the sense of a world out of kilter (“it was clear something was wrong”), and launches into an account of Sam returning to his car to find it running. All this from just one sentence as a prompt.
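GPT-2 itself is a neural network with over a billion parameters, far beyond a sketch, but the underlying idea of continuing a prompt with statistically likely words can be illustrated with a toy word-bigram model (a deliberately crude stand-in, not how GPT works):

```python
import random
from collections import defaultdict

# A toy stand-in for statistical text continuation: record which word
# follows which in a training text, then extend a prompt one word at a
# time. GPT-2 does something vastly richer, but the prompt-continuation
# loop has the same shape.
def train(text):
    model = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        model[a].append(b)
    return model

def continue_prompt(model, prompt, length=12, seed=0):
    random.seed(seed)  # fixed seed so runs are repeatable
    words = prompt.split()
    for _ in range(length):
        options = model.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

corpus = ("it was a bright cold day in april and the clocks were striking "
          "thirteen and the day was cold and the clocks were silent")
model = train(corpus)
print(continue_prompt(model, "it was a cold"))
```

Scale the training text up from one sentence to most of the internet, and replace word-pair counts with a deep neural network, and you have the family of models that GPT-2 belongs to.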
In 2020, the company revealed an even more powerful version, handily named GPT-3. At this point OpenAI suddenly became less open. It didn’t release the code as it had for the previous version, but licensed exclusive use of it to Microsoft. OpenAI also allowed a few selected companies and academics to access its software. One company advertises a service for writers to “generate content on diverse subjects even if they don’t have prior knowledge”. Another lets copywriters create marketing copy in seconds.
That’s scary! Who needs professional writers when a computer can create a story in seconds?
You should be worried. GPT-3, and programs like it, are starting to take over routine writing tasks like composing blogs and writing news stories. They are good at summarizing magazine articles and academic papers. They can write convincing poetry. The computer games industry is adopting these programs to offer lifelike characters, with personalities and emotions, who can engage in deep conversation with players.
For all its power in imitating a writer’s style, GPT-3 suffers from the same fundamental weaknesses as TAILOR and TALE-SPIN – it has no common-sense knowledge of the world and cannot read what it writes. A human author can plan the outline of a story, consider the traits of characters and how they interact, reflect on how the work has progressed so far, and make wholesale revisions and deletions. A neural network language generator can’t do any of that. Nor can it explain its inner workings in human terms. If, and this is a big if, a program can be designed to generate a cunning plot, compose in a consistent style, check and change the story as it goes, and express this in imaginative language then it would be a threat to human wordsmiths, or a powerful new tool for creative writing.
Many writers will be repelled by the idea of sharing their craft with a computer, just as some hate style checkers and synonym prompts – anything that impedes the flow of words. What’s different is that the new tools will be designed to increase the flow, to help a writer keep going. They will be writers’ assistants that share ideas, show multiple ways to continue, and draw on the entire internet – including every book available online – for inspiration.
As well as offering tools for writers, story machines can probe the mystery of human creativity. For a researcher in artificial intelligence, building a computer model of story writing is a grand challenge. It shows how ideas, language and experience mesh together to create stories that entertain and inspire. When a computer-generated story succeeds in engaging a reader, the program offers a possible mechanism for how we write stories. When the model fails to produce tellable tales, we are left with greater respect for the power of the human imagination and a new search for explanations of human creativity. We make the machines that make the stories that make us.
Sometime soon, computer programs may be able to write stories with rich settings, convincing plots and compelling prose. Computer games companies may merge with media studios to produce automated sitcoms where you can engage in witty banter with the cast, or immersive dramas where you can guide a character through bloody battles and tense negotiations. Artificially intelligent bards may sing ballads of their life in the worldwide web. Technology companies could offer new tools to teach story writing and take over when you get blocked. These won’t come just from building more powerful supercomputers that crunch texts from the internet and imitate a writer’s style. To take the next big step, AI researchers, cognitive scientists and storytellers will need to work together in understanding how the creative mind works and how to tell a good story.
 They have more than 300 dispensers installed around the world that have printed more than 5.6 million stories https://short-edition.com/en/p/short-story-dispenser
 Klein S., Aeschliman J.F., Balsiger D.F., Converse S.L., Court C., Foster M., Lao R., Oakley J.D., and Smith J. (1973). Automatic novel writing: a status report (Tech report 186). University of Wisconsin Computer Science Dept.
 For some computer-generated story extracts we have converted uppercase text to lowercase and added capital letters to names and at the start of sentences, to make the text more readable. Also, where a story is output by the computer one sentence to a line, in some cases we have let the text run on. That apart, the story is shown as output by the computer program. For stories generated by the GPT program, the punctuation and layout are shown exactly as output by the program.
 Wilson, G. (1971). The Science Fiction Horror Movie Pocket Computer. National Lampoon, November 1971. Published here with permission of the current owners of the National Lampoon brand.
 Propp, V. (1968). Morphology of the folktale. University of Texas Press.
 A story outline isn’t quite the same as a plot. A plot is the sequence of significant events in a story, where one event causes or leads to another. A story outline shows the overall structure of a story, but not necessarily how one event leads to another.
 Smith, T. C., and Witten, I. H. (1991). A Planning Mechanism for Generating Story Text. Research Report 91/431/15, Department of Computer Science, University of Calgary. https://prism.ucalgary.ca/bitstream/handle/1880/46185/1991-431-15.pdf
 Meehan, J. R. (1976). The metanovel: Writing stories by computer. Unpublished PhD Thesis, Yale.
 Turner, S. R. (1994). The creative process: A computer model of storytelling and creativity. Hove: Lawrence Erlbaum Associates.
 Casebourne, I. (1996). The Grandmother program: a hybrid system for automated story generation. In Proceedings of the Second International Symposium of Creativity and Cognition (Loughborough, England, 1996), pp. 146-155.
 Melpomene & Uniwersytet Jagielloński (1980). Bagabone, hem ’I die now. Vantage Press.
 Racter (1984). The Policeman’s beard is half constructed. Warner Books.
 Dewdney, A.K. (1985). Artificial Insanity: when a schizophrenic program meets a computerized analyst. Computer Recreations, Scientific American, January 1985.
Lewis, P.H. (1985). Peripherals; A new brand of lunacy for sale, The New York Times, 14 May 1985.
 French, S. (1993). Just this once. Carol Publishing Group.
 Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., & Sutskever, I. (2019). Language models are unsupervised multitask learners. OpenAI blog, 1 (8), p. 9.
 The first person voice in this story opening is distinctly different to Nineteen Eighty-Four, which continues “Winston Smith, his chin nuzzled into his breast in an effort to escape the vile wind, slipped quickly through the glass doors of Victory Mansions,”.
 Hao, K. (2020). OpenAI is giving Microsoft exclusive access to its GPT-3 language model. MIT Technology Review, September 23, 2020.
 Giving characters life with GPT3, https://fable-studio.com/behind-the-scenes/ai-generation Create and distribute the story of your AI Virtual Being, https://fable-studio.com/wizard .