Why would writers use AI?
More like NaNoWriNO, am I right? … NaNOWriMo? Maybe a robot would make a better subtitle.
In September, National Novel Writing Month (NaNoWriMo)—or, rather, the organization behind the month-long writing challenge—published a proclamation[1] that nobody asked them to make. The nonprofit declared that the use of so-called “artificial intelligence” to write a novel was good and fine and dandy. They made this declaration with no asterisks, not so subtly including generative AI in their statement, despite the fact that generative AI models have been built by stealing the work of hundreds of thousands of writers.
In fact, the NaNoWriMo organizers wrote, if you had a problem with somebody using generative AI to create a novel, you were ableist. And classist. And probably racist and homophobic, too.
The statement sparked immediate repercussions for the organization in the form of resignations from its advisory board . . . and it broke containment. The New York Times and The Washington Post covered the story. Popular commentary YouTuber D’Angelo Wallace vlogged about it. Every time I logged onto social media, my feeds drowned in commentary.
But why would you want to use generative AI to complete a challenge in which the purpose is to stretch your creative limits? Wouldn’t allowing somebody else to write the book for you defeat the purpose of NaNoWriMo?
Why would a writer want to use generative AI to write a novel, anyway?
Let’s all get on the same page real quick. “Artificial intelligence” is not, in fact, intelligence. What we refer to as “artificial intelligence” are models, each one “a program that has been trained on a set of data to recognize certain patterns or make certain decisions without further human intervention.” The creator of an artificial intelligence model feeds data into the program, which the program categorizes into numeric patterns. When users engage with the program, it assembles the most likely pattern from the numbers it recognizes in the prompt. The stupidity of artificial intelligence goes beyond made-up information, such as the “hallucination” in which Google’s AI model told users to put glue on pizza to make the cheese stop falling off. Some models can’t even correctly count the number of r’s in the word strawberry.
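To make that pattern-matching concrete, here is a deliberately tiny sketch in Python, my own toy illustration rather than code from any actual AI system: it counts which word tends to follow which in a scrap of text, then always spits back the statistically likeliest continuation. Real models work on a vastly larger scale, but the basic move, matching numbers to the most likely pattern rather than thinking, is the same.

```python
# A toy illustration (not any real model's code): tally which word follows
# which, then always emit the statistically likeliest next word.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Build a bigram table: for each word, count the words that follow it.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def continue_text(word, length=5):
    """Extend `word` by repeatedly picking the most frequent next word."""
    out = [word]
    for _ in range(length):
        candidates = follows[out[-1]]
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(continue_text("the"))  # assembles a likely-looking pattern, nothing more
```

No comprehension happens anywhere in that loop, which is the point the glue-on-pizza hallucination and the strawberry miscount make at industrial scale.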
There’s a lot to loathe about artificial intelligence. It steals writing from books and movies and websites and newspapers and social media posts and YouTube videos. It helps companies spy on you. It ruins search engines. It wastes trillions of dollars. It eliminates jobs in animation, writing, music, gaming, and teaching, and it ruins others. It encourages plagiarism. It perpetuates stereotypes and sexual harassment. It’s bad for the environment. It scams creators and readers to dangerous extremes. It aids censorship efforts. It spreads false scientific information. It lies and lies and lies. As an added bonus, the vibes of its creators are rancid.
And yet, despite all the horror that generative AI brings, there is an obvious reason to use it to write books: money. It’s the digital solution for the uncle who is an inadequate writer and who, instead of improving his own skills, asks his niece to write a mystery novel using his idea because it’ll make them rich and famous. We live in a capitalist society. Everyone wants to make money.
Self-published authors, in particular, seem to lean on generative AI. In Josh Dzieza’s article “The Great Fiction of AI,” published in The Verge, author Jennifer Lepp admits to using the generative AI elements of the program Sudowrite to help finish her novels as quickly as possible. Under pressure from fans to keep churning out more material, Lepp—and other authors cited in the article—felt this was the only way to “maintain relevance.”
She once surveyed her mailing list to ask how long readers would wait between books before abandoning her for another writer. The average was four months. Writer’s block is a luxury she can’t afford. . . . She had already compiled a database of novels to search when she felt she was overusing a phrase and wanted to see how other authors finished the sentence. She told herself she would use Sudowrite the same way—just inspiration, no cutting and pasting its prose. As an independent author, a small increase in production can yield big returns.
Traditional publishing leapt on generative AI when it saw an opportunity to make a quick buck. The year 2023 saw the release of an AI-generated poetry collection called I Am Code and an AI-generated murder mystery called Death of an Author. Both books made big media splashes before disappearing fairly quickly. I’ve not heard a single person mention either one at my bookstore. I have no idea if Hachette Audio recouped what they spent on hiring Werner Herzog to narrate the audio edition of I Am Code.
In fact, relying on generative AI as a writer to make money is a … well, terrible long-term plan. Despite what the enthusiastic users of generative AI in “The Great Fiction of AI” claim, it’s statistically unlikely that generative AI will actually help us with productivity. From a craft perspective, Clarkesworld editor Neil Clarke found it easy to catch AI-generated submissions to the science fiction magazine, because they were “bad in spectacular ways.” The submissions will only get worse. Without more content to consume, the programs will begin cannibalizing the content generated by other programs, and “like vampires drinking their own blood, their prose [will] inevitably become successive degrees removed from actual human experience.”
Money is boring. Yeah, yeah, we all live under capitalism, yada yada. It’s the obvious reason to use generative AI. It may explain the use of generative AI by established writers and publishing executives, and some of the use of generative AI by new writers. But I don’t think it’s the main reason aspiring creatives are drawn to generative AI.
That reason rests in something NaNoWriMo once prided itself on: community.
If you ask a random person on the street, “Who creates a book?” they will tell you that it’s the author. The author writes it, after all. It’s their name on the cover.
But that’s not the whole truth.
A book cannot be created without a writer. Much like reading, however, writing is a communal activity. While the initial act of writing might happen in a bubble—alone at a computer, jotting notes down on a phone, scribbling pages in a notebook on a lunch break—a published book is not made by one person alone. The words may come from one brain, but it takes a village to see them into their final form. There are brainstorm buddies, beta readers, developmental editors, and copyeditors. The idea itself may even come from outside the writer. Referring to artwork commissioned by private patrons or large companies, author Lincoln Michel wrote in his Counter Craft newsletter “Art Without Intention,” “Plenty of art both high and low has been made with prompts. The only thing new is the claim that the prompter—the person who didn’t actually make the thing—is somehow the artist.”
In “The Great Fiction of AI,” author Joanna Penn, who sells classes on how to use generative AI to improve your writing, observed that writers’ hesitation to use generative AI comes from “a misguided sense of purity: that writing should come from your unique, unaided brain.”
Books already do not come from one unique, unaided brain.
Books come from community.
It is impossible to write a book in a bubble. Books are seeds planted in a garden; just because a writer only tends to one seed, or a reader only sees one flower, does not mean that the plant grew in soil somehow unaffected by the plants around it. Writers respond to the world around them, consciously or subconsciously—and as Ursula K. Le Guin points out in Steering the Craft, you cannot learn to be a better writer without reading the works of others in your garden (or your sea, to continue her boat metaphor) and without the guidance of others to improve your weakest craft elements.
From famous writing collectives such as the Socrates School, the Bloomsbury Group, the Dymock Poets, or the Inklings (which is, far and away, the best named of the lot), to library classes for new writers, private Discord groups, and, yes, even NaNoWriMo meet-ups, communities help writers by giving them a place to learn, grow their craft, and share knowledge about the publishing industry. One of the only glorious aspects of the internet is that these communities are now accessible to anybody with a brave heart and a decent Wi-Fi connection.
Claiming that disabled and other marginalized folks are unable to find community without the help of generative AI, as NaNoWriMo implied in their statement, is insulting, inadequate, and unhelpful. As author Sarah Gailey wrote in response to that statement, “The arguments themselves around this so-called classism and ableism wave off the actual existence of writing communities, critique groups, beta readers, and critique partners; they also ignore the creative realities of the impoverished and disabled artists, marginalized authors, and indie authors who have been working all this time without the help of language learning model software that was trained on work stolen from their peers and colleagues.”
Sharing your work with a community of real, live people can be . . . well, scary. My friends will tell you how loath I am to share any of my creative projects before I am certain they are ready to be seen. I would rather delete thousands of words than risk sharing something less than perfect. Leaning on generative AI strips some of the fear of sharing your imperfect work with other real, live people. It keeps the emotions of both the written work and its creation at arm’s length, leaving them inauthentic to both writer and reader. As Simon Peng writes in his Everybody’s Creative newsletter, “These generative programs give people an ‘out’ from working through the most emotionally difficult part of art making. If someone asks you why you drew that picture or wrote that song or built that chair, you can avoid all the messiness of trying to explain how, deep down, you are a person and you feel things and you just wanted to say something good.”
The messiness is a necessary part of the process—even though, gosh, I sometimes wish it weren’t. But the effort is what makes the work worthwhile. As Nick Cave writes, “Even though the creative act requires considerable effort, in the end you will be contributing to the vast network of love that supports human existence.”
The desire for community—to find themselves in that vast network of love—is what secretly appeals to the hearts of writers using generative AI. In “The Great Fiction of AI,” Dzieza writes, “Something about the experience of using AI feels different. It’s apparent in the way writers talk about it, which is often in the language of collaboration and partnership. . . . ‘Using the tool is like having a writing partner,’ [author Orna] Ross said. ‘A crazy one, completely off the wall, crazy partner who throws out all sorts of suggestions, who never gets tired, who’s always there. And certainly in the relationship that I have, I’m in charge.’”
That relationship—and being in charge—may make users of generative AI feel important. Writing communities, like any communities, can sometimes make even the most successful authors feel small if they’re not vibing with the group in the way they want. That may be why so many writers and generative AI users cling to social clout.
Social clout, especially in the digital space, is a huge element of reading and writing. It is one of the explanations for why so many people can’t just write: they have to perform the role of writer, even when the job of both writing and performing seems to make them miserable. (We are, of course, setting aside the complicated performance required by modern publishing to shill your book on social media at every available opportunity.) To end the digital performance would mean losing a position of privilege, which could make the person doing the digital dance feel less important. But clout that comes from generative AI is a house built on quicksand. It collapses the moment readers and fellow writers realize that you do not, in fact, care about the work you’re putting in the world—you simply want the credit for having created something.
In his four-hour video essay on plagiarism—a video I have watched more times than I am willing to admit—YouTuber Hbomberguy ends by exploring part of the reason he thinks people are willing to plagiarize:
We don't exactly live in a world built for humans, do we? There's no guidebook for happiness or success or a sense of place in the world, and the people claiming to have one for you are really just trying to sell you something. We spend most of our little lives struggling to make these feelings fade away or find something to placate them. It's either ennui or being on weed. I know it's a little pretentious, but we're all searching for a sense of meaning and purpose in our lives, and those things are hard to come by. There's a little bit of nothing in all of us, and we'd like to fill it with something. Opening your web browser and seeing someone who seems to have it figured out, making you feel better and entertaining you and seeming to attract an audience on this roulette wheel of a planet, that's powerful. There's someone who seems to understand what they're supposed to be doing, and it's working. That's all anyone really wants. . . . It's very difficult not to want that completeness for yourself—not to just be like someone, but to be them, to attain that sense of knowing. . . . I do worry there are people out there who will never get the chance to become who they are because they're too busy trying to be like someone else who, at best, has it figured out for themselves just a little bit.
Robots cannot offer community. Generative AI cannot fill the hole in our souls where the desire to create—and the desire to be a part of a community that creates—eats away at us. Generative AI does not think. It does not feel. It does not know what you want. It does not even know what you are inputting into it. It knows words turned into data turned into numeric code, in response to which it spits back the best matching set of numeric code turned into data turned into words. It has no original ideas.
Generative AI has no heart. Generative AI has no soul. Generative AI cannot be your friend.
Generative AI cannot make you a better writer.
In the year 3024, WALL-E is real and he’s your best friend. He has great ideas about how to write a story! He’s watched a lot of musicals and has very strong feelings about plants and recycling and spaceships. He may be a little robot, but he’s got heart.
Until that day arrives, nobody wants to read a book written by or with the help of generative AI.
Lepp excuses her use of generative AI in “The Great Fiction of AI” because she otherwise runs the risk of her readers “abandoning her for another writer.” Not every reader can afford to buy every book—what a beautiful world that would be!—but the notion that readers would simply forget about Lepp in favor of whoever else happens to be churning out something similar means that Lepp is both embracing the idea of her own inadequacy—an idea that's at odds with the creation of good books—and dismissing her own community of fans. Yes, her books may be self-proclaimed “potato chip books,” but we all have our favorite potato chips—and we would prefer those chips to be made with real potatoes, not fried synthetic mush.
Ultimately, leaning on generative AI weakens Lepp’s own writing: she could be in conversation with the novelists and stories she views as competition, rather than letting generative AI regurgitate similar elements of description into each book. Generative AI is good only at creating “derivative text”—if we can even call what it generates good. As programmers struggle to find more data to feed to the programs, generative AI struggles to create anything that is interesting for more than a few paragraphs. It lacks the intelligence to create novel-length plots. To create character arcs. To create moments that make readers really, truly feel.
In “Art Without Intention,” Michel writes:
When AI generates a metaphor for how it feels to jump in a pool on the first day of summer, it isn’t drawing on experience or trying to communicate any specific emotion. And it is not deciding that this specific metaphor will emphasize the larger themes of the novel or deepen our understanding of the specific character. It produces automatically, without intention. And here, I think, is why so few people have been interested in buying AI-generated work despite all the hype. What is art without intention? Art without a vision behind it? It’s art that has nothing to say being sold by someone who doesn’t care. . . . Anyone can come up with ideas. And almost anyone can vomit up a rough draft. The hardest and most time-consuming part of writing is shaping, honing, and refining the material into a coherent work. This requires making choices both small and large at every point. You are shaping the text to your vision, and realizing your vision through shaping the text. It’s not a process you can outsource if you care about the work. And if you don’t care about your work, why would I care to read it?
Generative AI could have tried to write that, but it would first have to understand what Ted Chiang wrote in The New Yorker. Michel’s newsletter was written in response to “Why A.I. Isn’t Going to Make Art,” in which Chiang observes:
Some individuals have defended large language models by saying that most of what human beings say or write isn’t particularly original. That is true, but it’s also irrelevant. When someone says “I’m sorry” to you, it doesn’t matter that other people have said sorry in the past; it doesn’t matter that “I’m sorry” is a string of text that is statistically unremarkable. If someone is being sincere, their apology is valuable and meaningful, even though apologies have previously been uttered. Likewise, when you tell someone that you’re happy to see them, you are saying something meaningful, even if it lacks novelty.
Something similar holds true for art. Whether you are creating a novel or a painting or a film, you are engaged in an act of communication between you and your audience. What you create doesn’t have to be utterly unlike every prior piece of art in human history to be valuable; the fact that you’re the one who is saying it, the fact that it derives from your unique life experience and arrives at a particular moment in the life of whoever is seeing your work, is what makes it new. We are all products of what has come before us, but it’s by living our lives in interaction with others that we bring meaning into the world. That is something that an auto-complete algorithm can never do, and don’t let anyone tell you otherwise.
That is community. That is writers in conversation with each other. That is writers learning from each other and improving their craft through intracommunity conversations—the same values that NaNoWriMo claims to promote.
Generative AI could not come up with the writing style and plot twists of We Deserve Monuments, one of my all-time favorite YA novels—a novel that author Jas Hammonds edited while attending Lambda Literary’s Writers Retreat for Emerging LGBTQ Voices, which “turned out to be a major turning point in [their] writing journey.” Generative AI could not create the highs and lows of romance novels like The Seven Year Slip—a novel that author Ashley Poston rewrote numerous times with the help of her editor and her writer friends, homing in on its emotional arcs and wringing out her soul until she got the story just right. Generative AI cannot create stories, because generative AI is not a person. Generative AI is not a community.
NaNoWriMo once understood the importance of community. It was the basis of the entire organization. The NaNoWriMo home page enthusiastically describes how “writing a novel alone can be difficult, even for seasoned writers” and that the organization helps creators “connect with other writers in a vast community.” But NaNoWriMo demolished its once-beloved forums, citing problems with its moderation staff and offering no alternative, online or in person: there is nowhere on the website for writers to find physical, rather than digital, NaNoWriMo writing groups. (Rumor has it that the national NaNoWriMo organization is at war with its community branches.)
Great news, though: NaNoWriMo’s new generative AI sponsor can help you out in their stead.
Me? I’ll find my community elsewhere.
[1] NaNoWriMo deleted their original statement; otherwise, I would link it here.