A.I. Has Taken Self-Publishing By Storm. Writers and Publishers Weigh In On How to Cope

Observer interviewed authors and publishers about their frustrations with, and hopes for, A.I.

People walk outside the Barnes & Noble bookstore on Fifth Avenue in New York City. Noam Galai/Getty Images

Artificial intelligence (A.I.) has struggled to stay in the good graces of writers and readers this past year. Reports of companies and individuals using A.I. to spread misinformation, infringe copyright and steal authors’ identities have dominated discussions of the technology’s role in the books we consume.

OpenAI, the company behind ChatGPT, has been sued multiple times this year for copyright infringement by famous authors such as Mona Awad, Paul Tremblay and Sarah Silverman. Most recently, the Authors Guild, a professional organization representing many prominent authors including George R.R. Martin, Jodi Picoult and John Grisham, accused OpenAI of violating copyright law by using authors’ works to train A.I. models. These lawsuits could have a major impact on the use of A.I. in publishing.

Already, a swarm of A.I.-generated books has entered the market through self-publishing platforms. Self-published authors produce an enormous share of the books available for sale: upwards of 34 percent of all e-books are self-published, according to WordsRated, a research firm specializing in the publishing industry. Since A.I. applications powered by large language models (LLMs) became widely available, self-publishing platforms have seen an influx of content. And publishers face mounting pressure to crack down on A.I.-generated material as controversies continue to sour the technology’s reputation.

However, as A.I. applications become more sophisticated and users learn how to fine-tune their prompts to generate human-like writing, detecting and regulating such content is increasingly challenging even for companies with significant resources at their disposal.

In September, Amazon (AMZN)’s self-publishing platform, Kindle Direct Publishing (KDP), issued a policy that limits authors to self-publishing three books a day and requires them to disclose whether any part of their manuscript was generated by A.I. It’s unclear, though, how Amazon enforces the new policy. “While we allow A.I.-generated content, we don’t allow A.I.-generated content that violates our content guidelines, including content that creates a disappointing customer experience,” Ashley Vanicek, a spokesperson for Amazon, told Observer in a statement. “We invest significant time and resources to ensure our KDP content guidelines are followed.” Amazon would not comment on how those guidelines are enforced.

Amazon is the largest book distributor in the U.S., controlling at least 40 percent of all domestic book sales, and roughly 1.4 million self-published books are sold through KDP every year, according to WordsRated data. Amazon’s policy change marks one of the first industry attempts to establish guidelines for how A.I. is used in self-publishing. But many industry veterans think the situation is already beyond the control of any one corporation.

“There’re no good tools out there.”

Self-publishing companies provide tools and services including production, distribution and marketing. Many major booksellers, such as Barnes & Noble and Apple Books, run their own self-publishing platforms. The process of getting a book published varies by company. Generally, an author uploads their manuscript to the platform, where it is formatted and distributed for sale through the company’s website and book retailers.

“Authors can expect their work to go live within days to hours after submitting,” Kris Austin, CEO of Draft2Digital, a self-publishing company and distributor, told Observer. Draft2Digital distributes self-published books in both print and digital formats to bookstores across the country in exchange for a fee on every copy sold.

Austin said Draft2Digital has also seen an influx of submissions over the last six months, but the company has yet to introduce any A.I.-specific policies. 

Self-publishing companies implement a variety of internal systems that flag plagiarism, poor grammar and signs of overly generic or nonsensical writing. Draft2Digital requires manuscripts to be original, meet a minimum page count and fit the genre under which they’re submitted. Submissions flagged by its internal system are reviewed by a staff member, and a deeper investigation may then reveal that A.I. was used to generate the work.
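
Draft2Digital did not describe its system’s internals. As a rough illustration only, an automated first pass might resemble the hypothetical checks sketched below, with anything flagged routed to a human reviewer; the thresholds and helper names here are invented, not Draft2Digital’s actual rules.

```python
# Hypothetical sketch of an automated first-pass screen. Draft2Digital
# has not disclosed its actual system; these thresholds are invented.
from difflib import SequenceMatcher

MIN_WORDS = 2500  # invented minimum-length threshold


def similarity(a: str, b: str) -> float:
    """Crude textual-overlap score between two texts (0.0 to 1.0)."""
    return SequenceMatcher(None, a, b).ratio()


def flag_manuscript(text: str, known_texts: list[str]) -> list[str]:
    """Return a list of reasons a manuscript should go to human review."""
    flags = []
    if len(text.split()) < MIN_WORDS:
        flags.append("below minimum length")
    if any(similarity(text, known) > 0.6 for known in known_texts):
        flags.append("high overlap with an existing title")
    return flags


print(flag_manuscript("A very short manuscript...", known_texts=[])
      or "passed automated checks")
```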

As on Amazon’s platform, Draft2Digital authors are asked to disclose whether A.I. was used in their work. If plagiarism or harmful information (such as inaccurate medical information in nonfiction books) is discovered, the books are taken down from the service. But that system only works for identifying obviously low-effort or nefarious works. For identifying A.I.-generated books that are of high quality, “it’s pretty much impossible,” Austin said. “There’re no good tools out there.”

Just as A.I. is creating new challenges for many industries, it’s also spawning a new industry dedicated to detecting where it’s used. A crop of companies has emerged offering software designed to catch the use of A.I. in fiction writing; some of the most popular tools include Optic, Copyleaks and GPTZero. However, detection software will always be racing to keep up with text generators: as long as generative A.I. keeps improving, detectors will never be foolproof, and examples of inaccuracies have already been reported.
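
Most commercial detectors don’t publish their methods, but a common baseline in the research literature scores text by its perplexity under a reference language model: prose the model finds highly predictable is treated as a weak hint of machine generation. The sketch below illustrates that general idea with the open source GPT-2 model; it is not how Optic, Copyleaks or GPTZero actually work, and the flagging threshold is invented.

```python
# Illustrative perplexity check; assumes `pip install torch transformers`.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()


def perplexity(text: str) -> float:
    """Average per-token perplexity of `text` under GPT-2."""
    enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
    with torch.no_grad():
        out = model(enc.input_ids, labels=enc.input_ids)
    return torch.exp(out.loss).item()


sample = "The sun rose over the quiet harbor, and the boats swayed gently."
ppl = perplexity(sample)
# Low perplexity means highly predictable text -- a weak signal, not proof.
# Single-score heuristics like this misfire constantly, consistent with the
# false-positive rates described below.
print(f"perplexity {ppl:.1f}:", "flag for review" if ppl < 20 else "likely human")
```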

Barnes & Noble, which runs its own self-publishing platform, Barnes & Noble Press, said it doesn’t sell any A.I.-generated books in its stores. “If we ever do so, we will be sure to inform customers [of] the way in which the book has been ‘written,’” a Barnes & Noble spokesperson told Observer. “Online, almost all books that are published are available. We will always tell customers, to the best of our abilities, if books are A.I. generated.” Barnes & Noble Press did not say whether it uses A.I. detection tools.

Nick Thacker, vice president of Draft2Digital’s Author Success division, said many authors are already incorporating A.I. into their work. Because every author uses the technology differently, he is doubtful that any tool can catch A.I.-generated text across an entire book catalogue.

“I don’t know that there’s ever going to be a tool that’s 100 percent at detecting this stuff,” Thacker said. “And we’re seeing so many false positives with any tool right now, upwards of 50 percent to 60 percent.” When asked for examples of such tools, Thacker said he’s most familiar with Contentdetector.ai. When testing Contentdetector.ai, Observer found articles entirely written by humans were rated as “likely human” but with a 25 to 30 percent likelihood of being generated by A.I.

A.I. provides a “huge boost” to productivity

Thacker is an author himself, with more than 40 action-thriller titles to his name that have landed him on USA Today’s bestseller list. Part of Thacker’s skepticism about A.I. detection stems from the fact that he uses the technology in his own work and finds it almost impossible to notice in the end product. Thacker said he frequently uses A.I. transcription software to dictate his spoken words into text for an initial draft and then uses ChatGPT to clean it up. With this method, alongside his large network of editors and beta readers, he said he can get exactly what he’s looking to write, and A.I. has given him a huge boost in productivity.
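
Thacker did not detail his exact setup, but a cleanup pass of the kind he describes can be approximated in a few lines against OpenAI’s chat API. The model choice and instructions below are placeholders, a sketch of the general workflow rather than his actual pipeline.

```python
# Hypothetical dictation-cleanup pass, not Thacker's actual pipeline.
# Assumes `pip install openai` and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()


def clean_up_dictation(raw_transcript: str) -> str:
    """Ask a chat model to copyedit a dictated draft without rewriting it."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model choice
        messages=[
            {
                "role": "system",
                "content": "Lightly copyedit this dictated fiction draft: fix "
                           "punctuation, drop filler words, keep the author's voice.",
            },
            {"role": "user", "content": raw_transcript},
        ],
    )
    return response.choices[0].message.content


print(clean_up_dictation("so she walks into the room right and um the lights are off"))
```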

Thacker said that while there is a lot of fear among writers about losing work, there is just as much optimism among both new and veteran writers about how the technology can speed up their workflows. He and his colleagues already use a variety of A.I. tools to produce book cover art, brainstorm ideas and dictate text. The average reader, he said, will almost never notice if any part of the book they’re reading was generated by A.I. “It’s just not present as much for the consumer,” Thacker said.

The mounting legal cases writers have brought against OpenAI have made the future use of A.I. uncertain, and publishers are waiting to see what policy changes the outcomes of these lawsuits will require. As a writer, Thacker said he’ll follow whatever the courts decide, but his greatest hope is that authors and publishers will receive clear guidance on how A.I. technology can and cannot be used in their work.

Nick Thacker has authored more than 40 action-thrillers. Courtesy of Draft2Digital

While there’s a growing sense of urgency among publishers to regulate the use of A.I., a new industry is forming to help A.I.-generated works reach the market. Companies like Sudowrite, which offers A.I. generation tools tailored specifically to fiction writing, have emerged, promising authors the ability to speed up their workflows and create books that consumers want to buy.

Elizabeth Ann West is the CEO of Future Fiction Academy, an online school that runs labs and workshops teaching fiction writers how to incorporate A.I. into their writing. Future Fiction Academy currently instructs over 200 students, the majority of whom have already published.

West, who has published more than 30 novels since 2011, first experimented with A.I. in 2014, using Dragon NaturallySpeaking, a transcription software, to turn her spoken words into text. Now, using ChatGPT, “I can get 5,000 to 10,000 words in five minutes with sequencing prompts,” West said. Sequencing prompts are prompts fed to ChatGPT that generate multiple options at a time.
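
West did not share her prompts, and “sequencing prompts” is her own term. One plausible mechanism for getting several options per request is the chat API’s `n` parameter, shown in this hypothetical sketch; the prompt and settings are invented for illustration.

```python
# Hypothetical sketch: several continuations from a single request.
# The `n` parameter is one plausible mechanism, not West's actual method.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[{
        "role": "user",
        "content": "Continue this Regency-era scene in about 300 words: "
                   "Elizabeth set down the letter and turned to the window.",
    }],
    n=3,              # ask for three alternative continuations at once
    temperature=0.9,  # higher temperature yields more varied options
)

for i, choice in enumerate(response.choices, start=1):
    print(f"--- Option {i} ---\n{choice.message.content}\n")
```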

For her own work, West uses a mix of A.I.-powered dictation software and ChatGPT to generate entire scenes of her stories, which she later edits. In June of 2022, she published a novel through KDP, For the Love of a Bennett, that contained a significant amount of content generated through Sudowrite. “I would say it was definitely more than fifty-fifty,” West said.

A.I. chatbots are not capable of generating entire books at once but can produce bursts of a few hundred words with every user input. The longer a story goes on, the more likely a language model is to lose track of the plot and slip into inconsistencies or generic prose. At Future Fiction Academy, West works with authors to show them what happens on the back end of these LLMs and how to set ChatGPT’s parameters to produce not just eloquent prose but professional book descriptions and pitches for publishers.
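
One common workaround for that drift is to carry a rolling plot summary forward with each request so the model keeps the story in view. The sketch below is a generic illustration of that pattern under invented names and settings, not Future Fiction Academy’s curriculum.

```python
# Generic rolling-summary pattern for long-form generation; the model,
# prompts and settings are invented, not Future Fiction Academy's method.
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # placeholder model choice


def ask(prompt: str, temperature: float = 0.8) -> str:
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
        temperature=temperature,
        max_tokens=600,  # one "burst" of a few hundred words per call
    )
    return resp.choices[0].message.content


summary = "Elizabeth has just learned that a family debt was concealed."
for beat in ["a tense dinner", "a confrontation in the study"]:
    scene = ask(f"Story so far: {summary}\n\nWrite the next scene: {beat}.")
    print(scene)
    # Re-summarize so later calls don't lose track of the plot.
    summary = ask(f"Summarize in three sentences:\n{summary}\n{scene}",
                  temperature=0.2)
```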

For writers looking to use A.I., West thinks Amazon’s new self-publishing limit is a good start, albeit one that comes a bit late. “I think that Amazon’s making really good policies,” West said. “I just wish that they had that limitation earlier because that would have helped prevent scammers early in the Kindle Unlimited program.”

West said Future Fiction Academy’s students include both new and experienced writers. Some are excited to experiment with a new tool, but most are watching how quickly A.I. is developing and worrying about what their future will look like if major publishing companies start investing in the technology.

“Most of the people who are coming to me are people who have known me and worked in the industry for years,” West said. “They’re coming because they’re like me and they can see the writing on the wall. If we don’t figure out a way to harness this [technology], this is going to put us out of business.”
