A race to the bottom?

The rise of AI-generated literature

BEFORE HER RECENT interview with British novelist Julian Barnes, journalist Katie Razzall prompted an AI chatbot to pen an opening line in the author’s style, which read as follows:

He had always believed that memory behaved like a courteous guest – arriving when invited, leaving when ignored – but lately it had begun to loiter, hands in pockets, humming tunelessly in the corners of his mind.

Although the line contains the pithy pretence of nonchalance one might expect from Barnes, there remains something inanimate about it. Barnes himself was unimpressed. He deemed the sentence ‘crass’ – nothing more than ‘a pastiche’. Yet whether Barnes believes he was successfully imitated is incidental. The real question is whether a reader would recognise the difference – and if they did, would they care?

As outlandish finances engorge the balance sheets of AI firms and inflate promises for the future, a chatbot’s perceived intelligence is often conflated with its ability to create. Last year, OpenAI lit up imaginations with a metaphysical short story penned by one of its unnamed large language models (LLMs). This signalled the tech industry’s mandate for a chatbot that could compete with human creativity. After all, if a chatbot’s fiction is indistinguishable from a human’s, it becomes more difficult to convincingly reject differences between machine and human intelligence.

The obvious outcome of this development is a cheap alternative to writers: colossal economies of time and a product that can be generated almost as fast – if not faster – than it can be consumed. Amazon leads marketplaces already saturated with machine-farted missives, but these are widely, to borrow Barnes’ word, crass attempts at story. Generative AI, however, is an aggregating tool that only promises to get sharper. As tech bros assure us that nothing is safe, it is highly likely that such promises will affect genre fiction – with its penchant for trope and formula – most swiftly and most severely.


EYEBROWS WERE RAISED when former NRL player, Bachelor star and overnight BookTok sensation Luke Bateman recently signed a two-book deal with Atria Books (an imprint of Simon & Schuster). Bateman reportedly produced only around 10,000 words in total – writing sample, synopsis and chapter outline included – and had no publications to his name. While much has been written on Bateman and the ethics of the deal here and elsewhere, little attention has been paid to the fact that the contract signals the willingness of major publishing houses to prioritise internet fame over literary acumen. Publishers want a public figure whose celebrity will sell books; they aren’t overly perturbed about whether that public figure can write them. The Bateman saga should be viewed as a crucial weathervane indicating the direction of the Australian publishing industry, particularly in the midst of aggressive AI advances. Perhaps future bestsellers will be plotted by an algorithm and penned by a tireless aggregator of language before being sold under an influencer avatar. This could be mistaken for the prediction of someone wearing a tinfoil visor if it weren’t for the fact that it’s already happening.

Vauhini Vara’s Bloomberg essay ‘The AI romance factory’ details more or less exactly such a circumstance. Manjari Sharma released her novel Keily through the subscription-based reading platform Galatea. The book was a commercial success, and Galatea was keen to turn Keily into a series. Thanks to the fine print in Sharma’s contract, Galatea could produce the subsequent novels with little need for her consent. The books were contracted to ghostwriters, who returned completed manuscripts within weeks and who were – alarmingly but unsurprisingly – entitled to use AI. Fans of the first novel complained that the subsequent Keily instalments declined sharply in quality. They were, however, delivered in rapid production time.


THE AUSTRALIAN SOCIETY of Authors cautions that ‘inferior AI-generated content’ – the narrative and written quality of which will be paper-thin, homogenous and bland – will ‘flood the market’. This poses a risk to emerging writers, who will represent a more expensive option to the publisher than LLMs and will therefore have fewer opportunities. With a decline in the development of quality storytellers who challenge limp, computer-generated literature, the products decanted from the AI hive mind may soon come to dictate consumers’ tastes.

One may be tempted to allay these concerns by pointing to the generosity of readers and the literary sector’s dedication to community, but the longevity of these factors is difficult to quantify. In a recent New Yorker article, Vara describes how ‘excellent readers’ mistook AI-generated work trained on her writing for her own. The experiment demonstrated how precise the machine can be and that some readers may, despite all our bemoaning, enjoy AI-generated fiction. Perhaps this suggests that the barrier to the widespread adoption of creative AI-generated work is one built upon stigma rather than reasoned criticism or the celebration of human achievement.

It seems increasingly inevitable to me that two markets will soon emerge from the literary sector: one market for cheap, AI-generated content and another for the current, traditional model of publishing. I believe that publishers must value both to be sustainable. In spite of the evolving marketplace, publishers still have an obligation to guide emerging authors, editors and publishers through the peaks and pitfalls of a career in a notoriously complex and veiled industry.

While in London late last year, I was introduced to a debut writer who had just accepted a two-book deal with a major publisher. Over dinner, she mentioned some of the details of her recently negotiated contract, including the stipulation that any royalties earnt from her first novel would be paid out only upon submission of her second. This was surprising, as the advance certainly wasn’t enough to sustain her while she wrote that second novel. Fortunately, her agent scoffed at the proposal and stated that such a stipulation should not be suggested again in negotiations. Agents are common in the UK but less so here, which leaves optimistic emerging writers vulnerable to falling into similar traps. Australian writer Ashley Goldberg recently stated that he believes Australian authors struggle to assert themselves at the negotiation table for fear of being passed over by their prospective publishers. Given the mystery enshrouding the publishing industry and the desire to be published, it’s likely that emerging Australian writers would accept undesirable conditions (such as sacrificing royalties). Without that income stream, it’s much more difficult to write the second novel. Failure to submit could tank a career; this is at odds with a publisher’s duty of care to their writer.


A PUBLISHER’S DUTY of care extends to the industry’s bottom line, too: readers. Poor-quality literature leads to fewer readers, fewer sales and, ultimately, poorer LLMs that will eventually have no more original work to study. This will become increasingly common if fewer humans are writing. LLMs famously regress when trained on their own outputs, which is why many tech companies have opted for an ‘ask forgiveness, not permission’ approach to stealing writers’ work.

A copyright exemption proposed in last year’s Productivity Commission report would, if granted, have permitted LLMs to mine work without liability for copyright infringement. Understandably, writers were – and continue to be – critical. Over 50,000 creatives signed an open letter last year against the unlicensed use of their work to train AI. Author Jennifer Mills called the theft ‘heartbreaking’ and urged the government to intervene. Jeanette Winterson astutely stated: ‘tech bros need to pay for what they want. They pay lawyers and lobbyists. Pay artists.’ Ultimately, the government rejected the proposed exemption.

While writers were lobbying to retain the rights to their work, Writing Australia – the new body for Australian literature – demanded that government actions be responsive to the ‘fast-moving and rapidly evolving landscape’ of AI. Its focus on the principles of transparency, ethical engagement and inclusion seems to indicate an acceptance of AI’s inevitable widespread adoption into the industry, but also of the need to safeguard artists while approaching this new normal.

Atria’s move to reward Bateman’s popularity over the skill of more established writers is cause for worry: AI-generated work may be rapidly developing into a product both cheap and of sufficient quality to satisfy a demographic of readers that traditional publishing risks losing. To cope, publishers must reconcile the immediate losses to writers against the long-term harm to the industry. What’s obvious to me is that – if we are to sustain the industry and protect the interests of all parties – collaboration between Big Tech, government and Australian publishers is necessary. This problem is not going away, so we’re all going to have to find a way to deal with it.
