Why AI won't be writing feature stories soon

Or why I gave up using AI for that.

Photo Credit: Unsplash/Dominik Scythe

Can we use AI to write entire stories? Just this week, I heard from two publications whose leadership wondered whether they could use generative AI to write editorial articles, whether to produce more stories or perhaps even to replace some writers.

As a long-time contributor who has written for some two dozen publications over the last decade and a half, I have examined this topic with interest, and no small amount of trepidation, since ChatGPT became publicly accessible.

Over the last 20 months, I've applied AI across the wide range of content I work on, from editorial articles, company blogs, and thought leadership pieces to marketing collateral. And my conclusion is this: AI won't work for editorial content. Here's why.

Photo Credit: Unsplash/Rick Barrett

The monotony of AI

Trained on the sum total of humanity's digitalised knowledge base, AI is an exceptional averaging machine. Throw in a disparate collection of unrelated sentences, and it churns out a homogenised paragraph that, at first blush anyway, reads "okay".

Unfortunately, the very strength of generative AI is also its Achilles heel. Left to its own devices, it invariably produces the same bland content. It's like a novice cook who has just discovered the magic of MSG as a flavour enhancer – and now insists on using it liberally on every single dish. The food might look different, but everything tastes the same.

Don't take my word for it. Run this experiment yourself: have AI generate 10 different articles on a specific topic. You will end up with a set of pieces that, while coherent, read just like one another and like every other AI-generated article out there. In other words, unremarkable, completely forgettable stuff.
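If you would rather script the test than paste prompts by hand, here is a minimal sketch of what that experiment might look like. It assumes the OpenAI Python SDK with an OPENAI_API_KEY set in the environment and GPT-4o as the model; the topic and word count are placeholders, not a recommendation.

```python
# Minimal sketch of the experiment: ask the same model for ten articles
# on the same topic, then read the results side by side.
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY
# in the environment; topic and length below are placeholders.
from openai import OpenAI

client = OpenAI()
topic = "the future of remote work"  # placeholder topic

articles = []
for i in range(10):
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": f"Write a 600-word feature article on {topic}.",
        }],
    )
    articles.append(response.choices[0].message.content)

# Print the ten drafts back to back for comparison.
for i, article in enumerate(articles, start=1):
    print(f"--- Article {i} ---\n{article}\n")
```

Reading the ten drafts back to back makes the point faster than any argument.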

Sure, you can guide AI with more prompts, point it to past work, and enrich it with additional quotes. But then the question arises: did you hire prompt engineers or writers? And as I noted yesterday, I’ve found that getting AI-generated copy up to scratch often takes more effort than writing it from scratch.

Photo Credit: Unsplash/Web Donut

The limits of personal competency

One technique that's frequently mentioned is to generate an initial draft using AI, and then have humans spruce it up before publication. It sounds great in theory, and I would be the first to agree that skilful editing can make an already good piece of writing exceptional.

The problem is that this strategy works only if the individual assigned to edit the AI-generated piece is skilled in both editing and writing. Put it this way: only someone who has climbed the mountain can guide others to the summit.

Having done a fair amount of editing in recent years, I'll add that editing is a different skill set from writing. That's why large news organisations have copyeditors who specialise in correcting, refining, and polishing submissions to make them shine.

But assuming a publication has that individual who can do it all, why assign them to edit an endless stream of insipid AI-generated content to make it palatable, instead of writing the next hit story? Even if the former makes sense, I can guarantee that this individual will stay only as long as it takes to secure a new job with a more enlightened employer.

Photo Credit: Unsplash/Dawid Zawiła

That missing spark

I am not dismissing the use of AI completely. My focus here is solely on editorial excellence, not on how good AI is at churning out new permutations of a headline, generating ideas, or crafting personalised email messages. AI is exceptional in those areas.

For instance, I've read success stories of AI being used to quickly generate scores of articles for SEO. Others have used AI to tirelessly write and send out individually tailored email messages touting their products. If that's your cup of tea, AI away.

But if your audience consists of smart, savvy readers who are specialists in their respective domains, you can be sure they will pick up on the use of AI in short order. And if you are hoping to build trust, the wholesale use of AI to write articles will only destroy it.

As I wrote last week, our mistake is that we gauge generative AI by human standards, which leads us to overestimate what it can do. Just because it is good at certain use cases does not mean it excels in all areas. Indeed, AI-generated content often lacks the nuanced understanding and connection that human writers bring to the table.

Photo Credit: Unsplash/frank mckenna

Where do we go from here?

Based on what I see of today's best AI models, such as GPT-4o and Claude 3.5, I am confident that exceptional editorial content can only come from human writers, with their creativity and emotional connection. And in a world where everyone can access generative AI, the onus is on writers to get even better at their craft, infusing their work with a unique voice and nuance that AI cannot replicate.

This is not to say AI cannot be used to generate long-form content. I've seen at least one author harness AI to dramatically speed up the writing of a book (AI-First Nation) without the verbose and inane output that AI tends to produce. Then again, Laurence Liew is a maverick whose understanding of AI goes far beyond that of the average writer.

Of course, AI is getting better by the day, with the next generation of models currently being trained on 100,000-strong GPU clusters. Who knows what the future will bring?

Enjoyed reading this? Sign up here to get a digest of my stories in your inbox every week.