AI could help publishers prove the value of their journalism: Here’s how

The issue with AI as a tool for creation is that it raises the question of how much human interaction is required for something to be considered ‘human-made’. Publishers should be looking for ways to demonstrate the provenance of their journalism, from inception through creation to publishing.

There’s lots of talk – again – about whether AI will replace journalists. The answer is no, at least not until it can walk around, conduct interviews, and start drinking too early. After all, journalists have been using AI to help put their stories together for years at this point. It’s just a tool, and handymen weren’t replaced by their hammers.

But what it does do is allow for the creation of content at scale. While the outlets that have disclosed they’re using generative AI are all at least paying lip service to using it responsibly, the technology has the potential to make the pink-slime problem (low-quality, mass-produced content dressed up as news) much more prevalent.

If you thought competition for ad spend was already fierce, it just might be about to become even more so. It doesn’t take much to pump out endless iterations on the same story already; just imagine how much easier generative AI will make it. But that’s content, not journalism.

So while AI won’t directly replace journalists, it could make the economics of digital publishing even worse for the publishers that employ them. But it doesn’t necessarily need to be that way. In fact, the ease with which content can be generated gives publishers an impetus to change how their journalism is presented, in a way that could help their bottom lines. It could push newspapers to put provenance at the heart of digital publishing.

Time and transparency

I’ve sat in on a few sessions and roundtables over the past few weeks where creatives and artists have argued that ‘human-made’ will be the key differentiator in the age of AI. That having a human thumbprint on the work will increase its value, because it is truly original and unique.

The issue with AI as a tool for creation is that it raises the question of how much human interaction is required for something to be considered ‘human-made’.

Unlike, say, filters in Photoshop or spellcheck in Docs, AI can reduce human interaction to a simple, single prompt, at which point the AI takes over. Is that ‘human-made’? What if an artist draws a very rough sketch and then prompts Stable Diffusion to turn it into a masterpiece in the style of Rembrandt? The user did some composition, after all, so is that now ‘human-made’? What if they then refine it, asking the AI to use a specific colour palette instead and to remove parts of the image? Is that, after all that human interaction, now ‘human-made’?

The point is that it will be very hard to prove that any art is ‘human-made’, because the point at which it becomes ‘made by AI’ is a moving target. The same is true for journalism: if a journalist writes a 400-word piece and asks Bard to expand it to 1,200 words, is that a ‘human-created’ article? What about an article produced from just a prompt to rewrite another piece from the web?

To get around accusations of simply generating art from a prompt, some artists on Twitch stream the entire process of creating a piece. Starting from a blank canvas, they document the process from conception to completion. That video is the provenance of the work: it’s proof of the effort that was put in, and even if they use AI during the process, at least it’s documented.

There’s no reason pieces of digital journalism couldn’t do something similar. Just as Twitter shows the timeline of edits made to a tweet, articles could provide timestamps of their development and creation. Those timestamps, or whatever form the article’s provenance takes, are proof of the time spent by the journalist, if nothing else.
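
As a very rough illustration of what that could look like under the hood, here is a minimal sketch of a provenance record attached to an article. Everything in it, the ProvenanceEvent and ArticleProvenance names, the event types, and the idea of estimating ‘active’ writing time from the gaps between edits, is a hypothetical assumption for illustration, not an existing standard or CMS feature.

```python
# A minimal sketch of article provenance as a list of timestamped events.
# All names, fields and event types here are hypothetical, for illustration only.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone


@dataclass
class ProvenanceEvent:
    timestamp: datetime   # when the event happened
    kind: str             # e.g. "draft_created", "edit", "ai_assist", "published"
    note: str = ""        # optional detail, e.g. which tool was used and how


@dataclass
class ArticleProvenance:
    article_id: str
    events: list[ProvenanceEvent] = field(default_factory=list)

    def record(self, kind: str, note: str = "") -> None:
        self.events.append(ProvenanceEvent(datetime.now(timezone.utc), kind, note))

    def active_time(self, session_gap: timedelta = timedelta(minutes=30)) -> timedelta:
        """Rough estimate of time spent: sum the gaps between consecutive events,
        ignoring any gap longer than session_gap (the writer probably stepped away)."""
        stamps = sorted(e.timestamp for e in self.events)
        total = timedelta()
        for earlier, later in zip(stamps, stamps[1:]):
            gap = later - earlier
            if gap <= session_gap:
                total += gap
        return total


# Example: log a handful of events for a piece, including a disclosed AI-assisted step.
prov = ArticleProvenance("example-article-123")
prov.record("draft_created")
prov.record("edit", "restructured the intro")
prov.record("ai_assist", "asked a model to suggest alternative headlines")
prov.record("published")
print(prov.active_time())
```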

There are problems to be worked through, obviously, particularly in a world where articles need to be heavily legalled before they actually go live. And inevitably there would be ways to fake time being spent on an article. But even providing evidence that an article was worked on for hours or days, rather than the seconds required to generate one from whole cloth, has to be worth experimenting with. And it wouldn’t just benefit the publishers who want to demonstrate their content is worth subscribing to, either.
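
On the faking problem specifically, one possible mitigation, offered purely as an illustration rather than anything the industry has adopted, is to hash-chain draft snapshots so that each timestamped entry commits to everything before it; backdating an entry would break every hash that follows. The snapshot format below is entirely hypothetical.

```python
# Illustrative only: hash-chaining draft snapshots so timestamps are tamper-evident.
# Editing or backdating an earlier entry invalidates all later hashes in the chain.
import hashlib
import json
from datetime import datetime, timezone


def add_snapshot(chain: list[dict], draft_text: str) -> list[dict]:
    prev_hash = chain[-1]["hash"] if chain else ""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "draft_hash": hashlib.sha256(draft_text.encode()).hexdigest(),
        "prev_hash": prev_hash,
    }
    # Each entry's hash covers its own fields plus the previous entry's hash.
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    chain.append(entry)
    return chain


chain: list[dict] = []
add_snapshot(chain, "First rough draft of the story...")
add_snapshot(chain, "Second pass after the interview...")
print(len(chain), chain[-1]["hash"][:12])
```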

Premium and perceptions of value

News publishers have always made the case that their articles are so inherently important to audiences that they add value to the advertising that sits alongside them. You’ll hear that described as ‘premium’, tacitly suggesting that other content online is worth significantly less.

That’s the basis for private marketplaces (PMPs) like Ozone, which sell advertising on the understanding that the context of a newspaper’s website provides uplift to the brands that advertise on it.

But for the most part that value is based solely on that context rather than the articles themselves. There have been attempts to provide ‘scores’ for articles based on other criteria – most notably overtone.ai – but the vast majority of articles on the internet are still undifferentiated in value.

At the same time, the twin issues of disinformation and trust are being exacerbated by generative AI, as I wrote about here, so the considerations for news publishers are both commercial and societal.

So with those pressures, what can publishers do to reassert the value of their articles and combat the issue of disinformation? The answer is to bring the provenance of their journalism to the fore. Integrating the existence of that provenance into news quality scores, even if it’s just evidence that an article was worked on for a while, would help differentiate human-created articles from the pink slime of AI-generated content.
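
To make that concrete, here is a minimal sketch of how provenance evidence might be folded into a quality score. The signals, weights and 0–1 scale are illustrative assumptions only; they are not how overtone.ai, Ozone or any real scoring index actually works.

```python
# Hypothetical example of folding provenance evidence into an article quality score.
# The signals and weights below are illustrative assumptions, not a real index.

def quality_score(base_score: float, provenance_hours: float | None,
                  ai_use_disclosed: bool) -> float:
    """Combine an existing editorial score (0-1) with provenance signals.

    base_score:        whatever the publisher's or scorer's existing rating is
    provenance_hours:  documented active working time, or None if unavailable
    ai_use_disclosed:  whether any AI assistance was declared in the provenance log
    """
    score = base_score

    if provenance_hours is not None:
        # Reward documented effort, capped so very long edits don't dominate.
        score += min(provenance_hours / 8.0, 1.0) * 0.2

    if ai_use_disclosed:
        # Disclosure itself is treated as a positive trust signal in this sketch.
        score += 0.05

    return min(score, 1.0)


print(quality_score(0.6, provenance_hours=5.0, ai_use_disclosed=True))   # ~0.78
print(quality_score(0.6, provenance_hours=None, ai_use_disclosed=False)) # 0.6
```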

As I mentioned earlier, it is far from a perfectly realised idea. It might even be made redundant if the makers of AI tools require disclosure of how heavily they were involved, or if regulation does. It would require dev time and resources at a moment when both are increasingly scarce, and it would depend on the big search players adopting a news scoring index.

But in the era of AI-generated content, it would help publishers prove that their articles are genuinely made by humans – and therefore more valuable to both their readers and advertisers.


Find out how publishers are using AI in their organisations in our new report, Practical AI for Publishers

Republished with kind permission of Media Voices, a weekly look at all the news and views from across the media world.