
AI: An invaluable sixth sense for journalists?


When China showcased the world’s first news anchor created with Artificial Intelligence at its annual World Internet Conference, the publishing world hardly responded positively. The Guardian entertained the possibility of even more controlled and censored news coverage in China; CNN pointed out that major news organizations like the Associated Press were already using sophisticated computer algorithms, instead of journalists, to write thousands of automated stories a year; the BBC interviewed academic experts who found the anchor unnatural and not engaging enough, but acknowledged that it was a good first effort.

Present among this commentary was a question that comes up every time a new AI-powered system is introduced – is it going to replace humans? More specifically, in this case, could journalists lose their jobs to AI?

According to Wang Xiaochuan, Chairman of Sogou, the company that developed the AI news anchor with Xinhua, AI technology splits into perceptual and cognitive technology. On the perceptual side – sound and images – machines now have a genuine opportunity to be as good as people. On the cognitive side, however – the reasoning, knowledge, and thinking that have language at their core – a machine’s processing power is still limited, and it cannot yet handle people’s higher-level activities.

In other words, journalists are needed despite the advances made by AI technology. But there is little doubt that AI tools have the potential to transform the way they work.

AI tools help scale up journalists’ ability

Machines can be more rigorous and comprehensive than reporters. An AI tool can import data from a variety of sources and identify trends and patterns. Using Natural Language Processing, it can then put those trends into context and even generate a fairly accurate written description.
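
To make the idea concrete, here is a minimal, hypothetical sketch of that pattern – ingest a numeric series, detect a simple trend, and render a templated plain-language description. The function name and figures are invented for illustration; production tools rely on far more sophisticated models.

```python
# A minimal sketch of the idea, not any vendor's actual tool: pull in a
# numeric series, check for a simple trend, and render a plain-language summary.
from statistics import mean

def describe_trend(label, values):
    """Compare the recent half of a series with the earlier half and
    return a one-sentence, template-based description."""
    midpoint = len(values) // 2
    earlier, recent = mean(values[:midpoint]), mean(values[midpoint:])
    change = (recent - earlier) / earlier * 100
    direction = "rose" if change > 0 else "fell"
    return f"{label} {direction} by roughly {abs(change):.0f}% in the most recent period."

# Hypothetical figures for illustration only.
monthly_burglaries = [120, 118, 125, 119, 98, 95, 92, 90]
print(describe_trend("Reported burglaries", monthly_burglaries))
```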

It can systematize data to find a missing link in an investigative story. It can identify trends and spot the outlier among millions of data points that could be the beginning of a great scoop. For example, BuzzFeed News used machine learning to identify surveillance aircraft operated by the US Marshals and military contractors, and The Atlantic used it to figure out whether Donald Trump was writing his own tweets.

Algorithms offer something like a data-driven sixth sense that can help direct journalistic attention. Whether they are monitoring political campaign donations, scrutinizing thousands of documents for an investigation, keeping an eye on the courts, identifying newsworthy patterns in large datasets, or surfacing newsworthy events from social media posts, AI tools can speed up and scale up journalists’ ability to scan the world for interesting news stories.

“Machines to mine data, and humans to tell stories”

Reuters uses artificial intelligence tools called News Tracer and Lynx Insight to combine human judgment with machine capability in its journalism. News Tracer helps journalists jump on breaking news stories on Twitter and weed out unreliable sources. Lynx Insight augments human journalism by identifying trends and key facts, and by suggesting new stories reporters should write.

News Tracer sifts through the 700 million daily tweets in real time and flags potential breaking news stories that meet the newsworthiness and veracity requirements programmed into the algorithm. For example, it looks for clusters of similar tweets to score against a ‘newsworthiness rating’, and then verifies the source against numerous factors in its profile, such as followers, attached media, links, and tweet structure.

The tool can carry out this initial journalistic procedure at a scale and speed impossible for a human. It then presents its findings to journalists, who complete the final layer of verification themselves to ensure the source and story are fit for publication.
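
A rough, hypothetical sketch of that kind of pipeline is shown below. It is not Reuters’ actual News Tracer code, just an illustration of the steps described: group similar posts, require enough independent posts as a crude newsworthiness filter, and check a few source-credibility signals before handing candidates to a journalist. All field names and thresholds are invented.

```python
# A simplified, hypothetical sketch of the pipeline described above —
# not Reuters' actual News Tracer code.
from collections import defaultdict

def cluster_by_keywords(posts):
    """Group posts that share a keyword — a stand-in for real text clustering."""
    clusters = defaultdict(list)
    for post in posts:
        for word in post["text"].lower().split():
            if len(word) > 6:          # crude proxy for a topical keyword
                clusters[word].append(post)
    return clusters

def source_score(author):
    """Score an account on profile signals (followers, attached media, links).
    The thresholds here are invented for illustration."""
    score = 0
    score += 1 if author["followers"] > 1000 else 0
    score += 1 if author["has_media"] else 0
    score += 1 if author["verified_links"] else 0
    return score

def candidate_leads(posts, min_cluster=3, min_source=2):
    """Yield clusters that look newsworthy and have at least one credible source."""
    for topic, group in cluster_by_keywords(posts).items():
        if len(group) >= min_cluster:                     # enough independent posts
            credible = [p for p in group if source_score(p["author"]) >= min_source]
            if credible:
                yield {"topic": topic, "posts": credible}  # hand off to a journalist

# Invented example posts for illustration.
posts = [
    {"text": "Explosion reported near downtown refinery",
     "author": {"followers": 5400, "has_media": True, "verified_links": True}},
    {"text": "Huge explosion just shook the refinery district",
     "author": {"followers": 230, "has_media": True, "verified_links": False}},
    {"text": "Refinery explosion visible from the highway",
     "author": {"followers": 12000, "has_media": False, "verified_links": True}},
]
for lead in candidate_leads(posts):
    print(lead["topic"], len(lead["posts"]))
```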

We can use machines to mine data, and use humans to tell those stories. Machines can help humans get a head start on news.

Reg Chua, Executive Editor, Reuters

Newsworthy, developed by Journalism++ in Sweden, is another tool that helps journalists find potential news stories. Its algorithm monitors open government data and identifies statistically interesting leads based on anomalies, outliers, and trends in numerical data streams such as real estate prices, weather patterns, and crime reports. Users can subscribe to leads from various datasets and receive an alert whenever an interesting statistical pattern is found.

For example, the figure below shows a lead produced by Newsworthy. It indicates that the number of asylum applicants from Pakistan to the EU dropped between 2016 and 2018. It includes a brief description of the anomaly, and a bar graph to provide visual evidence and context. It also contains the data on which the lead is based, for reporters to verify results for themselves.
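
The statistical idea behind such leads can be illustrated with a short, hedged sketch – not Newsworthy’s actual algorithm – that flags the latest value in a data stream when it falls well outside the historical baseline. The figures below are invented, loosely echoing the example above.

```python
# A hedged sketch of the statistical idea behind such leads — not Newsworthy's
# actual algorithm. It flags the latest value in a series when it sits well
# outside the historical mean (a simple z-score test).
from statistics import mean, stdev

def lead_if_anomalous(name, series, threshold=2.0):
    """Return an alert string if the latest observation deviates from the
    historical baseline by more than `threshold` standard deviations."""
    history, latest = series[:-1], series[-1]
    baseline, spread = mean(history), stdev(history)
    z = (latest - baseline) / spread
    if abs(z) >= threshold:
        direction = "above" if z > 0 else "below"
        return f"Lead: {name} is {abs(z):.1f} standard deviations {direction} its historical level."
    return None

# Invented figures for illustration only.
applications = [5200, 5100, 5350, 5000, 5150, 2100]
print(lead_if_anomalous("Asylum applications from Pakistan", applications))
```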

“Hype, spin, and bias”: Limitations of AI

A major challenge in building and using these tools effectively is defining, in algorithmic form, what is newsworthy. Newsworthiness is difficult to pin down for an algorithm because it depends on a range of individual, organizational, social, cultural, ideological, economic, and technical forces.

A story may also need to fit a publication’s editorial focus, agenda, or other organizational requirements, as well as audience expectations. Newsworthiness is not a quality built into an event; it is determined by the story’s context and by how it is perceived by humans. And algorithms are not yet capable of determining that.

Journalism still needs people to perform the difficult task of weeding through the hype, spin, and bias in ambiguous statements, but computers can help a lot with monitoring and spotting claims that might be deserving of journalists’ attention. It’s “a useful tip sheet” that alerts us to things that we would have otherwise missed.

Glenn Kessler, Editor of the Fact Checker blog at The Washington Post

What next?

This is where human intelligence comes into play. Journalists determine the newsworthiness of a story by analyzing the data against the relevant context. They also need to consider how much they can trust leads generated by algorithms. AI can make mistakes; after all, it is programmed by humans. Glenn Kessler of the Post recounted an instance in which the machine attributed several Sarah Sanders quotes to Bernie Sanders. He was able to spot the error after looking at the transcript.

Reporters and editors should consider learning how these systems operate and how they can be used to enhance their journalism. That also means learning to become comfortable working with algorithms and tech people.

According to Helen Vogt, Former Head of Innovation at the Norwegian News Agency, “Many old-school journalists seem unable to talk to tech people: they do not understand what developers do, so they often disrespect their work. Learning a bit of python code would probably help. A course in simple programming for journalists is something I definitely recommend.”

AI isn’t going to take away journalists’ jobs anytime soon. It can’t cultivate trusted contacts, chase endless empty leads on nothing more than a hunch, or fearlessly stand its ground against powerful forces. It is a tool with scope for further improvement, and with the potential to help journalists cover the increasingly complex, globalized, and information-rich world we live in more efficiently and effectively.


Download WNIP’s new Media Moments 2018 report, which dives deeper into this year’s developments in publishing, and looks at what opportunities 2019 could usher in. The report is free and can be downloaded here.