
“Establish a little beachhead”: How publishers can use comments to engage their most loyal and valuable readers


Many publishers are focusing on engagement as a strategy to attract and retain subscribers. The comments section can be an effective tool for building engagement. Not only that, it can also help publishers understand their readers better and tailor their offerings accordingly.

There are compelling reasons why it’s worth investing in comments on your site. While they’re usually a small percentage of your total audience, commenters are often your most loyal and most valuable readers. They spend longer on the site, they come back more often, they share more links to your site, and they’re more likely to pay for subscriptions and other services. They’re also potential sources for ideas and stories.

Andrew Losowsky, Head of Coral (an open source project helping publishers build better communities around their journalism) at Vox Media

“A chilling effect on conversations”

However, the comments section is frequently hijacked by trolls and spammers, creating a toxic environment that engaged readers start to avoid. The problem has become so severe that many publishers have shut down their comments sections entirely in recent years.

Toxicity also has a chilling effect on conversations, making people less likely to join discussions online if they fear their contribution will be drowned out by louder, meaner voices. The Pew Research Center found that 27% of Americans have chosen to not post something online after witnessing harassment.

CJ Adams, Product Manager at Alphabet subsidiary and tech incubator Jigsaw

Moderating comments can be a round-the-clock, resource-intensive process. Even a well-staffed team of moderators can struggle to read every single conversation thread.

Many publishers are already strapped for resources. They would prefer their reporters and editors to create content rather than wade through the cesspool of negativity in the comments section. That is unfortunate, because giving up on comments deprives publishers of a powerful tool.

Using machine learning to detect toxic comments

But closing comments is far from a good solution: the publisher also loses the opportunity to build a thriving community of engaged readers. Jigsaw has built a tool called Perspective that uses machine learning, trained on millions of comments, to detect varying levels of toxicity.

It scores comments according to their perceived toxicity levels. Launched in 2017, Perspective has been used by The New York Times, The Guardian, The Economist, Wikipedia and recently by Reddit and the comments platform Disqus.
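For a sense of what that scoring looks like in practice, here is a minimal sketch, in Python, of a request to Perspective's public comments:analyze endpoint. The API key placeholder, the use of the requests library, and the exact response fields are assumptions based on Jigsaw's published API documentation, not details drawn from the publishers mentioned here.

```python
import requests

# Assumption: a Perspective API key obtained through Google Cloud.
API_KEY = "YOUR_API_KEY"
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
    f"?key={API_KEY}"
)

def toxicity_score(text: str) -> float:
    """Ask Perspective to score one comment and return the summary
    TOXICITY probability (0.0 = likely benign, 1.0 = likely toxic)."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(ANALYZE_URL, json=payload, timeout=10)
    response.raise_for_status()
    scores = response.json()["attributeScores"]
    return scores["TOXICITY"]["summaryScore"]["value"]

print(toxicity_score("Thanks for this thoughtful article."))          # expect a low score
print(toxicity_score("Everyone who disagrees with me is an idiot."))  # expect a higher score
```

A high score does not make a comment automatically unpublishable; as discussed below, publishers treat it as a signal for human moderators rather than a verdict.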

In 2016, The Times had 10% of its articles open for commenting, moderated by a team of 14 who reviewed around 11,000 comments per day. The publisher kept comments off articles that were likely to attract vitriol.

It partnered with Jigsaw and started using Perspective late in 2016. The tool has helped the publisher manage comments faster, and with fewer resources. Since then, its comments section has expanded to include 30% of its content.

CJ Adams, Product Manager for Jigsaw, stated last year that the Times “was able to triple the number of articles on which they offer comments, and now have comments enabled for all top stories on their homepage.”

“Machine-assisted human moderation”

There have been hiccups, though: Perspective has inadvertently discriminated against groups by race, gender identity, or sexual orientation. That is because moderating conversations with AI is a very complex task. Algorithms can be biased, and machine learning systems have to be continuously refined.

Adams acknowledges Perspective’s flaws, saying that they stem from the data on which it was trained. He told PCMag, “In the example of frequently targeted groups, if you looked at the distribution across the comments in the training data set, there were a vanishingly small number of comments that included the word ‘gay’ or ‘feminist’ and were using it in a positive way.

“Abusive comments use the words as insults. So the ML, looking at the patterns, would say, ‘Hey, the presence of this word is a pretty good predictor of whether or not this sentiment is toxic.’”

Consequently, comments like, “I’m a proud gay man,” or, “I’m a feminist and transgender” may have been erroneously labeled as toxic.

Adams says, “This tool is helpful for machine-assisted human moderation, but it is not ready to be making automatic decisions. But it can take the ‘needle in a haystack’ problem finding this toxic speech and get it down to a handful of hay.”

That is why Perspective’s API is integrated into publishers’ community-management and content-moderation interfaces to assist human moderators: it automates the initial sorting of comments, substantially reducing the moderation workload.

For example, at Spain’s leading newspaper El Pais, comments are assigned to moderators according to their toxicity scores, with the most toxic comments going to the most experienced moderators. Since it started working with Perspective, the publisher has reduced toxic comments by 7% and increased overall commenting by 19%.
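A triage workflow along those lines can be as simple as bucketing each incoming comment by its score. The sketch below is purely illustrative: the thresholds, queue names, and Comment structure are hypothetical, not El Pais’s actual configuration.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str
    toxicity: float  # Perspective's summary score, between 0.0 and 1.0

def route(comment: Comment) -> str:
    """Assign a comment to a moderation queue based on its toxicity score.
    The cut-offs are illustrative; a newsroom would tune them over time."""
    if comment.toxicity >= 0.85:
        return "senior-moderators"      # most toxic: experienced moderators
    if comment.toxicity >= 0.50:
        return "standard-review"        # borderline: routine human review
    return "publish-after-spot-check"   # low risk: published quickly

print(route(Comment("reader42", "Great reporting, thank you!", 0.03)))
# -> publish-after-spot-check
```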

The road ahead

Jigsaw continues to expand Perspective’s scope: in recent months, it has released Perspective models trained for Spanish and French.

The company also launched a Chrome extension called Tune, which lets readers choose how much polite or aggressive commentary they want to see. The settings range from “quiet” to “blaring”, each allowing a different amount of toxicity through, and a “Zen mode” turns off comments completely.
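Conceptually, a control like Tune’s boils down to mapping each reader-facing setting to a toxicity cut-off and filtering the comment stream against it. The mapping below is a hypothetical sketch, not Tune’s actual levels or thresholds.

```python
# Hypothetical reader settings mapped to toxicity cut-offs; Tune's real
# levels and thresholds are not documented in this article.
VOLUME_THRESHOLDS = {
    "zen": 0.0,      # hide all comments
    "quiet": 0.3,    # only the most polite comments
    "loud": 0.7,
    "blaring": 1.0,  # show everything, however toxic
}

def visible_comments(comments, setting="quiet"):
    """Keep only comments whose toxicity score is at or below the
    threshold implied by the reader's chosen setting."""
    threshold = VOLUME_THRESHOLDS[setting]
    return [c for c in comments if c["toxicity"] <= threshold]

sample = [
    {"text": "Thoughtful piece.", "toxicity": 0.05},
    {"text": "This is garbage and so are you.", "toxicity": 0.92},
]
print(visible_comments(sample, "quiet"))    # keeps only the polite comment
print(visible_comments(sample, "blaring"))  # keeps both
```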

Bassey Etim, former Community Editor for The Times, told PCMag, “If Perspective can evolve in the right way, it can, hopefully, create at least a set of guidelines that are repeatable for small outlets.

“It’s a long game, but we’ve already set up a lot of the foundation to be a part of that reader experience. Then maybe these local papers can have comments again and establish a little beachhead against the major social players.”