
Complex content analytics: the night vision journalism needs


Imagine, if you will, that you’re at a shooting range. Imagine, too, that everything is in complete darkness. Even if you manage to hit the target in such conditions, knowing how you accomplished it – let alone how to replicate it – is tricky to ascertain.

The same is true in publishing. If it’s been said once, it’s been said a thousand times: you can’t manage what you don’t measure. So how do we gain journalistic night vision? What are ‘the right metrics’ and how can we utilize them to measure and nurture our readers’ attention? 

Pushing page views up a hill

Although things in newsrooms are improving, there is still a tendency to lean on analytics tools that measure what is easiest to capture (things like ‘reach’, ‘impressions’ or ‘unique visitors’, to name a few). No doubt, an increase in unique Pageviews can be interpreted as a sign of positive development, but the question remains: is an increase in Pageviews really the end-goal for journalists and publishers?

We hope not.

Considering the sheer amount of news noise and content hyper-production that accelerated with the advent of social media, we’ve reached a point where basing an entire strategy on simple, quantitative performance metrics is nothing short of Sisyphean. Unlike the 20th century, a time when the number of newspapers sold was indeed a measure of success, today’s digital era quickly starves those who are gunning for volume.

Jennifer Brandel, CEO of Hearken, warned about “digital bubbles built on illusory metrics” in a Nieman Lab article at the close of last year. Comparing anything to a bubble or an illusion doesn’t inspire particular confidence, and is certainly not – one would assume – what we should be aspiring to.

No, what’s required are analytics that are reliable – something tangible and concrete.

The primary job of good editorial analytics is to give publishers solid information in a comprehensible, almost visceral way – not to sacrifice one for the other. And yet, even though it is measuring something as complex as human behavior, the analytics industry tends to oversimplify metrics in favor of easier consumption by the end-user.

What we mean when we say ‘the right metrics’

Truth be told, there are no right or wrong metrics per se. All data measurements can grant insights; it’s just that some aren’t very – well – insightful.

When things went digital, publishers found themselves traversing uncharted territory. Over time, they have discovered new ways to monetize content, but financial success and business longevity were, are, and will continue to be reserved mostly for those who are data literate.

This may be a hard pill to swallow for some, but nevertheless, it explains why trusting simple metrics comes with its own fair share of shortcomings. To a certain extent, yes, metrics such as Pageviews, returning visitors or time spent on page can be useful, but they are not enough to give a more nuanced understanding of an audience’s behavior, let alone help publishers to pinpoint their most loyal readers.

Here at Content Insights, our content analytics solution relies on Content Performance Indicator (CPI), a complex algorithm that takes into consideration dozens of different content performance metrics and examines their relations through three behavioral models: Exposure, Engagement, and Loyalty. While this may sound a bit high-brow, allow us to make a few comparisons to help you better understand how analytics tools such as Google Analytics measure things, and why we all need to demand more from our metrics.

On closer inspection…

Let’s look at Pageviews first, since it’s a metric found in basically every analytics suite on today’s market (not to mention one we’re all familiar with through our personal social media activity). Many publishers tend to interpret it as a metric that shows the number of people who viewed or even read an article. That is not necessarily the case, though. A Pageview is an instance of a page being loaded in a browser and it doesn’t actually require a person’s active participation (for instance, if you open a page, get distracted by something in real life and return 23.5 minutes later, a page view is still registered).

Our metric Article Reads, on the other hand, records the number of times a person actually started attentively consuming an article, not just opened it. It focuses on behavior, taking into account attentive time spent on a page as well as the way people interact with it (e.g. clicks, text selection, scrolls – evidence that there’s some kind of interaction). So think of an article read like this: somebody opened a page, spent at least 10 seconds on it, the page was in focus and, most importantly, there was an actual human behind the screen.
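To make the idea concrete, here is a minimal sketch of how such a rule could be applied to a stream of pageview events. This is purely illustrative – the event format, field names and classification logic are our assumptions, not Content Insights’ actual implementation; only the 10-second threshold and the interaction signals come from the description above.

```python
# Hypothetical session events: (seconds_since_page_open, event_type).
# Only the 10-second threshold and the interaction types come from the
# article; the rest is an illustrative assumption.
ENGAGEMENT_EVENTS = {"click", "scroll", "text_selection"}

def is_article_read(events, min_focus_seconds=10):
    """Classify a single pageview as an Article Read: the page was in
    focus for long enough AND a human actually interacted with it."""
    focus_time = 0.0
    interacted = False
    in_focus = False
    last_ts = None
    for ts, kind in events:
        # Accumulate time only while the page was in focus.
        if in_focus and last_ts is not None:
            focus_time += ts - last_ts
        if kind == "focus":
            in_focus = True
        elif kind == "blur":
            in_focus = False
        elif kind in ENGAGEMENT_EVENTS:
            interacted = True
        last_ts = ts
    return focus_time >= min_focus_seconds and interacted
```

Note how the two conditions are independent: a tab left open for half an hour with no interaction still fails the check, as does a quick bounce with a single click.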

Some publishers also rely on scroll depth in an attempt to measure reader engagement, but this approach is flawed for one reason: it tells us very little about the reader’s activity. Just because readers scrolled through the entire page doesn’t necessarily mean they’ve read the content. What this metric fails to explain is whether or not readers were actually engaged, let alone how. And even if you zoom out until the entire article fits on the screen, would the lack of scrolling then indicate a lack of engagement?

By contrast, Read Depth, found in our CPI, reveals how deeply readers have gotten into the observed article. To calculate Read Depth, we look at a variety of engagement metrics and test whether the difference in average read depth between two samples of content is really significant or merely due to random chance. If the difference is indeed significant, it could suggest that a certain section or topic attracts more of readers’ attention than others.
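The article doesn’t specify which statistical test is used, but a standard way to check whether the gap between two samples’ averages is down to chance is Welch’s t-test. The sketch below – our assumption, not the CPI’s actual method, with made-up read-depth figures – compares two hypothetical content sections:

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t-statistic for two independent samples of per-article
    read depth (e.g. fraction of the article actually read, 0.0-1.0).
    A |t| well above ~2 suggests the gap is unlikely to be random."""
    va, vb = variance(sample_a), variance(sample_b)
    na, nb = len(sample_a), len(sample_b)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(va / na + vb / nb)

# Hypothetical read-depth samples for two sections (illustrative data).
politics = [0.62, 0.71, 0.55, 0.68, 0.60, 0.66]
lifestyle = [0.41, 0.38, 0.52, 0.36, 0.44, 0.40]
t = welch_t(politics, lifestyle)
```

With these figures the t-statistic comes out far above 2, which would point to the politics section genuinely holding readers deeper into its articles rather than the difference being noise.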

Lastly, Time on Page is a metric that measures the amount of time a certain page was left open in a browser. Unless you deploy event trackers that record engagement hits, it fails to show whether or not the person behind the screen was actually consuming the content until closing the tab (or staring at pigeons out of the window). Again, Time on Page measures a browser event; it doesn’t differentiate between a reader who was alert and one who was taking a nap.

So, how are you supposed to measure the real, attentive time your articles generate? Attention Time is a behavior metric that can aid you in that pursuit. What this metric does is calculate the time when the content is in focus and there is sufficient activity indicating the reader is indeed alive and reading (and not asleep in front of the monitor or off making a cup of Joe). Rest assured, “idle time” won’t be included here – only the actual time users spent consuming your content.
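One common way to compute attention time of this kind – a sketch under our own assumptions, not Content Insights’ implementation – is to sum the gaps between consecutive activity signals (scrolls, clicks, mouse moves) while dropping any gap longer than an idle cutoff, so a coffee break doesn’t count as reading. The 30-second cutoff below is an assumed value:

```python
IDLE_CUTOFF = 30  # seconds without activity treated as idle (assumed value)

def attention_time(activity_timestamps, idle_cutoff=IDLE_CUTOFF):
    """Estimate attentive seconds from a sorted list of timestamps of
    activity signals on an in-focus page. Gaps longer than the idle
    cutoff are discarded entirely rather than counted as reading."""
    total = 0.0
    for prev, curr in zip(activity_timestamps, activity_timestamps[1:]):
        gap = curr - prev
        if gap <= idle_cutoff:
            total += gap
    return total
```

For example, activity at 0, 10 and 20 seconds followed by silence until the 300-second mark yields 20 seconds of attention, not 300 – the long idle gap is simply thrown away.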

Of course, this is just the tip of the iceberg when it comes to complex metrics – it’s a field that is evolving and becoming more nuanced and complex with each passing month. Data teams like ours are fixated on refining (and in some cases – like the way we look at Loyalty – redefining) how we measure that all-important user behavior.

The fact is, we’ve moved far past the point where analytics packages prioritized ease of use over the quality of service. Now, they must do both – and they can. There’s no excuse for relying on simple metrics when there are products on offer that deliver so much more.

In this day and age, a newsroom’s unwillingness to go beyond simple metrics would certainly be a debilitating oversight.

Night vision: building a newsroom culture that goes well beyond Pageviews

According to a new American Press Institute report, published earlier this year, “metrics should be viewed as an opportunity for experimentation, rather than a report card measuring a journalist’s performance.” Indeed, experimenting with metrics and storytelling allows journalists to make premeditated alterations or refinements to their coverage so that both readers and journalists might get something truly meaningful from it. 

The point made there (that simple metrics are summative, not constructive), seems yet another reason to err on the side of caution. From a journalist’s point of view, analytics have the potential to not only report how content is being read (which is gratifying if nothing else), but also pinpoint how it’s resonating. It’s a subtle shift with massive implications.

Much has been said about the need to engage our communities, bolster loyalty and get readers invested – and it’s a far-reaching, rallying cry. Just as engaged, solutions and civic journalism have created an alternative approach to the reporting process (and one which – with any luck – ultimately affects reading numbers positively), so too do more comprehensive analytics play a critical role in ensuring that you understand why content performs the way it does. It’s all part of the puzzle.

Skipping this step in your workflow might put you in danger of resorting to the ‘spray and pray’ approach – one that neither explains the logic behind your success nor helps you ascertain which areas could be optimized. Since assumption is the mother of all errors, maybe try answering these questions first:

  • What stories bring the most value to your readership?
  • What kind of content moves your readers enough to guide them down the subscription funnel?
  • What signals show that subscribers are giving back to the community?
  • What kind of traffic has long-term significance for the newsroom’s business operations and editorial goals?

Some of the answers to these questions are probably lurking somewhere in an analytics dashboard – and journalists who are well-versed in interpreting data will have an even greater chance of understanding how their work fits into the larger picture. An encouraging thought, indeed, but don’t forget: analytics tools are just that – tools – and not the heart of the operation.

It wasn’t so long ago that nuance in data was something that could only be uncovered by those with an advanced degree in something tech-y. Not so now. Insights are there for those who ask the questions (and who’ve installed the relevant app or plugin). 

We’re no longer shooting in the dark, and if you are, it’s more likely you’re standing blindfolded in a well-lit room where everyone else can see what they’re aiming for. Why make life harder for yourself?

by Marko Dorić

Republished with kind permission of Content Insights, the next generation content analytics solution that translates complex editorial data into actionable insights.