
Turning a blind eye to bots to protect ad revenue? Think again.


OPINION

There is a widely recognized yet rarely discussed issue quietly infiltrating the publishing industry: bots. Bots that mimic real users blend into the flow of traffic on publishing websites across the globe.

Although bots can present serious security threats – such as being used to coordinate large-scale attacks – many publishers turn a blind eye to them, concerned that addressing and blocking bot traffic could cut impressions and ultimately sink ad revenue.

What most publishers don’t realize is that failing to manage bot traffic can itself hurt revenue: it exposes the business to cybersecurity incidents, degrades the user experience, turns away advertisers, and erodes the company’s competitive footing.

So how can businesses tackle the growing bot issue? The key is to first understand and identify which bots are causing harm, and then to put programs in place that manage them.

Identifying & Sorting Bots

While the rise of bots is relatively recent, that doesn’t mean they are rare. In fact, they have become a massive source of overall web traffic – for many websites, bots account for 60 percent or more of total traffic.

As bots continue to evolve, detecting them – and sorting the good from the bad – is far from guaranteed. Akamai found that less than half of the web traffic coming from bots is accurately detected as such. What’s more, not all bots are equal, so there is no one-size-fits-all solution to managing them. While many bots cause trouble, some are actually helpful, like the search engine crawlers that index your content and support your SEO ranking.

As bots increasingly adapt to look more like human site visitors, investments in automated bot detection and management can pay off for publishers.

Behavioral characteristics are one area of bot management worth focusing on heavily. Mouse-movement and clicking patterns are strong indicators for separating a bot from a human. Device fingerprinting can also help, analyzing traits like web browser, plugins, and even screen size to estimate the likelihood that a visitor is a bot.
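To make the idea concrete, here is a minimal Python sketch of how fingerprint and behavioral signals might be combined. The traits, thresholds, and function names are illustrative assumptions for this example, not Akamai’s actual detection logic.

```python
# Minimal sketch combining device-fingerprint and behavioral signals.
# All traits, thresholds, and names here are illustrative assumptions.
import hashlib
from statistics import pstdev

def device_fingerprint(user_agent: str, plugins: list, screen: str) -> str:
    """Hash a handful of client traits into a stable fingerprint."""
    raw = "|".join([user_agent, ",".join(sorted(plugins)), screen])
    return hashlib.sha256(raw.encode()).hexdigest()

def looks_automated(click_intervals_ms: list) -> bool:
    """Flag suspiciously regular clicking: humans are noisy, scripts are not."""
    if len(click_intervals_ms) < 5:
        return False  # not enough behavior to judge
    return pstdev(click_intervals_ms) < 10  # near-constant timing (assumed threshold)

# Example: a headless client with no plugins and metronome-like clicks
fp = device_fingerprint("HeadlessChrome/120.0", [], "800x600")
print(fp[:16], looks_automated([100, 101, 100, 99, 100, 101]))
```

In practice, a fingerprint like this would be compared against known signatures and combined with many more signals, but even this toy version shows why a scripted visitor with no plugins and perfectly regular clicks stands out.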

Other important capabilities include categorizing bots by reputation, so each one can be managed according to its malicious or helpful intent. From there, customized management policies for each category are crucial to mastering bots.
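As a rough illustration, a per-category policy table might look like the sketch below. The category names and actions are assumptions made for the example, not a vendor’s taxonomy.

```python
# Illustrative policy table mapping bot reputation categories to actions.
# Category names and actions are assumptions for this sketch.
POLICIES = {
    "search_engine_crawler": "allow",       # helpful: indexes content for SEO
    "monitoring_tool":       "allow",
    "unknown_automation":    "challenge",   # e.g., serve a JavaScript or CAPTCHA check
    "scraper":               "rate_limit",
    "credential_stuffer":    "block",
}

def action_for(bot_category: str) -> str:
    """Default to a challenge when a bot's reputation is unknown."""
    return POLICIES.get(bot_category, "challenge")

print(action_for("scraper"))             # rate_limit
print(action_for("credential_stuffer"))  # block
```

The design point is simply that helpful bots are allowed through, unknown automation is challenged rather than blocked outright, and clearly malicious traffic is stopped.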

By implementing tools like these, publishers remove barriers to stability and growth in an ever-changing industry.

Enhanced Data to Improve User Experience and Ad Targeting

To provide the experiences users want, publishers need to leverage data about how visitors spend their time on their sites, including which content moves the needle. But how can publishers accurately assess visitor data if that data is skewed by bots?

By removing malicious bots that artificially inflate traffic numbers, publishers improve the analytics that give insight into real user behavior. Having this data doesn’t automatically boost user experience – it’s about what organizations do with it – but it is a critical step toward the accurate picture publishers need to enhance the real user experience and drive real user growth.
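As a toy illustration of why this matters, consider a handful of hypothetical session records in which a single flagged scraper dwarfs the real pageview count; the records and the "is_bot" flag are invented for the example.

```python
# Toy session records; the "is_bot" flag and counts are invented for illustration.
sessions = [
    {"user": "a", "pageviews": 5,  "is_bot": False},
    {"user": "b", "pageviews": 40, "is_bot": True},   # scraper inflating the numbers
    {"user": "c", "pageviews": 3,  "is_bot": False},
]

human = [s for s in sessions if not s["is_bot"]]
print("raw pageviews:  ", sum(s["pageviews"] for s in sessions))  # 48
print("human pageviews:", sum(s["pageviews"] for s in human))     # 8
```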

Along the same lines, providing advertisers with accurate data about your audience is a must. Advertisers win by getting data that allows for successful targeting, and publishers win by becoming a more attractive destination for brands looking to spend their limited budgets wisely.

Simply put, managing bots for an improved user experience and better ad targeting comes with a reward: happier customers and advertisers, and a healthier bottom line.

Reduced Security Risks

In today’s world, where data breaches dominate headlines, users are more skeptical than ever about the security of the websites they interact with. Online publishers that fail to manage bots increase their cybersecurity risk – including greater exposure to sophisticated cyberattacks and data breaches, not to mention the potential loss of advertiser and customer trust.

A growing bot-driven threat, for example, is the credential stuffing attack, in which hackers use bots to replay stolen login credentials and gain access to online user accounts. The trend is growing: Akamai found 55 billion credential stuffing attacks between November 2017 and March 31, 2019, a majority of which were executed by botnets or all-in-one (AIO) applications programmed to target accounts in account takeover (ATO) attacks.
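As a simple illustration of the kind of tripwire a bot management program might include, the sketch below counts failed logins per source IP in a sliding window. The window size and failure budget are assumed values for the example, not figures from the article or from Akamai.

```python
# Sliding-window tripwire for credential stuffing: count failed logins per IP.
# Window size and failure budget are assumed values for this sketch.
from collections import defaultdict, deque

WINDOW_SECONDS = 300   # assumed 5-minute window
MAX_FAILURES = 20      # assumed per-IP failure budget

failures = defaultdict(deque)

def record_failed_login(ip: str, now: float) -> bool:
    """Record a failed login; return True when the IP exceeds its budget."""
    q = failures[ip]
    q.append(now)
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()  # drop failures that fell outside the window
    return len(q) > MAX_FAILURES

# A bot replaying a stolen credential list trips the limit within seconds
blocked = False
for second in range(25):
    blocked = record_failed_login("203.0.113.7", now=float(second))
print(blocked)  # True
```

Real credential stuffing tools rotate IPs and pace their requests, so production defenses layer signals like this with fingerprinting and reputation data rather than relying on any single counter.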

Publishers investing in bot management and blocking solutions can vastly reduce their cyber risk surface for these and other bot-based attacks.

Mitigating Web Scraping

Bots can also cause significant damage from a competitive standpoint, specifically through website scraping, in which companies use bots to crawl a competitor’s website and pull valuable data, content, and other information. Competitors looking to price-match your company, for example, or to mine your value-added content for inspiration, may use bots to do so at scale.

Competitors can, of course, scan your website manually for this information, but scraping gives them a quick way to stay up to date with changes in real time. Blocking such bots won’t necessarily prevent competitors from eventually finding the information they need, but it removes the efficiency advantage that comes with scraping and reduces their ability to steal or copy valuable content virtually instantly.

Thriving in a Bot-Filled World

Bots are already prevalent in publishing, and the challenge is only likely to grow as the industry evolves. By investing further in bot management processes, publishers can improve the user experience, minimize security risks, and maintain a competitive edge in an always-changing industry. Ultimately, that means a more attractive offering for advertisers and consumers alike.

Tara Bartley, Senior Manager, Global Industry Marketing, Akamai

About Akamai: Akamai’s intelligent edge platform surrounds everything, from the enterprise to the cloud, so customers and their businesses can be fast, smart, and secure. Top brands globally rely on Akamai to help them realize competitive advantage through agile solutions that extend the power of their multi-cloud architectures.