Traffic quality serves as an umbrella term for a host of related topics: fraud, bots, URL masking and viewability. Catching crooks and cleaning up the ecosystem is a focal point for buyers and sellers in 2015 – and while it won’t be easy, the war to keep inventory clean can be won.
One scary stat to place this in perspective: a recent study found bots generated 11% of display ad impressions and 23% of video ad impressions. Altogether, advertisers could lose more than $6 billion globally to ad fraud in 2015.
That’s a number that has the industry taking action: multiple IAB task forces; the ANA and 4A’s collaborative coalition; the ANA and White Ops “Marketer’s Coalition”; comScore’s Industry Trust Initiative; and major ad tech players’ individual initiatives are all working to put an end to fraud in the industry.
In 2014 there was an increased focus on traffic quality across the industry. Google purchased spider.io largely to get access to anti-fraud tech, and AppNexus announced it would invest in a certification program to counter buyer feedback about quality. Industry organizations continued to form while third-party specialists like Integral Ad Science and Pixalate emerged to tackle quality measurement.
Pixalate recently unveiled the Global Seller Trust Index ranking the industry’s top players. For the second consecutive month, OpenX ranked number one in the critically important category of inventory quality. This recognition isn’t something that happened overnight. Several years ago, OpenX made the strategic decision to invest heavily in inventory quality. The investment and its pay-off are evident in the continued high rankings in the Global Seller Trust Index and feedback from partners that OpenX runs the industry’s “cleanest exchange”.
Not all platforms are created equal – each has its own take on how to combat (or ignore) fraud. OpenX has a deep commitment to ensuring all elements of traffic quality are being addressed. One key aspect of our traffic quality initiative is our approach to data.
Our fully integrated system uses a combination of big data and client-side techniques to identify fraudulent and suspicious activities. We aggregate huge amounts of data and look for anomalous patterns that suggest non-human behavior. As an exchange, we have the advantage of code on the publisher page at the time the request is submitted, which means we get the earliest possible look at inventory and then decide if we want to expose buyers to that inventory. This differs from others, where verification happens after the ad is delivered.
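OpenX doesn’t publish its detection logic, but the idea of mining aggregated traffic for anomalous, non-human patterns can be illustrated with a minimal sketch. The example below is purely hypothetical: it flags traffic sources whose impression volume is an extreme statistical outlier relative to the rest of the population, one simple signal (among many a real system would combine) that traffic may be automated.

```python
from collections import Counter
from statistics import mean, stdev

def flag_suspicious_sources(impressions, z_threshold=3.0):
    """Flag traffic sources whose impression volume is an extreme
    outlier versus the population -- one crude signal of possible
    non-human (bot) activity. `impressions` is an iterable of
    source identifiers, one entry per ad request."""
    counts = Counter(impressions)
    volumes = list(counts.values())
    if len(volumes) < 2:
        return set()  # not enough data to establish a baseline
    mu, sigma = mean(volumes), stdev(volumes)
    if sigma == 0:
        return set()  # all sources look identical; nothing stands out
    # A source whose volume sits more than z_threshold standard
    # deviations above the mean is flagged for review/filtering.
    return {src for src, n in counts.items()
            if (n - mu) / sigma > z_threshold}

# Twenty ordinary sources at ~10 impressions each, plus one source
# generating 1,000 impressions -- the outlier gets flagged.
traffic = [f"site-{i}" for i in range(20) for _ in range(10)]
traffic += ["suspect-source"] * 1000
print(flag_suspicious_sources(traffic))  # → {'suspect-source'}
```

A production system would of course weigh many more signals (user-agent consistency, interaction timing, viewability), but the principle is the same: build a statistical baseline from aggregate data, then filter requests that deviate from it before buyers ever see the inventory.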
Major publishers are increasingly being pressured by advertisers to put safeguards in place to reduce fraud and robotic traffic. Although the buy-side continues to invest in protecting itself against ad fraud, it counts on publishers to be equally vigilant about non-human traffic and in-view rates, two crucial pieces of information for pricing inventory appropriately.
Publishers must ensure that their exchange and SSP partners share the same standards for traffic quality and are making the necessary investments to prevent fraud. They should also work with trusted third-party vendors to monitor inventory quality and set up real-time filters.
For more about OpenX’s leading traffic quality initiatives, check out OpenX CEO Tim Cadogan’s discussion with The Wall Street Journal.