
How To Fix Media Asset Standards So They Don't Suck

Standardization can seem like a technical topic, but it is simply the domain model: the rules that 'govern' the domain. In the past, standardization seemed difficult because the individuality of media assets created the illusion that a standardized domain model for media assets was impossible.

Some think that applying a standard to media would mean forcing a set of descriptions that everything must fit within, the way the IAB created a set of context standards that seek to capture all contexts. But what if the standard is not about restricting which descriptions are acceptable, but rather about standardizing the way descriptions themselves are created? In other words, any type of descriptor can be added to a media asset, so long as it is done in a way the system can understand. Like Mendeleev’s periodic table, the system has a place for newly discovered elements, even ones no one knew existed.

In chemistry, 19th-century researchers faced the same problem: ‘We have all these elements, but how do we standardize our understanding of their respective relationships?’ Solving it required a new domain model to bring rational order. While it may have seemed impossible before the problem was solved, afterwards the solution seemed completely obvious. Dmitri Mendeleev solved it in 1869. His genius was in building a model that simultaneously defined how the elements are similar to each other and how they differ from each other, in more than one way.

Like Mendeleev’s table, media solutions need to conform to a standard that defines how we order the information in the domain, not one that limits what information the domain can contain. On a side note, some of the marketers making periodic tables of marketing are doing it wrong. They don't actually understand why the periodic table is so brilliant.

Most folks in media we have talked to question the ability to standardize media assets because they can only imagine domain models based on similarity or difference. The beauty of Mendeleev’s arrangement is that the relationship between an element and its neighbors to the left and right is always the same. The relationship to the elements above and below it is also always the same. In other words, the elements are arranged such that they show how they differ from each other by showing how they are like each other. Media standards need the same type of domain model.

[Figure: Periodic Table trends]

To build this domain model, we have to understand the relationship between two different media assets (two ‘atoms’) and figure out where they belong in the domain relative to each other (their ‘location’ on the ‘periodic table’). Most media technology was not built to handle transactions, so it did not need to standardize the domain model for media assets; it standardized the communication of transaction requests, e.g., the RTB protocol. For the exchanges and other technologies that were designed for managing transactions, that part of the system was simply ostriched. Why? Because the second-price auction does not need to know what is being auctioned to be successful; it simply manages bids and price floors. This was a design feature, not a bug. Since second-price auctions are used to sell an impression in real time, it doesn’t much matter what the impression is; it only matters who will pay the highest price.
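
To see why that design works, here is a minimal Python sketch (with made-up bidder names) of a second-price auction: it clears on nothing but bids and a price floor, and the impression itself never appears in the logic.

```python
def second_price_auction(bids, floor):
    """bids: list of (bidder_id, bid_cpm); floor: seller's price floor.
    Returns (winner, clearing_cpm), or None if no bid clears the floor."""
    eligible = sorted((b for b in bids if b[1] >= floor),
                      key=lambda b: b[1], reverse=True)
    if not eligible:
        return None
    winner, _ = eligible[0]
    # Winner pays the greater of the floor and the second-highest bid.
    runner_up = eligible[1][1] if len(eligible) > 1 else floor
    return winner, max(floor, runner_up)

# Made-up bidders; the function never inspects what is being auctioned.
print(second_price_auction([("dsp_a", 4.50), ("dsp_b", 3.25)], floor=2.00))
# -> ('dsp_a', 3.25)
```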

When we want to trade avails (read: media futures), we and the auction itself need to understand what is being bought and sold. Since there is so much uniqueness in the domain, we decided to reverse the perspective: instead of trying to define what makes one piece of inventory unique, we define how it is different from everything else. We can work by inclusion or by exclusion; the results are the same. In technical terms, each piece of information describing a media asset is a vector, and each media asset is a collection of vectors.
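
As a rough illustration of that idea, here is a toy sketch in which each descriptor is a labeled dimension and an asset is just the collection of them. The descriptor names and values are hypothetical, not a real schema.

```python
# Toy sketch: an asset is a collection of descriptors (the 'vectors').
# Descriptor names and values below are hypothetical illustrations.
asset_a = {
    "placement": "300x250",
    "context": "automotive",
    "segment": "in_market_auto",
    "viewability": "measured_0.7_plus",
}
asset_b = {
    "placement": "300x250",
    "context": "sports",
    "segment": "in_market_auto",
    "viewability": "measured_0.7_plus",
}

def differs_on(a, b):
    """Describe an asset by how it differs from another: the set of
    descriptor dimensions on which the two assets disagree."""
    return {d for d in set(a) | set(b) if a.get(d) != b.get(d)}

print(differs_on(asset_a, asset_b))  # {'context'}: alike everywhere else
```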

In practical terms, the ‘standard’ is a way of ordering the information submitted to the system. This means the system incentivizes conformity without demanding it. For example, there can be two competing methods of defining context, but both buyers and sellers have strong incentives to choose whichever is best for the market. So, if two competing standardization systems will yield the best outcomes, the domain must support both; if one standard will yield the best outcomes, the domain must support that as well. The standards create an incentive to find the optimal solution; the domain does not define the optimal solution.


PII Issues Will Never Go Away With Real-Time Bidding

[Figure: Houston, PII has a problem]

PII issues have long been a point of discussion among us all. In all that talking and discussing, we never uncovered the root cause of why PII issues are such a dominant force in the current real-time bidding market architecture. I propose that another look at the problem reveals that it is a direct outcome of the market architecture and not a side effect of some other economic inefficiency. The current market architecture in real-time bidding is a ‘call-and-response’ system: one side, the seller, calls, and the other side, the buyer, responds. This means the entire market is dominated by the way sellers define their supply. In simple terms, if no one is selling what I am specifically looking to buy in the market, how do I market my demand?

This means sellers need to express their supply so that buyers will bid. Economics teaches us that in this situation, the seller is best served by providing as much information as possible about each impression, so that the maximum number of potential buyers is reached. In other words, there is an economic incentive to say as much as possible about the impression.

The problem with this market architecture is that sellers can’t search for buyers before the inventory shows up. If a seller could search for demand and elect to meet some of that demand with their supply, the only information transferred during the transaction would be that this audience member and ad placement unit meet the criteria of the buy order. So, if a seller never meets demand that violates PII standards, all transactions will be free of PII issues. In a market structure where demand can be transparent, the incentive is to share as little information as possible, the opposite of the incentive created by the real-time market architecture.
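
A hedged sketch of what that check could look like, with hypothetical field names: the seller evaluates the impression against a standing buy order locally, and the only thing that crosses the wire is whether it matched.

```python
# Hypothetical buy order the seller has discovered and chosen to meet.
buy_order = {
    "segments": {"in_market_auto"},
    "placements": {"300x250"},
    "min_viewability": 0.6,
}

def meets_criteria(impression, order):
    """Evaluated on the seller's side; returns only True or False.
    No user-level attributes ever leave the seller's system."""
    return (
        impression["segment"] in order["segments"]
        and impression["placement"] in order["placements"]
        and impression["viewability"] >= order["min_viewability"]
    )

imp = {"segment": "in_market_auto", "placement": "300x250", "viewability": 0.7}
print(meets_criteria(imp, buy_order))  # True; this boolean is all that is shared
```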

Because the real-time market architecture segregates supply and demand onto the ‘call’ side and the ‘response’ side, the market defines itself as asymmetric. For some inventory acquisition strategies, like retargeting, this is a great, probably even optimal, market structure. But for big brands, this asymmetry is bad. We all know that they buy huge swaths of audiences across all media to build their brands. For these buyers, the real size of the transactions ($) they want to make is not accounted for in the second-price auction. That auction does not know or care that you have a $25K budget for this line item.

This is a problem. By leaving this demand out of the price calculations we are effectively only looking at the tips of icebergs; and we still have tons of PII issues. If you are a marketer or a publisher navigating your boat through these treacherous waters, no wonder you’re fed up.

The #1 Way To Improve Yield Optimization, Thanks Dr. Laffer

Most yield systems 'think' about a graph of revenue versus impressions and see a line going up from left to right. The idea is that every single impression has the opportunity to generate income.

This view can only be true if each additional sellable impression reaches a real person who gives just as much attention to each new ad impression. With the exception of a microscopic minority of cases, this is impossible. We all know that overloading users with ads means they wind up ignoring all of the ads.

In reality, ethical publishers know full well that jamming pages full of ads, and playing games to inflate page loads, doesn’t last. Increasing supply undermines future pricing power and increases the opportunity for others to arbitrage the publisher.

In part, yield optimization is to blame, but the real culprit is the second-price auction. The incentive to inflate supply is baked into that auction method, and it can’t be removed.

Let’s take another view. If we apply what we know, the publisher’s revenue curve actually shows that if we keep pumping up impression production artificially, the total revenue we can generate through the market goes down.

[Figure: Graph 2]

So, the real question is: what is the minimum number of impressions that will maximize the publisher’s revenue? The curve illustrates that if there are zero ad impressions, there is obviously no revenue to the publisher. But a publisher with no content and just ads won’t generate revenue either, as there is no longer any incentive for a person to consume that publisher’s media.

If you’re a bit of a policy nerd like me, that sounds like a theory that became legend on a napkin: Arthur Laffer reportedly sketched the curve on a napkin in 1974 to illustrate his argument to Ford administration officials Dick Cheney and Donald Rumsfeld.[1] This is the legend of the Laffer Curve, an idea that made a big impact on tax policy throughout the 1980s.

To answer the revenue maximization question above, we have to figure out the shape of the curve and how close we are to the top. Sounds simple enough, right? Now, can you think of any technologies in advertising that do that? Is there a yield optimization system that does?
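
For illustration only, here is one naive way to attack the question: fit a concave curve to observed (impressions, revenue) pairs and locate its peak. The numbers are invented, and a real yield system would need something far more robust, but the shape of the exercise is the point.

```python
import numpy as np

# Invented observations of monthly (impressions, revenue) for a publisher.
impressions = np.array([0, 1e6, 2e6, 3e6, 4e6, 5e6])
revenue = np.array([0, 40e3, 65e3, 75e3, 70e3, 55e3])

# Fit revenue ~ a*x^2 + b*x + c; a concave (Laffer-like) fit has a < 0.
a, b, c = np.polyfit(impressions, revenue, deg=2)

# The vertex of the parabola estimates the revenue-maximizing volume.
peak = -b / (2 * a)
print(f"Estimated revenue-maximizing volume: {peak:,.0f} impressions")
```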

If you are managing media inventory today, can you tell me what dot on the curve above best represents your organization? If you can’t, your organization is managing the world through the lens of the line graph and not the curve.

[1] https://en.wikipedia.org/wiki/Laffer_curve

The Untapped Potential Of All This Data

Originally published in AdExchanger.

We’ve all marveled at the new technology solutions entering the programmatic marketing and advertising ecosystems, along with the vast quantities of data produced. Everywhere a business problem could be quantified, in terms of something that can be counted or measured, a solution has sprung up, specializing in things like behavioral modeling, viewability and attribution.

I believe the untapped potential of all this data is to predict the future. If we take the 100,000-foot view across both marketing technology and advertising technology, we find a very interesting pattern: the land of mar tech and ad tech data is divided into three main areas, but one of them is populated with almost zero technology.

Some technologies can be found in the area I call “The Past.” There are lots of technologies that are in “The Present.” There are hardly any in “The Future” – there are very few forecasting technologies.

I predict the third generation of advanced mar tech and ad tech solutions will focus on telling publishers and marketers what will probably happen in the future. Predicting the future is an exercise of understanding how past data about what happened, when it happened and why it happened can be used as the colors to paint a picture of the future.

The Past: What Happened?

Today’s technology for data capture and storage is like a 100-megapixel camera – it provides a super accurate picture of what happened in the past. We have 100% certainty that what we measured happened. It’s not like there are other possible outcomes in the past.

Data storage solutions provide this vast understanding of anything that we choose to measure. If a data point was created and saved, it can exist forever. The evolution of this area of technology is focused on expanding what data is measured and captured. The ever-growing sea of data is a beautiful sight to behold for the analytical among us.

The Present: What’s Happening?

The last massive wave of technology innovation in mar tech and ad tech happened in this category, which focuses on collecting and disseminating data about the present. The picture this technology provides is less sharp than the picture of the past. To understand the present, we need to pull a lot of data together really fast so we can act on it.

For data about the past, the effectiveness of analytics is not limited by the time it takes to bring the data together. The present offers no such luxury, which is why it is a little less sharp: we don’t have time to look at all the data together.

What’s more, as the sea of data being collected grows, the amount we can actually act on becomes an ever-decreasing portion of what we actually have. It’s more like a 20-megapixel camera.

Technologies that work to understand what is happening and take action include yield management, creative optimization and supply-side platforms. This area of technology is evolving with a focus on the expansion of delivering and processing an ever-growing data set to answer a question in less than a second.

The Future: What’s Going To Happen?

In this category of mar tech and ad tech, the fewest solutions exist. There are no companies on the LUMAscape dedicated to forecasting; I only know of one startup. Forecasting features in current technologies are treated the way municipal politicians treat sewage infrastructure: Nobody wants to talk about it, it’s hard and dirty work, but no one can live without it.

The future will never be as clear as the present or the past, but in this space of mar tech and ad tech, innovation and investment have significantly lagged the market. Predicting the future is hard. It’s never like the past; it’s fuzzy and out of focus. Our current tools for predicting the future are, at best, like a 0.5-megapixel camera. It’s really hard to tell what will happen.

This is where a ton of untapped potential exists. Leveraging all this data being collected everywhere to build better modeling tools will help bring the future into focus. No one can predict when this market shift will gain significant traction, but I think we will see the future as an increasingly important topic of conversation for industry innovation and thought leadership in the next few years.
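
As a token of how simple even a first step can be, here is a deliberately naive sketch that extrapolates a linear trend from past daily impressions. The data is made up, and real forecasting tools would go far beyond this.

```python
import numpy as np

# Made-up history of daily impressions for one placement.
history = np.array([980, 1010, 1045, 1030, 1080, 1110, 1140])
days = np.arange(len(history))

# Fit a linear trend to the past, then project it forward 7 days.
slope, intercept = np.polyfit(days, history, deg=1)
future_days = np.arange(len(days), len(days) + 7)
forecast = slope * future_days + intercept
print(np.round(forecast))
```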

Picking The Programmatic 3.0 Marketplace That Will Make You A Winner

On today’s cutting edge of media trading, "programmatic 3.0," are a number of solutions that allow buyers and sellers to make a deal now for inventory that will be delivered in the future. This new technology segment has emerged in the already crowded field of programmatic solutions. The challenge for publishers and media buyers has been to distinguish among the approaches vendors are bringing to the table to support programmatic deals for inventory delivered in the future. Because these approaches all seek to address buyers’ and sellers’ pain points, they all present very similar value propositions. In reality, they are very different.

The three approaches that have emerged in the market so far are the marketplace for traditional avails, real-time statistical arbitrage, and biddable impression futures. If we simply look at the value propositions of each, they seem nearly indistinguishable. If we dig deeper into what they do and how they work, the differences become clearer.

A marketplace for traditional avails

This approach is currently the most common and has been adopted by some of the big players that pioneered the real-time space. It allows a seller to expose products from their ad server to a marketplace with a “buy it now” price. It focuses on automating the trafficking of media buys and making the media that has been sold direct since the dawn of digital media discoverable in a marketplace. This is the approach used by the likes of Rubicon (iSocket/ShinyAds), AppNexus (Twixt/Yieldex Marketplace), and AdSlot.

Real-time statistical arbitrage

In this approach, the media is not bought from the publisher, but rather from an intermediary that takes on the risk of promising to sell the inventory at a fixed price after buying the impressions at a variable price through an auction. The approach centers on technology that can forecast what will probably be available in the real-time environment and its estimated auction clearing price. This approach is used by the likes of Media Gamma and was attempted by MetaMarkets and Media Crossing prior to their pivots.
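
The economics of the approach can be sketched in a few lines, with invented numbers: the intermediary’s margin is the fixed resale price minus the auction clearing price it forecasts, and forecast error is the risk it carries.

```python
fixed_sell_cpm = 6.00         # price promised to the end buyer (invented)
forecast_clearing_cpm = 4.20  # model's expected auction price (invented)
forecast_error_cpm = 0.90     # uncertainty the intermediary carries as risk

expected_margin = fixed_sell_cpm - forecast_clearing_cpm
worst_case_margin = fixed_sell_cpm - (forecast_clearing_cpm + forecast_error_cpm)
print(f"{expected_margin:.2f} {worst_case_margin:.2f}")  # 1.80 0.90 per CPM
```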

Biddable impression futures

This approach focuses on allowing buyers and sellers to agree to transact media where all the impressions meet a specified set of criteria, which can include a publisher’s product, first- or third-party segments, context, viewability, or any other criteria the counterparties agree to. This environment is an order management layer that abstracts supply and demand into a separate technology layer to optimize the way supply and demand are presented, priced, and matched. It does not handle actual impressions or bid requests the way the real-time environment does. This is our approach at MASS Exchange.
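
A hedged sketch of what such an order could look like and how it clears, with hypothetical attribute names (an illustration, not MASS Exchange’s actual schema):

```python
# A futures-style sell order: attribute criteria plus an asking price.
sell_order = {
    "criteria": {
        "placement": "300x250",
        "section": "automotive",
        "segment": "in_market_auto",
        "min_viewability": 0.6,
    },
    "ask_cpm": 12.00,
    "impressions": 500_000,
}

def try_clear(bid_cpm, wanted, order):
    """Clear only if the criteria match and the bid meets the ask."""
    if wanted != order["criteria"] or bid_cpm < order["ask_cpm"]:
        return None
    return {"cpm": order["ask_cpm"], "impressions": order["impressions"]}

print(try_clear(12.50, sell_order["criteria"], sell_order))
# Clears at the ask (12.0); a bid below 12.00 would return None.
```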

In the wild

Let’s look at a concrete example across all the approaches. A publisher is asked to sell a 300x250 unit on the landing page of their automotive section, targeted at males, 18-35, in-market.

In the marketplace for traditional avails, a publisher must manually create the product in the ad server so that it appears in the marketplace via the marketplace’s API integration with the ad server. Selling targeted impressions, rather than inventory that merely over-indexes on an audience, is possible, but the efficiency of the marketplace is quickly outweighed by the massive manual process required to set it up. Further, none of the current marketplaces for traditional avails are auction based. So, it’s like Amazon for traditional avails.

Using the real-time statistical arbitrage approach, the vendor targets an audience in an open market or may be able to acquire it through a private market, but the negotiation and pricing of the deal between the vendor and the end buyer are handled manually, like any other traditional direct media buy. Further, scaling this approach to buy inventory from specified publishers means that yet another technology is inserted into the cost structure of the media, and it requires the vendor to have PMP deals with each seller so that a specified publisher’s inventory can be resold.

In the biddable impression futures environment, all of the combinations of audience, placement, and viewability attributes that a seller wishes to expose to the market are discoverable and priced with an asking price. In this environment, avails are biddable and transacted through an auction that only clears if the buyer is willing to pay the seller’s asking price. This approach scales wonderfully, as inventory definitions, pricing, availability, negotiation, and trafficking can all be automated. It provides tools that can scale across all parts of the transaction process, from start to finish.