Recently, there has been a lot of discussion of what Real Time Bidding (RTB) is, but relatively little discussion of its implications – outside of the obvious, obviously.
Now that everyone knows what it is, what does that mean?
Here is my take: In a world of RTB, there isn’t much need for publisher-side optimization technology. You simply let everyone bid on every impression, capture the maximum value of each one, and move on with your life.
Companies like Rubicon and Pubmatic have focused on helping publishers squeeze value out of their daisy chain via optimization technology that re-orders or realigns the chain more rapidly and efficiently than a human could. What is a daisy chain? It is industry parlance for the ordered sequence in which ad networks are offered the chance to fill a user’s impressions. For example, I might construct a daisy chain like this:
- Network A is willing to pay $0.50 for first 3 impressions of a user
- Network A is willing to pay $0.25 for impressions 4-6 of a user
- Network A is willing to pay a 50% rev share for all impressions thereafter
- Network B is willing to pay $0.40 for impressions 1-5 of a user
- Network B is willing to pay a 60% rev share for all impressions thereafter
Networks typically like to extend reach, so they pay a premium to see a user the first few times; after that, payments drop sharply. Note that each network counts impressions from its own first look at the user, not from the user’s first pageview overall. Building this daisy chain is straightforward to start:
- Network A gets impressions 1-3 of a user
- Network B gets impressions 4-8 of a user
- Network A gets impressions 9-11 of a user
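To make the mechanics concrete, here is a toy sketch (my own illustration, not any vendor’s actual code) of that allocation. The tier prices come from the example above; the key detail is that each network’s frequency counter only advances when that network actually sees the user, which is why Network B’s “impressions 1-5” land on overall impressions 4-8.

```python
# Toy daisy-chain allocator. Each network's tiered pricing is keyed to its
# OWN view count of the user, so a network's counter starts the first time
# it actually sees that user.

tiers = {
    "Network A": [(3, 0.50), (3, 0.25)],  # first 3 views at $0.50, next 3 at $0.25
    "Network B": [(5, 0.40)],             # first 5 views at $0.40
}

def tier_price(network, views_seen):
    """Price a network would pay for its (views_seen + 1)-th view of the user."""
    for count, price in tiers[network]:
        if views_seen < count:
            return price
        views_seen -= count
    return None  # past all fixed tiers; only a rev share would remain

def allocate(total_impressions):
    views = {name: 0 for name in tiers}
    schedule = []
    for _ in range(total_impressions):
        # Offer the impression to whichever network pays the most right now.
        best = max(tiers, key=lambda n: tier_price(n, views[n]) or 0.0)
        price = tier_price(best, views[best])
        if price is None:
            break  # every network has exhausted its fixed-price tiers
        schedule.append((best, price))
        views[best] += 1
    return schedule

print(allocate(11))
```

Running this reproduces the schedule above: Network A wins overall impressions 1-3 at $0.50, Network B wins 4-8 at $0.40, and Network A comes back for 9-11 at $0.25.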
But what about the rev shares? This is where it gets tricky. Typically, the publisher-side optimizer looks at historical payout performance to estimate the real-time value of the rev share and insert the network at the appropriate spot in the daisy chain. As its RPM rises, so does its position in the chain; if its RPM falls, the network slowly slides toward the bottom, winning fewer and fewer impressions.
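A minimal sketch of that estimation step, under my own simplifying assumptions (trailing payout history as a flat average; real optimizers weight recency, inventory segments, and more):

```python
# Illustrative only: convert a rev-share network's recent payouts into an
# effective CPM so it can be ranked against fixed-price networks.

def effective_cpm(payout_history):
    """payout_history: list of (dollars_paid, impressions_served) periods."""
    dollars = sum(paid for paid, _ in payout_history)
    imps = sum(served for _, served in payout_history)
    return 1000.0 * dollars / imps if imps else 0.0

def rank_chain(fixed_prices, rev_share_networks):
    """fixed_prices: {name: cpm}. rev_share_networks: {name: payout_history}.
    Returns the daisy chain ordered by estimated value, highest first."""
    estimates = dict(fixed_prices)
    for name, history in rev_share_networks.items():
        estimates[name] = effective_cpm(history)
    return sorted(estimates, key=estimates.get, reverse=True)

chain = rank_chain(
    {"Network A": 0.50, "Network B": 0.40},
    {"Rev-share net": [(12.0, 40000), (9.0, 35000)]},  # $21 over 75k imps ≈ $0.28
)
print(chain)
```

If the rev-share network’s trailing RPM climbed above $0.40, the next re-ranking would slot it ahead of Network B; if it decayed, it would sink further down the chain.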
Unfortunately, I have always found that this approach has a key shortcoming: absent cookie data, you can never really know how a piece of inventory and/or user is valued. For example, I will tell you right now that Ad.com probably isn’t interested in your high-frequency, edgy, user-generated content. It is a fact. You will get a couple of shekels, not much more. UNLESS… that user has certain historical behavioral or interest-based attributes that our advertiser base finds attractive. Then we will pay a ton. For every single impression of that user. A ton. But the only way to know is to decode our cookie, which you can’t do, Pubmaticon, because browsers won’t let your domain look at it.
Now, they might claim: if every network they work with simply shared cookie information (by piggybacking cookies or some such) and shared bidding information as well, they could reconstruct a massive model that predicts these things. That is true in theory, but the model would still be wrong. Advertisers have things like frequency caps and budgets that artificially constrain campaigns in real time. Rubmatic might counter that if that information were shared in real time too, they could schedule around it, but the real takeaway is that you should simply bid out every impression, let everyone sniff the cookie and bid, and be done. Once you go there, you do not optimize your daisy chain; the daisy chain simply goes away.
Who needs to optimize a daisy chain if, for every impression, you learn the exact amount every network is willing to bid? You just sell to the highest bidder and move on.
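That is the whole point: once bids arrive per impression, the publisher-side logic collapses to a single comparison. A hypothetical sketch (the bid values, including Ad.com’s cookie-informed bid, are made up for illustration):

```python
# With RTB, selling an impression is just "take the highest bid" —
# no daisy chain, no historical rev-share modeling.

def sell(bids):
    """bids: {network_name: cpm_bid} gathered in real time for one impression."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

# Ad.com recognizes its cookie on this user and bids far above the
# fixed daisy-chain prices it would otherwise have paid.
print(sell({"Network A": 0.25, "Network B": 0.40, "Ad.com": 2.10}))
```

Note this sketch clears at the winner’s own price for simplicity; real exchanges often run second-price auctions, but the publisher-side conclusion is the same.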
Now, Rubicon has raised $33 million, and Pubmatic has raised a $7 million Series A plus an “undisclosed” Series B, so they are counting on exits in the hundreds of millions of dollars. To get there, they will probably transition to more of a network/exchange/ad-space-aggregation model, leveraging their publisher relationships. The publisher-side optimization opportunity will be short-lived.