2. 2014 Budget Review
Presentation agenda
1) The importance of cross-device tracking
2) Our mission as marketers
3) ROI components
a) Spend
b) Variable contribution
4) Algorithmic attribution
a) DIY
b) Evaluating vendors
5) Take-aways
jonathan.isernhagen@wyn.com @jon_isernhagen
3. 2014 Budget Review
Device ubiquity
>60% of US adults use 2 devices/day
>20% use 3 devices
>40% jump devices during one activity.
https://www.facebook.com/business/news/Finding-simplicity-in-a-multi-device-world
>53% of those with 2 devices jump between them
>77% of those with 3 devices jump among them
22% land on a tablet, 58% land on a laptop
4. 2014 Budget Review
Cross-device tracking (the importance of)
Definition: “…the myriad ways platforms,
publishers and ad tech companies try to identify
Internet users across smartphones, tablets and
desktop computers.”
Important because:
1) Gives visibility to devices’ roles in purchase path
2) “retargeting on mobile is impossible without it.”
3) Algorithmic attribution is inaccurate without it.
John McDermott, http://digiday.com/platforms/wtf-cross-device-tracking/
5. 2014 Budget Review
Presentation agenda
1) The importance of cross-device tracking
2) Our mission as marketers
3) ROI components
a) Spend
b) Variable contribution
4) Algorithmic attribution
a) DIY
b) Evaluating vendors
5) Take-aways
6. 2014 Budget Review
CEO’s/our duty to the (publicly-traded) company
Maximize which one of the following metrics?
• Brand awareness / sentiment?
• Client loyalty?
• Employee satisfaction?
• Traffic to the site?
• Shopper movement down the funnel?
• Transaction volumes?
• Shareholder value?
• Customer focus / personalization?
• Community involvement through charitable actions?
7. 2014 Budget Review
Example: Apple, Inc.
“Apple's Board of Directors oversees the Chief
Executive Officer and other senior management in the
competent and ethical operation of Apple on a day-to-
day basis and assures that the long-term interests of
shareholders are being served."
Source: http://investor.apple.com/governance.cfm
8. 2014 Budget Review
Increase spending until $1 out brings $1 back…
….spend-spend-spend-spend-spend-stop
9. 2014 Budget Review
…which is the point at which incremental ROI = 0%
ROI = (VCM – Spend) / Spend
• ROI (Return on Investment): indicator of investment profitability; positive = good.
• VCM (Variable Contribution Margin): the amount of profit driven by a given transaction.
• Spend (Channel Spend): the amount spent driving traffic to the site during the period in question.
Calculated over a specified time period of investment and return.
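As a quick sketch, the formula translates directly into code (the channel figures below are hypothetical, for illustration only):

```python
def roi(vcm, spend):
    """Return on Investment: (VCM - Spend) / Spend, over a matched period."""
    if spend == 0:
        raise ValueError("spend must be non-zero")
    return (vcm - spend) / spend

# Hypothetical channel figures for one month
print(roi(vcm=162_000, spend=100_000))  # 0.62, i.e. 62% ROI
```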
10. 2014 Budget Review
Our goal: an ROI Dashboard
Channel* Desktop Tablet Mobile
Brand 20%
Brand SEM 62% 51% 38%
Display -5% -12% -7%
Display - Retargeting 26% 25% 29%
Email 250%
Meta search 18% 22% 10%
Non-brand SEM -30% -18% -40%
SEO 500% 520% 390%
Social media -5% -15% 15%
*of impression/click, not necessarily of consumer conversion
11. 2014 Budget Review
Presentation agenda
1) The importance of cross-device tracking
2) Our mission as marketers
3) ROI components
a) Spend
b) Variable contribution
4) Algorithmic attribution
a) DIY
b) Evaluating vendors
5) Take-aways
12. 2014 Budget Review
ROI = (VCM – Spend) / Spend

Which spend do you include?
14. 2014 Budget Review
How changes in spend can mess up ROI
[Chart: monthly spend for Jan, Feb and Mar ($ vs. time), with each month's profit impact arriving after the month in which the spend occurred]
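The chart's point can be shown numerically. In this sketch (all parameters invented), each dollar of spend really returns $1.50 of VCM, but half of it lands a month late, so a spend cut makes that month's naive same-month ROI look artificially good:

```python
def naive_monthly_roi(spend, carryover=0.5, true_roi=0.5):
    """Each $1 of spend actually returns (1 + true_roi) dollars of VCM,
    but `carryover` of that VCM doesn't land until the following month.
    Returns the naive same-month ROI for each month."""
    rois, tail = [], 0.0
    for s in spend:
        vcm_total = s * (1 + true_roi)
        observed = tail + (1 - carryover) * vcm_total  # what this month "sees"
        tail = carryover * vcm_total                   # spills into next month
        rois.append((observed - s) / s)
    return rois

print(naive_monthly_roi([100, 100, 100]))  # steady spend: ROI settles at 0.5
print(naive_monthly_roi([100, 100, 50]))   # March cut: March ROI looks inflated
```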
15. 2014 Budget Review
Variable Contribution Margin (“VCM”)
ROI = (VCM – Spend) / Spend
1) What is the profit from each transaction?
2) Which channels deserve part/all of the
credit for driving each transaction?
16. 2014 Budget Review
VCM: the profit on each transaction
= Transaction revenue - variable non-marketing expenses:
• Revenue:
– Supplier Commissions;
– GDS incentives;
– Overrides (lumpy: an average per booking must be assumed)
– Media (not transaction-driven, but has to be modeled in
somewhere)
– Attached bookings / Lifetime value: try to gauge value without
double-counting
• Expenses:
– Website hosting/capacity costs
– Data processing expenses
– Other expenses which vary by transaction or site activity volume
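Arithmetically, VCM is just the sum of the revenue components minus the variable expenses. A minimal sketch (all per-booking figures below are hypothetical):

```python
def vcm(revenue_items, variable_expenses):
    """Variable Contribution Margin: transaction revenue minus
    variable non-marketing expenses."""
    return sum(revenue_items.values()) - sum(variable_expenses.values())

booking_revenue = {                     # hypothetical per-booking figures
    "supplier_commission": 90.0,
    "gds_incentive": 12.0,
    "override_avg_per_booking": 6.0,    # lumpy: modeled as an average
    "attached_bookings_ltv": 15.0,      # careful not to double-count
}
booking_expenses = {
    "hosting_capacity": 2.0,
    "data_processing": 1.5,
}
print(vcm(booking_revenue, booking_expenses))  # 119.5
```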
17. 2014 Budget Review
Which treatment(s) triggered the purchase?
"Half the money I spend
on advertising is wasted;
the trouble is,
I don't know which half."
-John Wanamaker
Father of Modern Advertising
18. 2014 Budget Review
Presentation agenda
1) The importance of cross-device tracking
2) Our mission as marketers
3) ROI components
a) Spend
b) Variable contribution
4) Algorithmic attribution
a) DIY
b) Evaluating vendors
5) Take-aways
19. 2014 Budget Review
How badly do you want to know?
Boyd: What are you packing?
Raylan: You'll pay to find that out.
20. 2014 Budget Review
Choosing your attribution strategy
Do you want/need true attribution*?
  No → use your site metric solution's attribution
  Yes ↓
Can you access your data?
  No → use your site metric solution's attribution
  Yes ↓
Do you have data guys, and do they have bandwidth for this?
  No → hire a vendor
  Yes → DIY

*Do you have:
1) Large enough budget?
2) Multiple channels?
3) Belief in ROI "knowability?"
21. 2014 Budget Review
Collecting the necessary data
Pull everything into an analysis space (e.g. SAS, Revolution Analytics, SPSS, Teradata Warehouse Miner) from:
• Email service provider: sends, opens
• Site metrics tool: clicks, visitors, media impressions
• Back office system: transactions, VCM
• Call Center IVR: channel-specific phone #s
• Display ad server: impressions, clicks, spend
• SEM bid management tool: impressions, clicks, spend
• Television plan: GRPs, spend
• Meta search feeds: impressions, clicks, spend
• Manual spend entry table: spend
22. 2014 Budget Review
Connecting the necessary data
[Data model sketch; ∞:1 = many-to-one]
• Clicks ∞:1 Sessions ∞:1 Profiles ∞:1 Customers
• Clicks ∞:1 Transactions (via a Clicks:Transactions link)
• Calls (IVR), Sends and Opens (email) ∞:1 Profiles
• Impressions (ad server) and GRPs (TV plan) join in at the aggregate level
Systems of record: site monitoring tool, back office, CRM system
23. 2014 Budget Review
Profile deduplication is crucial
[Profiles ∞:1 Customers, maintained in the CRM system]
1) App registration information tied to desktop profile
2) Logged-in customer information on multiple devices
3) E-mails sent to same address opened on multiple devices
4) Third-party services with network visibility
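Methods 1–3 all amount to merging device profiles that turn out to share a hard identifier (a login, an email address, an app registration). A minimal union-find sketch of that merge step (profile IDs and identifiers below are made up):

```python
class ProfileMerger:
    """Union-find over device profiles; any two profiles observed with
    the same identifier (login, email, app registration) get merged."""
    def __init__(self):
        self.parent = {}          # profile -> parent profile
        self.by_identifier = {}   # identifier -> a profile seen with it

    def find(self, p):
        self.parent.setdefault(p, p)
        while self.parent[p] != p:
            self.parent[p] = self.parent[self.parent[p]]  # path halving
            p = self.parent[p]
        return p

    def observe(self, profile, identifier):
        root = self.find(profile)
        if identifier in self.by_identifier:
            other = self.find(self.by_identifier[identifier])
            self.parent[other] = root  # merge the two device profiles
        self.by_identifier[identifier] = root

m = ProfileMerger()
m.observe("desktop-cookie-1", "login:jsmith")   # logged in on desktop
m.observe("tablet-cookie-7", "email:js@x.com")  # opened email on tablet
m.observe("tablet-cookie-7", "login:jsmith")    # same login seen on tablet
print(m.find("desktop-cookie-1") == m.find("tablet-cookie-7"))  # True
```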
24. 2014 Budget Review
Impressions are also important
This example overcredits impressions (excludes other channels)
but it gives some idea of the sub-surface portion of the
impressions iceberg.
26. 2014 Budget Review
Criteria for attribution vendor evaluation
1) Independence / media neutrality
2) Independent data collection
3) Cross-device natively
4) Brand search & affiliate conversion controls
5) Programmatic capability
6) Ad viewability
7) Time to onboard
8) Cost
27. 2014 Budget Review
Independence – Media agnostic
Vendor Independent Score
Abakus Yes 1
AOL/Convertro No 0
C3 Metrics Yes 1
eBay/Clearsaleing No 0
Adometry/Google No 0
Marketing Evolution Yes 1
Marketshare Yes 1
Rakuten DC Storm No 0
Visual IQ Yes 1
Vendors owned by media companies are not considered neutral and are scored with a value of 0. Independent vendors are scored with a value of 1.
28. 2014 Budget Review
Independent data collection
Vendor Collect own data Score
Abakus No 0
AOL/Convertro Yes 1
C3 Metrics Yes 1
eBay/Clearsaleing Yes 1
Adometry/Google Yes 1
Marketing Evolution No 0
Marketshare No 0
Rakuten DC Storm Yes 1
Visual IQ No 0
Vendors which collect their own data via tags are scored
with the value of 1. Vendors which do not collect data
are scored with a 0.
29. 2014 Budget Review
Cross-device visibility native to the platform
Vendor Cross-device visibility Score
Abakus No 0
AOL/Convertro Yes 1
C3 Metrics Yes 1
eBay/Clearsaleing No 0
Adometry/Google No 1*
Marketing Evolution No 0
Marketshare No 0
Rakuten DC Storm No 0
Visual IQ No 0
Vendors which provide cross-device tracking natively, for no additional fee, are scored with a value of 1. Vendors which do not provide the service, or which require an additional vendor, are scored with a value of 0. (*coming in mid-2015)
30. 2014 Budget Review
Brand search & affiliate conversion controls
Vendor Conversion controls Score
Abakus No 0
AOL/Convertro Yes 1
C3 Metrics Yes 1
eBay/Clearsaleing No 0
Adometry/Google Yes 1
Marketing Evolution No 0
Marketshare No 0
Rakuten DC Storm No 0
Visual IQ No 0
Brand search, affiliates and similar channels dominate activity at the bottom of the funnel. Vendors which are able to control for this activity within the model are scored with a value of 1; vendors which have not addressed this issue are scored with a value of 0.
31. 2014 Budget Review
Programmatic capability
Vendor Programmatic Score
Abakus No 0
AOL/Convertro Yes 1
C3 Metrics Yes 1
eBay/Clearsaleing No 0
Adometry/Google Yes 1
Marketing Evolution No 0
Marketshare No 0
Rakuten DC Storm No 0
Visual IQ No 0
Programmatic capability requires independent view tags and
integration with trading desks and Ad Networks. Vendors which
have this capability are scored with the value of 1, vendors which
do not are scored with a value of 0.
32. 2014 Budget Review
Display ad viewability audit
Vendor Display audit Score
Abakus No 0
AOL/Convertro No 0
C3 Metrics Yes 1
eBay/Clearsaleing No 0
Adometry/Google Yes 1*
Marketing Evolution No 0
Marketshare No 0
Rakuten DC Storm No 0
Visual IQ No 0
More than 50% of all display ads are never seen. Vendors with integrated viewability measurement that accounts for cross-domain iframe ads are scored with a value of 1; vendors which cannot determine viewability are scored with a value of 0. (*4Q15)
33. 2014 Budget Review
Time to onboard
Vendor Time to onboard Score
Abakus 1 month 1
AOL/Convertro 2 months 0
C3 Metrics 7 days 1
eBay/Clearsaleing 1.5 months 1
Adometry/Google 3 months 0
Marketing Evolution 1.5 months 1
Marketshare 1 month 1
Rakuten DC Storm 3 months 0
Visual IQ 3 months 0
Time to onboard is crucial as recommendations from any platform
cannot be considered until the platform is fully live. Vendors which
are able to onboard in less than 2 months are scored with a value of 1.
Vendors requiring 2 months or more are scored with a value of 0.
34. 2014 Budget Review
Cost
Vendor Cost/year Contract period Score
Abakus $50K - $150K Yearly 1
AOL/Convertro $60K - $1M Yearly 0
C3 Metrics $60K - $150K Monthly 1
eBay/Clearsaleing $60K - $850K Yearly 1
Adometry/Google $275K - $300K Yearly 0
Marketing Evolution $150K - $1M Yearly 1
Marketshare $200K - $1M Three-yearly 1
Rakuten DC Storm $170K Yearly 0
Visual IQ $325K - $1M Yearly 0
Vendors with a minimum yearly fee of less than $100,000 are
scored with a value of 1. Vendors with minimum yearly fees
exceeding $100,000 are scored with a value of 0.
36. 2014 Budget Review
Running the process
1) For each site visitor
a) Assemble visit history
b) Create variables to represent:
i. Channel impressions
ii. Channel clicks
iii. Past purchases
2) Regress or use machine learning algorithm
a) Ascertain which channel touches predict booking
b) Give VCM credit to causal channels
3) Calculate ROI
a) Use each channel’s VCM and spend
b) Where ROI is positive, spend up
c) Where ROI is negative, cut spend or change tactics
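The steps above can be sketched in miniature: a tiny hand-rolled logistic regression over per-visitor channel-touch flags, with a booking's VCM then split across the positively contributing channels. All visitor data and channel names below are invented, and real pipelines would use a proper modeling library; this only illustrates the shape of the process:

```python
import math

# Step 1: one row per visitor: channel touch flags + whether they booked
channels = ["brand_sem", "display", "email"]
visitors = [  # hypothetical touch histories
    ({"brand_sem": 1, "display": 0, "email": 1}, 1),
    ({"brand_sem": 1, "display": 1, "email": 0}, 1),
    ({"brand_sem": 0, "display": 1, "email": 0}, 0),
    ({"brand_sem": 0, "display": 0, "email": 1}, 1),
    ({"brand_sem": 0, "display": 1, "email": 1}, 0),
    ({"brand_sem": 1, "display": 0, "email": 0}, 1),
]

# Step 2: logistic regression by plain stochastic gradient descent
w = {c: 0.0 for c in channels}
b = 0.0
for _ in range(2000):
    for x, booked in visitors:
        z = b + sum(w[c] * x[c] for c in channels)
        p = 1 / (1 + math.exp(-z))       # predicted booking probability
        err = booked - p
        b += 0.1 * err
        for c in channels:
            w[c] += 0.1 * err * x[c]

# Step 3: split a booking's VCM across positively weighted touches
def credit(touches, vcm_amount):
    pos = {c: w[c] for c in channels if touches[c] and w[c] > 0}
    total = sum(pos.values())
    return {c: vcm_amount * v / total for c, v in pos.items()}

print(credit({"brand_sem": 1, "display": 0, "email": 1}, 100.0))
```

Summing each channel's credited VCM across all bookings, then subtracting and dividing by spend, yields the per-channel ROI fed into the dashboard.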
37. 2014 Budget Review
ROI Dashboard
Channel* Desktop Tablet Mobile
Brand 20%
Brand SEM 62% 51% 38%
Display -5% -12% -7%
Display - Retargeting 26% 25% 29%
Email 250%
Meta search 18% 22% 10%
Non-brand SEM -30% -18% -40%
SEO 500% 520% 390%
Social media -5% -15% 15%
*of impression/click, not necessarily of consumer conversion
38. 2014 Budget Review
Presentation agenda
1) The importance of….
a) Cross-device tracking
b) Channel ROI calculation
2) ROI components
a) Spend
b) Variable contribution
3) Algorithmic attribution
a) Marshalling the data
b) Evaluating vendors
4) Take-aways
39. 2014 Budget Review
Take-aways
1) Cross-device shopping is here to stay (until/unless
phablet experience massively improves).
2) ROI is the one true KPI
a) Algorithmic attribution is the only way to calculate it
b) Algorithmic attribution is becoming more affordable
c) Each time you find yourself agonizing over a channel
marketing spend decision, revisit your choice not to do
algorithmic attribution.
40. 2014 Budget Review
Cross-device tracking: four methods
• Deterministic (e.g. Facebook, Twitter): publishers
and platforms ask their users to sign in to their
websites and apps on every device they use
• Probabilistic (Drawbridge, Tapad): ad tech
companies…aggregate information about ads
served on smartphones, tablets and desktops,
and then use statistical models to infer who is
using which device….with 60-90% accuracy.
John McDermott, http://digiday.com/platforms/wtf-cross-device-tracking/
41. 2014 Budget Review
Cross-device tracking: four methods (cont’d)
• Householding: Where different devices can be
seen on one IP range and are combined with
home data, behavior and more, they can be
inferred as the same user.
• Data links: Apps that can hear TV sounds, QR
codes, NFC and more data links can join up
devices to TV, print and outdoor for a cross-
channel approach (more than cross-device).
Robert Webster, http://crimtan.com/cross-device-tracking-dont-believe-the-hype/
42. 2014 Budget Review
Algorithmic attribution per Visual IQ:
Top Down (MMO) & Bottom Up (Fractional Attribution)
Top Down (MMO): summary-level data
• Channels: offline + digital, cross-channel
• Role: strategic (optimize spend across channels)
• Output: cross-channel insight
• Captures: seasonality and external factors
• Predictable granularity: conversions at aggregate level

Bottom Up (Fractional Attribution): user-level data
• Channels: digital media channels
• Role: tactical (generates granular media recommendations)
• Output: full fractional attribution
• Captures: interplay between digital touch points and channels
• Predictable granularity: propensity to convert at user level
43. 2014 Budget Review
Algorithmic attribution per DataSong: Survival Modeling
"Given all the history we know, how likely is this shopper to convert soon?"
DataSong answers with a 2-stage model:
• Are Brand shoppers (BSEM, BSEO, DTI) likely to book? Yes
• Did your last TV spot cause their brand loyalty? Maybe
[Two scatter plots of NonConverters vs. Converters: Model 1 accuracy 68%, Model 2 accuracy 81%]
44. 2014 Budget Review
Algorithmic attribution per DataSong:
Survival Modeling
1) Axes show the time since last channel exposure
2) Dots represent Converters vs. nonConverters
3) Orange line represents the relationship between
2 variables, e.g.:
a) time since last email and
b) time since last affiliate visit
4) Responders: everyone above the line; we count the folks above the line and see what our accuracy is
5) Attribution: once satisfied with a model, we'd:
a) take a given order,
b) note the time since the last email and the last affiliate visit, and
c) based on that timing and where it falls relative to the orange line,
d) allocate credit to whichever channel was more causal.
[Scatter plot: NonConverters vs. Converters with the fitted (orange) line]
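The counting step described above can be sketched directly: classify each shopper by whether their (time-since-email, time-since-affiliate) point sits above a candidate line, then score accuracy against recorded conversions. The points and line below are invented for illustration:

```python
def above_line(point, slope, intercept):
    """Responder if the point sits above the line y = slope*x + intercept."""
    x, y = point
    return y > slope * x + intercept

# (time since last email, time since last affiliate visit), did they convert?
shoppers = [  # hypothetical, times normalized to [0, 1]
    ((0.1, 0.9), True), ((0.2, 0.7), True), ((0.8, 0.9), True),
    ((0.3, 0.2), False), ((0.7, 0.4), False), ((0.9, 0.1), False),
]

slope, intercept = -0.5, 0.6   # a candidate "orange line"
hits = sum(above_line(p, slope, intercept) == converted
           for p, converted in shoppers)
print(f"accuracy: {hits}/{len(shoppers)}")  # accuracy: 5/6
```

A curved (more complex) decision boundary would be scored the same way, just with a different `above_line` test.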
45. 2014 Budget Review
Survival Modeling: DIY
[Book cover not captured]
A survival-modeling text recommended by one of our PhD statisticians.
"SAS usage is not necessary."
46. 2014 Budget Review
Algorithmic attribution per Google:
Interaction Method (“Shapley value”)
[Diagram: example click coalitions over Brand SEM, Email and Display, converting at 2% and 3% in the two cases shown]
Every combination (“coalition”) of clicks is tested.
“How important is each player to the overall cooperation?”
http://en.wikipedia.org/wiki/Shapley_value
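The coalition idea can be made concrete with the textbook Shapley formula: average each channel's marginal contribution over every order in which the channels could "join". The per-coalition conversion rates below are invented for illustration:

```python
from itertools import permutations

# Hypothetical conversion rate for each coalition of clicked channels
v = {
    frozenset(): 0.0,
    frozenset({"brand_sem"}): 0.010,
    frozenset({"email"}): 0.008,
    frozenset({"display"}): 0.002,
    frozenset({"brand_sem", "email"}): 0.020,
    frozenset({"brand_sem", "display"}): 0.013,
    frozenset({"email", "display"}): 0.011,
    frozenset({"brand_sem", "email", "display"}): 0.030,
}
players = ["brand_sem", "email", "display"]

def shapley(players, v):
    """Average each player's marginal contribution over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]
            coalition = coalition | {p}
    return {p: total / len(orders) for p, total in phi.items()}

print(shapley(players, v))
```

By construction the credits sum to the full coalition's conversion rate, which is exactly the "how important is each player to the overall cooperation?" question.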
Editor's notes
Hi, my name is Jonathan Isernhagen and by now you know me and the novelty has completely worn off.
I want to start out by discussing the importance of cross-device tracking and our mission as marketers.
Then dive into the components of channel ROI
Which leads directly into algorithmic attribution.
Facebook commissioned a study of more than 2,000 wired adults by market research agency GfK. They discovered the following:
According to John McDermott, cross-device tracking is defined as…
I want to start out by discussing our mission as marketers.
Then dive into the components of ROI
Which leads us into attribution
I keep using this slide because it always seems necessary.
These are all areas of focus which other speakers argue should be our companies' top priorities.
By show of hands, what is the mission of the CEO of a publicly-traded company….?
As an example, here’s the first paragraph of the governance section of the first page of Apple’s investor relation site.
The CEO who fails to do his or her best to maximize the value of the company’s shares of stock is literally breaking the law. And all of us work for the CEO.
The way Marketing supports the CEO in this effort is by spending in each marketing channel up to the point where one additional dollar out nets us exactly one dollar of incremental profit back.
This is the point for each channel where the incremental ROI flips negative.
We calculate ROI using:
Spend: how much we spend on a marketing channel over some period of time, and
Variable contribution: the amount of profit that spend has created.
When we do this calculation for each channel, our finished product is an ROI dashboard.
This is the foundation upon which every CMO’s channel allocation decisions should be made.
I want to start out by discussing our mission as marketers.
Then dive into the components of ROI
Which leads us into
The first component is spend, but what spend do we include?
Marketing is a lot like farming in that we are always scattering impressions and clicks into the warm, moist, fertile brains of our consumers and hoping to harvest transactions.
It is different in that we have no winter. We are always planting.
This is okay if our spend is constant.
If you’re calculating March’s ROI, you know March is taking credit for transactions which were inspired in February, January or before, but you also know you’re starting some goodness in March which won’t pay off until later.
However:
if you spent less in March than in previous months, your ROI will look artificially positive. This is what farmers call “eating your seed corn.”;
likewise, if you spent more in March than in previous months, your ROI will look artificially negative.
Variable contribution is a two-step process. You have to:
determine the profit of each transaction, which is an accounting exercise, and then;
which channels deserve the credit for that profit.
Variable contribution margin is Revenue minus non-marketing expenses:
On the Revenue side, you want to include all the goodness a booking drives.
On the Expense side, include all non-marketing variable costs associated with completing a transaction.
Part 2 of the VCM exercise requires someone to gauge which marketing treatments caused each purchase, which is an exercise in customer psychiatry.
This can be done, but….
I want to start out by discussing our mission as marketers.
Then dive into the components of ROI
Which leads us into
VCM is difficult and expensive to calculate, and doing it correctly requires a mathematical credit-sharing method called “algorithmic attribution.”
Depending on several factors, it may not be worth your while to do this.
I’ve tried to map out your decision tree:
Question #1 is: do you really need to know your true attribution?
Are you spending at least $10M/year on marketing?
Are you distributing it across multiple channels?
Do you believe this stuff is knowable, and is your organization flexible enough to actually shift spend around?
The next question is: Can you access your data in a form that lends itself to modeling?
If you say “No” to any of these, then use the attribution available from your site metric product.
VS, SiteCat and GA all have them.
If it’s been “yes” up till now, your next question is whether you have the data guy in-house to do the modeling, and whether he or she has bandwidth.
if either of these are “no”s, then you should think about hiring an attribution model vendor.
….whoever does the modeling will need to pull all available impression, click, booking and profit data into a heavy analytical workspace like R or SAS…
…he or she will then have to connect the data together to create a rectangular table of causes and effects suitable for regression modeling.
This all has to be brought together in a way that connects customer behavior to what they saw, so you can try to tie cause to effect.
The hardest part of this is deduplicating customer profiles. Several data-handling companies specialize in this activity. Blue Kai, AgileOne and Compete all claim to do this well.
SocialCode ran an ad campaign for a consumer packaged good company looking to get people to redeem an offer. Thanks to View Tags, it found that of the total 5,924 people who redeemed the offer, 5,127 had only viewed the advertisement, compared to 797 users who clicked through to the offer directly from the ad.
http://techcrunch.com/2012/11/20/facebook-view-tags-ads/
When I was at Travelocity we used Forrester’s 2012 Wave survey of attribution providers as our starting point to find a suitable attribution vendor.
The market has changed a lot since then:
Visual IQ is still the market leader, but by a narrower margin;
Google, which used to be dead last, acquired Adometry and the combination has improved;
Convertro was purchased by AOL and is almost at parity with Visual IQ.
C3 Metrics gave me an attribution vendor evaluation matrix which one of their retailer clients put together, which they have authorized me to share. They ask: is the vendor:
A seller of media;
Able to pixel to collect data directly;
Able to cross and deduplicate visitors across devices;
Able to bias in favor of bottom-funnel, conversion-linked channels;
Able to control brand search and affiliate conversion management tools robotically;
Able to audit display ad viewability;
Able to onboard new clients quickly, and;
Affordable?
Karen Laine of Adometry assures me that Adometry is firewalled from the rest of Google and that their clients’ data will never get into Google’s hands, and I like and trust Karen, but there’s something about giving the fox access to the chicken coop which still seems risky. This concern applies also to eBay and AOL/Convertro.
There is something to be said for a vendor being able to pull in historical impression, click, transaction and profit data from your enterprise data warehouse, but there is also something to be said for a vendor having the ability to pixel your site and then gather all the data they need directly, even if you had no data warehouse and/or it’s a mess. AOL/Convertro, C3, eBay/Clearsaleing, Adometry/Google and Rakuten all have this ability.
If you can’t pull together the channel actions of one visitor across multiple devices, your picture will be incomplete. All vendors can have profile deduplication done by a third party like Blue Kai before feeding data into their models, but the ones who have the deduplication built in can be quicker and less costly.
C3 believes that search activity at the bottom of the funnel is disproportionately responsible for conversion, and should be credited accordingly. My own preference is to leave all of the data in and let the model discover which clicks and impressions are correlated with good booking behavior.
It can be advantageous if your attribution vendor can interact directly with tools like Marin and advise them in near-real time if they’ve been responsible for a sale. Your whole system can then opportunistically and precisely react to spikes or dips in customer demand.
Since >50% of displayed ads are never seen, it can be useful to audit which ones actually appeared before the eyes of humans and had a chance to impact conversion, though presumably the unseen ad “impressions” which aren’t filtered out just weigh down their channel.
All things being equal, it’s better to get up and running faster vs. slower. If this speed comes at the cost of being able to use historical data, and if you sell a product which has a very long consideration/purchase cycle (e.g. cruises), then it may be worth a delay to be able to pull several years of data into the model.
I can’t vouch for any of these prices (nor would I be able to even if I could, because I’m under NDA with many of these companies) and they may be biased to make C3 look cheaper, but I doubt it. Visual IQ is in the high right corner for a reason and describes themselves as the Cadillac of this industry, so they are probably the most expensive. They cater to people with gargantuan marketing budgets who need to make extremely accurate spend decisions.
C3 Metrics was always going to be the winner under the methodology they provided, but I think it’s a fair comparison so far as it goes. What it doesn’t include, and what I think trumps all other considerations, is: what type of lift does it cause? You’re renting the product to improve your marketing ROI, so all of these other criteria are just proxies for that positive impact. Visual IQ, Adometry and Convertro have all advised me that their new customers typically enjoy a 20%+ improvement in ROI upon initial release, followed by smaller improvements in subsequent months.
Regardless of whether you do your own model or hire it done, you take the total contribution each channel has driven, subtract spend and divide the total by spend, and then look at the ROI. If positive, spend up. If negative, spend down and/or change tactics.
Our finished product will look something like this.
Despite all of the complexity in the background, the finished output can be presented to the most distracted executive as a grid full of Harvey balls.
Ask and answer “why should we care,” then;
Cover the tactical steps of setting up your retargeting program and adding cross-device-ness to the mix, then;
Discuss how to measure the ROI of all of your marketing efforts, which may involve selecting an algorithmic attribution vendor.
Regardless of whether you do your own model or hire it done, you take the total contribution each channel has driven, subtract spend and divide the total by spend, and then look at the ROI. If positive, spend up. If negative, spend down and/or change tactics.
There are four major methods for cross-device tracking:
If you have a big enough web presence that people are willing to log into your site however they visit, you don’t actually have a cross-device tracking problem.
If not, there are companies that apply predictive models to served ad data to make predictions about who is using what device.
3) The same types of models can be applied to IP addresses to infer the same user, and;
4) Apps that can hear TV sounds, QR codes and campaign-specific hyperlinks also enable cross-channel tracking.
Visual IQ is the self-described Cadillac of the attribution marketplace. Their method combines:
A top-down econometric model including all of the transaction influencers that are not directly trackable, including television, seasonality, PPI, and;
A bottom-up model which incorporates individually trackable impressions and clicks.
And they periodically synchronize the outputs of these models and adjust their recommendations.
Another method, used by the vendor we chose for a pilot test, uses a technique lifted from the medical world called survival modeling. How likely is a given customer to book given everything else that’s happened?
The time proximity of bookings to channel clicks is noted, as well as other things like seasonality and repeat purchase history.
DTI, SEM and SEO are treated as behaviors instead of channels.
Typing your brand name is highly correlated to booking.
Is TV advertisement highly correlated to typing your brand name? Maybe not.
Model is accountable to a recorded outcome: response
In this chart:
The axes are the time since the last exposure to each of the two channels under consideration.
The dots represent converters vs. NonConverters
The orange line represents the relationship between 2 variables (say time since last email and time since last affiliate visit)
Everyone above the line is considered a responder. We count the folks above the line and see what our accuracy is.
This curved version of the graph shows a more complex model and relationship between the 2 variables; it has a higher accuracy.
Once we’re satisfied with a model(s), we’d take a given order, see the time since last email and affiliate and based on the timing, and where we are on the orange line, we have a means to allocate which was more causal.
For those of you with stats people who would like to try survival modeling, our PhD stats guy recommended this book, which he says is written such that SAS usage/knowledge is not needed.
One of the data-driven attribution methods of which I’m aware is being beta tested by clients of Google.
Look at all the clickstreams, not just those ending up with a booking;
Compare conversion when a click is present in a certain sequence, then when it’s absent. A click could pull conversion backwards.
This method attributes channel credit and picks up on the importance of sequence and timing.