Consumers’ appetite for mobile has become insatiable; whether it’s smartphones, tablets, wearable tech or connected automobiles, mobile is the fastest growing platform we've ever experienced. These devices are inherently social, yet extremely personal; serving almost as an extension of who we are as individuals. They have given marketers the ability to get closer to their consumers than ever before, but with great power comes great responsibility.
During MLA & Celtra’s Morning of Mobile Privacy, presenters Jules Polonetsky, Jaka Jancar and Alan Chapell shared thoughts on avoiding privacy pitfalls in mobile to earn consumer trust, and on building digital audiences in a cookieless world. Download the presented slides and feel free to contact the MLA directly with any questions you may have about mobile privacy.
8. Regulators Respond
• Sen. Rockefeller’s legislation to set mandatory Do Not Track rules
• New FTC COPPA rules
• EU Cookie Directive
• And states get into the privacy game…
11. Over thirty years, consumers have consistently fit into 3 groups:
• 17% privacy fundamentalists
• 56% pragmatic majority
• 27% marginally concerned
— Westin Surveys
12. Consumers Are Complicated
“We’ve found that individuals assign radically different values to their personal information depending on whether they’re focusing on protecting data from exposure or selling away data that would be otherwise protected.”
— A. Acquisti, L. John, and G. Loewenstein, “What is Privacy Worth?,” tech. report, Heinz College, Carnegie Mellon University, 2009.
15. We are OK with sharing when what is being done is for me, not to me
23. IT’S THE END OF THE WORLD AS
WE KNOW IT…AND I FEEL FINE!
PRESENTED BY:
Jaka Jancar
Chief Technology Officer
Celtra
24. Celtra & privacy
• We’re a rich media (RM) company, new to privacy
• It’s hard to know what’s OK and what’s not
• Why don’t offline businesses have as many privacy concerns?
25. What is different online?
• Unknowingly generated online data is far greater in volume and level of detail (e.g. web activity, app usage, geo, …)
• Evilness levels are identical; only the capabilities are larger
• Yet, expectations of privacy are the same
26. People: a healthy dose of ignorance
• Normal people don’t proactively think about privacy
• “Being informed”, “having options” …meh
• They just don’t want to be negatively surprised
• And that’s fine!
27. Ad tech: “cover your ass” strategy
• Constant obsession
• Putting up a show
– Non-PII, probabilistic identifiers
– Not storing data
– Transparency: disclaimers
– Control: opt-outs
– Self-regulation
• Good for staving off media and regulators, but that’s all
28. Thought: it’s not Ad Tech’s role
• We have our goal: to improve efficiency
• It might conflict with users’ privacy
• We might not be the best to represent users
29. Who, then?
• For the most egregious cases, and for a framework, governments are probably fine
• For “good taste” — who has the most to lose?
– Not ad tech companies (what are those?)
– Platform vendors (OS, browser)
– Publishers (at risk)
• They are who users entrust with their digital lives, and who they will resent if betrayed.
30. Rash platform changes don’t help
• Disabling 3rd party cookies
• Removing UDID
• Enabling DNT by default on IE10
All of these are all-or-nothing and not legitimate (no expression of user will)
31. Good example: IDFA
• Under user control
(resettable, configurable privacy level, platform-wide, even cross-device)
• Not all-or-nothing
(never fully removes identifier and always allows basic uses)
• Limits uses, not collection
(more data available in the future?)
• Allowed uses very clear + legitimate
(no philosophical questions, no excuse to ignore or bypass)
32. Recap
• The privacy push will come from users → platforms → publishers → ad tech → advertisers. It will not be magically born in the middle.
• Parties near the user will set the “good taste”. There is a big opportunity for differentiation.
• Permissions will move from “cover your ass” complete opt-outs to more granular ones
• We will lose freedom, but gain simplicity and accuracy in exchange
33. What will we do at Celtra?
• Focus on empowering users
• Very specifically:
– Always collect <identifier, permissions>
– In the absence of permissions, assume very little is OK by default
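The <identifier, permissions> idea above can be sketched in code. This is a minimal illustration, not Celtra’s implementation: the class name and permission labels are invented, and the conservative default encodes the slide’s “assume very little is OK” rule.

```python
from dataclasses import dataclass, field

# Hypothetical permission labels; a real taxonomy would be platform-defined.
FREQUENCY_CAPPING = "frequency_capping"
REPORTING = "reporting"
BEHAVIORAL_TARGETING = "behavioral_targeting"

# Conservative default: with no explicit permissions, assume only the most
# basic, clearly legitimate use is OK.
DEFAULT_ALLOWED = frozenset({FREQUENCY_CAPPING})

@dataclass(frozen=True)
class TrackedUser:
    """An identifier is never handled alone, always as <identifier, permissions>."""
    identifier: str
    permissions: frozenset = field(default_factory=lambda: DEFAULT_ALLOWED)

    def allows(self, use: str) -> bool:
        # Limit uses, not collection: the data may exist, but each use is gated.
        return use in self.permissions

# No explicit permissions collected: only the conservative default applies.
anonymous = TrackedUser(identifier="example-id-1")
assert anonymous.allows(FREQUENCY_CAPPING)
assert not anonymous.allows(BEHAVIORAL_TARGETING)

# Explicit permissions travel with the identifier.
opted_in = TrackedUser("example-id-2", frozenset({FREQUENCY_CAPPING, REPORTING}))
assert opted_in.allows(REPORTING)
```

Gating each use rather than the collection itself mirrors the IDFA point on slide 31: the identifier is never fully removed, but every downstream use must be explicitly allowed.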
34. CREATING DIGITAL AUDIENCES IN
A WORLD WITHOUT COOKIES
PRESENTED BY:
Alan Chapell
President
Chapell & Associates
35. To Access These Slides
Contact Alan Chapell
President
Chapell & Associates
achapell@chapellassociates.com
Editor’s Notes
A lot of this is the industry’s own problem – headlines about data leakage: Jonathan Mayer; Common Sense Media with Leibowitz; website research finding online data leakage; Richard Smith and DoubleClick (1999); Jonathan Mayer (2009); then the Wall Street Journal. PII – chutzpah, stupidity, or are we just too busy? Ballah, who showed… We in the industry have brought a lot of this on ourselves; we kept swearing each incident was an anomaly. We rely on the privacy policy as our main way of engaging with consumers: (1) we keep leaking data – we don’t learn from our mistakes; (2) our primary communication with our customers is via a privacy policy.
Our focus has been on enabling RM and improving its user experience and engagement. Only lately have we started working on some new reporting and creative features that require us to identify individual users; for the first time, we have to think about users’ privacy. With technical and UX questions, it’s usually intuitively clear what is good and what isn’t, and if it’s not, it’s easy to test. With privacy questions, it’s much harder to decide what’s OK and what isn’t; often, you never get to a clear-cut answer. One thing we tried was drawing parallels to the real world: mapping tracking, fingerprinting, and cookie syncing onto offline equivalents. That usually doesn’t work out well: at best you’re a creepy stalker, at worst a felon.
The real world just doesn’t scale and is more transparent; users have been protected by that.
* Apart from a small minority with an interest in the subject, people don’t often think about their privacy proactively, online or offline. APPLE ANALOGY FOR PRODUCTION. They don’t want to be negatively surprised → expect people to have good sense.
Small leap to identity. * On our side, companies like us are constantly thinking about these problems. * We self-regulate and show restraint, even though we know it doesn’t work: * We say we don’t collect PII. * We also know: 87% of the US population can be uniquely identified using just their ZIP code, gender, and date of birth. [src] * Non-PII + non-PII can be PII. * We provide opt-out mechanisms, which have extremely poor usability. * We offer users transparency and control, but nobody cares about that (like ToS and privacy policies). * It might be good for staving off media and regulators, but users just don’t want to be negatively surprised. * “Cover your ass” strategy.
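The “non-PII + non-PII can be PII” point can be sketched as a toy linkage attack: two datasets that each look harmless on their own re-identify people when joined on the quasi-identifiers ZIP, gender, and date of birth. All records below are invented for illustration.

```python
# "Anonymized" activity log: no names, just quasi-identifiers plus behavior.
usage_log = [
    {"zip": "02139", "gender": "F", "dob": "1985-07-02", "visited": "clinic.example"},
    {"zip": "94103", "gender": "M", "dob": "1990-01-15", "visited": "news.example"},
]

# Public registry (e.g. a voter roll): names plus the same quasi-identifiers.
voter_roll = [
    {"name": "Alice Smith", "zip": "02139", "gender": "F", "dob": "1985-07-02"},
    {"name": "Bob Jones",   "zip": "94103", "gender": "M", "dob": "1990-01-15"},
]

def reidentify(log, registry):
    """Join the two datasets on (zip, gender, dob) to attach names to behavior."""
    index = {(p["zip"], p["gender"], p["dob"]): p["name"] for p in registry}
    matches = []
    for row in log:
        key = (row["zip"], row["gender"], row["dob"])
        if key in index:
            matches.append((index[key], row["visited"]))
    return matches

print(reidentify(usage_log, voter_roll))
# [('Alice Smith', 'clinic.example'), ('Bob Jones', 'news.example')]
```

Neither dataset contains classic PII plus behavior on its own, yet the join attaches names to browsing activity; this linkage is the mechanism behind the 87% figure above.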
Conflict of interest. Companies don’t usually serve both the sell side and the buy side. I’m not a cynic: it’s not zero-sum, but when the downsides are much lower than the upsides, the conflict of interest cannot be denied.
Facebook, Google+, LinkedIn mails, ISPs sniffing traffic. Phones are the most private of devices, and Apple or Google definitely do not want them to be perceived by users as a tracking device in their pocket. Of course, this is nothing new; publishers are already careful today. I’m just saying their role will increase: not just the data they leak, but enforcing good taste upstream. APPLE ANALOGY FOR PRODUCTION. * The same companies also have the most to gain: if people notice incidents repeatedly, privacy can become a major differentiator.
* Disabling third-party cookies → Flash cookies, ETag/cache abuse, fingerprinting; no visibility, no direct control, shoddy and fragmented opt-outs. * Hiding the UDID → a shift to MAC addresses and other identifiers, again fingerprinting. * Enabling DNT by default in IE10 → some started to ignore it altogether. If there is legitimacy, workarounds are not legitimate.
* There is a big opportunity for differentiation based on privacy across the chain. * We’ll lose some capabilities, but there is a lot to be won.
* Don’t throw sand in people’s eyes (unless the client requests it). * Be transparent for the activists and legislators, but know it isn’t important to most people. * In practice: don’t collect just an ID, but ID + allowed usage (e.g. IDFA). In the absence of explicit usage permissions, assume very little is OK downstream (default to conservative).