The way we use deep personal data needs to change. Indeed, I’ve been saying this for a few years now. As everyone knows, megacorps like Google and Facebook[1] have built their insanely profitable businesses around selling products based on personal data. And other folks want to be just as successful, so they lazily try to build businesses around the collection and sale of personal data (data monetization! drink!).
What’s with the ads?
Advertising is one of the first ways folks think about selling data. Advertising built publishing and TV, and then built Facebook and Google to gargantuan sizes. Their success made everyone think of a data monetization strategy – how to sell personal data to someone who will pay for it.
Now everything is a data collection vehicle to sell ads, even cars and toothbrushes.
I can’t stand ad-based models. The farthest back I remember being against advertising-based business models was when I was advising an innovation team developing a phone app back in 2004.
Choosing an ad-based model for your product is lazy, especially if your whole business depends on it (what the f- you smokin’?). For some, it can work (ahem, Google, FB, Instagram, etc.). The challenge is that ad-based models require control of a channel (or at least a heavy presence), large numbers of dupes handing over data (scale), and a reason for advertisers to buy ads on the platform (frakkin’ eyeballs, dude!). As the list of failed ad-based companies shows (including newspapers and magazines), it’s a hard model to succeed with.
Data first
Ads are not the only way to monetize data. The second way is to sell insights from the data to someone else. Indeed, this is big in digital health – collect health data and sell it to insurers, hospitals, or pharma. Once again, those insights require a significant number of users to be valuable to data buyers. And, just as selling data for ads has little to do with the user who provided it, digital health apps often return far less value to the user giving the data than they deliver to the data buyer.
Basically, data models seem to sell the data separately from returning any value to the user who provided it.
Value to the user
Ok, one _could_ say that ads are valuable to the user. Timely coupons! Discounts! Personalized ads! I mean, they must work for Google and others to make so much money.
Dunno. I think these days ads are just so much noise that most of us are adept at filtering them out.
Also, I admit that, while I was always against ad-based business models, until a few years ago I was still supportive of data-brokering (selling data to someone else) business models when advising companies.
Though, when I used to suggest collecting user data and selling it to others for insights, I always wanted to do it in a context where value was returned to the user. For example, collect data for a pharma company that would either pay the user or help the user get well (though, in the end, I think that’s a bit of a fantasy).
Tighter value loop
Value to the user needs to be the metric in all things regarding personal data. Hence, where I have evolved is to use data in a tighter value loop, where the data collected is used to improve the very service collecting it. For example, Apple and Netflix have a lot of data on us, but our data drives their own business, not a product for some third party. John Hancock and Progressive track people to reward good behavior.
I’ve been advising companies to think of a data strategy that benefits only them and their user, not some other group, and not selling data to the highest bidder. I’m all for collecting as much intelligence on your customer as possible. But use it for the user’s benefit, not their abuse. Personally, I want the products I use to know me and help me. My data should be tightly coupled with my benefit, not someone else’s.
The data genie is out of the bottle
Because of the race to collect data over the past decade, personal data is everywhere. And with the general trend for everything to become a data collection point that sells data to others, it is going to be hard to rein in all the bad behavior, which has gone too far, with risks that are too great.
Is the only option to delete your apps, use a tool like Jumbo, or quit the digital world altogether?
Too late: your face is everywhere and being used to make money for someone else.
The good news is that governments and other folks are waking up to these personal data privacy issues. I’m not so sure the average Facebook user really gets it, though.
Still hopeful
I know we can make this work. The digital world of Google and Facebook is not the first ad-driven, data-collecting, and data-selling world. There are other industries (like healthcare and pharma) that need to protect data and use it only for clearly consented purposes. So there are frameworks to build off of.[2] Why would it be any different in the digital world?
Parting thought
I keep thinking of ‘peak oil’ and how Saudi Arabia is planning for a post-oil world. Are we at ‘peak data selling’, and will Google be planning for a post-data-selling world?
I think they better.
[1] Facebook rant: I’ve never liked Facebook. And it seems like they are scum, anyway. They say they don’t sell data, but story after story shows they play fast and loose with personal data, from Cambridge Analytica, to secret schemes, to developer access, to silent tracking. And Facebook really doesn’t care, even in the face of public and government scrutiny. They are the poster child for f-ing up with personal data. Why the frak does anyone still use Facebook?
[2] I keep thinking of a certificate or label of trust that would be a quick way to tell someone that their data is safe. For example, a few parameters: is data collected, and what data (Y/N); is data used only internally (Y/N); are the code and organization audited for privacy and cyber controls, including consent and re-consent constraints (Y/N); is data used externally, for ads, sale, or third-party access (Y/N). Here’s one attempt at this data use label.
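To make the idea a bit more concrete, here is a minimal, purely hypothetical sketch of how those Y/N parameters might be encoded as a machine-readable label. The field names and example values are my own illustration, not an existing standard or the label linked above.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DataUseLabel:
    """Hypothetical 'nutrition label' describing how a product handles personal data."""
    collects_data: bool            # Is personal data collected at all?
    data_categories: list[str]     # What kinds of data (e.g., location, health, viewing history)?
    internal_use_only: bool        # Is data used only to improve the service itself?
    external_use: bool             # Is data sold, shared, or used for ads by others?
    privacy_audit_passed: bool     # Code and organization audited for privacy and cyber controls?
    reconsent_required: bool       # Must users re-consent when data use changes?

# Example: a label for a service that keeps data in a tight value loop.
label = DataUseLabel(
    collects_data=True,
    data_categories=["viewing history"],
    internal_use_only=True,
    external_use=False,
    privacy_audit_passed=True,
    reconsent_required=True,
)

# Serialize so the label could be published or displayed alongside the product.
print(json.dumps(asdict(label), indent=2))
```

The point of a structure like this is a quick, standardized read on whether a product keeps data inside the value loop or sends it out the door.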
Image by Chris Sansbury from Pixabay