When marketers can use a consumer’s personal data to influence their behavior, is there a need for top-down regulation of the way that data is used? Should governments even be able to understand these data complexities? And what can the industry do to better protect our personal data in an increasingly profit-driven world?
Thales Teixeira (former Harvard Business School professor), Sonia Carreno (President, IAB Canada), Selma Chauvin (VP International Marketing, UKG Group), Mike Audi (Founder, TIKI) and Yannig Roth (VP Marketing, Didomi) discussed these questions and more at our recent Yes We Trust Summit, a worldwide, 100% digital privacy event initiated by Didomi to help people understand and inspire trust in the internet age.
You can watch a replay of the discussion here.
Summary
- How Facebook changed - and changed everything
- Can data regulation come from the top?
- How much regulation is too much?
- The need for global data protection standards
- Seeing the big picture around personal data use
- Conclusion… consent, clear and simple
How Facebook changed - and changed everything
Most people consider the internet to be free. But you can’t get something for nothing. After all, companies like Facebook need to make money. Obviously, the intent is to maximise the number of eyeballs on the content for the longest period of time possible. So these companies use personal data to make as much money as they can.
People joined Facebook to share stories with their friends. Somewhere along the way, it became about profit, and it started becoming an amplification platform. Over the years, Facebook has become a media company, making more professionally created content available. Importantly, though, it doesn’t vet the source of this content. These third parties – some of them politicians, some trying to influence politics – are able to manipulate our discourse. This is, of course, something Brittany Kaiser knows all too well, and discussed in her Yes We Trust Summit keynote.
Facebook is just one part of a much larger ecosystem, of course. It’s everyone’s responsibility to use different sources of information, and to form opinions based on different people and facts. In the same way, it’s everyone’s responsibility to balance the media they consume with how much data they’re prepared to give away to access that media. But, many people just don’t know what to do.
Can data regulation come from the top?
The unfortunate reality is that it’s practically impossible for a government to regulate such a fast-moving and complex issue – let alone several governments, with people using different services in different countries. As communication borders break down, information will move at an accelerated rate, making the prospect of government regulation even more challenging.
We need to find a pattern that decentralises decision-making, so that people understand what they’re making decisions about and can make those decisions more effectively. We need to close the gap between consumers’ expectations about how their data is used, and the reality.
Ideally, we’d find a way to properly inform people of how their data is used, as we do with cookie banners. But many people find this annoying; they don’t want to see it every time they visit a website. It doesn’t have to be annoying, though, and it doesn’t have to be complicated. You just have to present the right question at the right time, in a way that’s easy to understand: this is what happens, and this is why – do you want to do this?
How much regulation is too much?
Top-down regulation should really only come into force when there are significant material impacts to a broad swathe of consumers, such as a data leak. But there’s another issue: even if no information is leaked and no material damage is done, a company can learn so much about a person that that person starts to feel uncomfortable.
Arguably, there should be tools in place to allow a user to say that they no longer feel comfortable. But should there be a law against it? Using data to personalise the customer experience and deliver tailored recommendations, for example, may be part of a company’s business model. If it’s not creating harm, then it could be described as innovation. And overpowering regulation can stunt innovation, so it’s best avoided.
It’s about trust, too. Do consumers trust the company that collects their data? Not only what they collect, but how they use it now, and how they intend to use it in the future. People don’t realise how much this trail of data helps companies to sell them things; it helps those companies understand them and get them to act exactly as they want. People have to ask themselves whether they want companies to be able to do this.
The need for global data protection standards
People are often worried about the leakage of information like email addresses. In reality, however, the most important data is the aggregate of all the things you clicked on and watched and viewed. It determines who you are and how you act. And, based on that, it can influence culture, decisions, and people. But it’s often overlooked, and is typically the least secure and most vulnerable.
If the protection of this data is to be addressed from a regulatory perspective, it needs to be a globally accepted standard. The Transparency and Consent Framework (TCF), for example, is a European standard concerning the ability to give consent on websites and apps, and is on its way to becoming a global standard.
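To make the TCF a little more concrete: under the framework, a Consent Management Platform (CMP) records the user’s choices and exposes them to scripts on the page through a standard `__tcfapi` function. The sketch below, which stubs out the CMP with purely illustrative values so it can run anywhere, shows the general shape of how a vendor script might ask "did this user consent to purpose 1?" before processing any data.

```javascript
// A minimal sketch of querying a TCF v2 CMP for the user's consent
// signal via the standard __tcfapi call. The stub CMP below is
// purely illustrative: on a real page, the CMP itself defines
// window.__tcfapi, and the values come from the user's actual choices.

const window = {}; // stand-in for the browser global in this sketch

// --- illustrative stub playing the role of a real CMP ---
window.__tcfapi = function (command, version, callback) {
  if (command === 'getTCData' && version === 2) {
    callback(
      {
        tcString: 'CO_example_string',                // encoded consent record (made-up value)
        gdprApplies: true,                            // GDPR applies to this user
        purpose: { consents: { 1: true, 3: false } }, // the user's per-purpose choices
      },
      true // success flag
    );
  } else {
    callback(null, false);
  }
};

// --- what a vendor script on the page would actually run ---
let purpose1Consent;
window.__tcfapi('getTCData', 2, (tcData, success) => {
  if (success && tcData.gdprApplies) {
    // Only process data for purposes the user has consented to.
    purpose1Consent = tcData.purpose.consents[1];
  }
});
```

The callback style is part of the standard itself: consent may not be available the instant a script loads, so vendors ask the CMP and react when an answer arrives, rather than assuming one.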
Consent can take various forms depending on location. Sometimes it can be early on in the process, sometimes later on, sometimes it can be long and boring, other times innovative. But love it or hate it, consent is central to regulation.
Seeing the big picture around personal data use
The missing piece in all of this is the question of whether consent is meaningful. Most people don’t understand the nuances of how their data is collected and used for targeted advertising, for instance.
Are consumers sufficiently educated to make decisions on what types of data should and shouldn’t be used? Do they understand Facebook’s algorithms well enough to know when they’re being artificially influenced?
The industry needs to enable consumers to understand the full picture. Education is vital. As we start talking more about what we can infer from the data we leave behind, people will become more aware.
But this effort is still in its infancy. We must understand that consumers don’t necessarily appreciate the connection between advertising and consent. There’s still an inherent gap in people’s understanding of the value they get from the free internet as a result of advertising.
We need to address this now, because the alternative is a subscription model. And that will wipe out democratic and free access to a wide range of information.
Conclusion… consent, clear and simple
As much as we talk about educating people about consent, we’re not entirely sure what we want to ask people for. Marketing departments are constantly innovating. We don’t know what we’ll do with their data in the future. We need to consider, therefore, whether we educate them on the data we’re collecting now, or on all the potential uses tomorrow.
But it doesn’t have to be complicated. Amazon has been keeping track of users’ clicks for 15 or 20 years. Back then, it didn’t know why. Over the years, it has realised that this data is valuable. If it told users it would like to track their clicks – even though it wasn’t sure what it would do with the information – those users could simply opt in or out.
It may be complicated behind the scenes but, importantly, the choice should always be simple for the user.
Didomi believes in giving people control over their data, with clear and easy-to-understand consent and preference management solutions. By leveraging compliance, brands can turn data transparency and privacy into a competitive business advantage. These are the beliefs that led us to become a founding sponsor of the Yes We Trust Summit.
Watch the best-of video from the inaugural Yes We Trust Summit here: