Data Protection Plus Personalisation: a Fundamental Contradiction?
By Christopher Reher, MD Germany, Platform161
This is an adaptation of a piece that originally appeared in German in Upload Magazine.
We live in exciting times: our society is becoming ever more digital and ever more data-driven, and we experience this change every day, both consciously and unconsciously.
Digital assistants increasingly help us carry out tasks of all kinds faster and make our lives easier. Self-driving cars are hailed as the future, AI is on everyone’s lips and ad banners show your next journey before you even search for it.
But many people move through this ever more digital world with growing unease, and some are starting to feel anxious about technology in general. Perhaps they’re right to be concerned, but will they also end up being left behind as a result?
Data scandals shake the media and, subsequently, society. Politicians feel pressured to act – sometimes more to appear active than to actually help the consumer. Lacking the complete picture, their solutions don’t always contribute to a country’s economy or wider digital vision, and may even prove detrimental to both.
Ultimately, they often don’t help society at large either. In an ideal world, governments would be providing the public with the tools they need to navigate their new digital surroundings. And, above all, to control them to their benefit.
What we’re talking about here is of course a huge topic, covering everything from infrastructure, to social cohesion, to ethics – not to mention of course personal security. And in such a mutable environment as technology, can the law even keep pace?
This article focuses on the tension between our need for personal protection, and our yearning for individualisation, convenience and a simpler life.
Are personalisation and data protection, especially in the area of advertising, fundamentally contradictory – or are they more likely to be harmonious than is generally assumed?
Square Wheels, Personalisation & Progress
In one sense, personalisation can be seen as an essential part of human history – and progress. It reflects a deep-rooted human drive to make our environment as pleasant and convenient as possible. To exaggerate for a second: if this were not the case, we’d still be transporting water on carts with square wheels and living in straw huts.
Perhaps it’s not such a stretch after all to compare this type of thinking with a cookieless internet: an internet where you have to reset your preferences every time you visit a site, or log in again and again, or where no retailer ever again remembers what you added to your basket the last time you were online.
In the field of online advertising, effective personalisation is when a user is shown an ad at just the right time, with the product he or she needs, where it is most relevant.
This scenario is of course the optimum outcome for the advertiser too, because if serving an ad is more likely to satisfy the needs of the customer and lead to a purchase, then the investment was a wise one. Money is generated, employees can be paid, partners are rewarded for their investment and, above all, new products can be produced. It’s effectively a socio-economic virtuous circle.
Today we are still miles away from this optimal situation, and there are multiple reasons why.
A Short History of Personalisation
First and foremost, personalisation, i.e. the adaptation of conditions to human needs, requires exact knowledge of these needs. Data or information must therefore be available in such a way that accurate conclusions can be drawn from it, and further steps can be defined without too many errors.
In the past, this information was obtained through surveys, analyses of purchasing behaviour and other such types of market research. By definition, the resulting data was of limited precision, since only a small part of people’s actual needs could be mapped out, with the rest filled in by experience at best, or conjecture at worst.
Modern data processing (remember ‘big data’?) provided a solution here. For the first time, it was possible to process significant amounts of data in near real time. Combined with the continuing growth of the internet and the increasing digitisation of everyday life, something like a valid amount of data was finally available – both personal and non-personal – allowing a comprehensive, clear and complete view of the needs of the customer.
This combination created unprecedented potential for individualised services and drove improvements in areas as diverse as fashion, food, banking, health, TV and entertainment, as well as advertising and consumer finance.
The core question that arises, however, is how to bring the data and its processing together in a way that makes personalisation work consistently, while privacy and data protection are always respected.
A short recap of data protection law history should help our understanding here.
A Short History of Data Protection
Data protection in its modern interpretation has been a relevant topic at least since the German Constitutional Court’s census judgment of 1983. At the time, the law was primarily seen as a means of protection from interference by the state. Its main aim was to regulate how, when and, above all, within what limits the state was allowed to collect, process and store citizens’ data.
Data protection was regulated in a patchwork of federal and state laws, often opaquely, and a European dimension existed only to a limited extent.
This changed with the entry into force of the European Data Protection Directive (95/46/EC) in 1995, which for the first time created a uniform legal construct at pan-European level.
Likewise, the Europe-wide establishment of uniform data protection regulations was intended to strengthen the fundamental rights of citizens in general and to achieve a quasi-definition of a fundamental right to data protection in particular.
The next important law at European level in this context was the ePrivacy Directive (2002/58/EC). At its core, it strengthened the rights of citizens of Member States and took account of the growing importance of data protection needs and aspirations. In particular, its regulations on advertising and cookies were a novelty and are of considerable relevance to the topic dealt with here.
These European regulations were reflected – at least in Germany – in national laws such as the Telemedia Act (TMG), the Federal Data Protection Act (BDSG), the respective state data protection laws and in parts of the Telecommunications Act (TKG).
However, the latest – and probably most relevant – development took place only a few years ago with the General Data Protection Regulation (GDPR), which has been directly applicable since 25 May 2018.
With the GDPR, a genuine paradigm shift occurred: legislation was no longer primarily concerned with protection against interference by the state. Instead, elementary social mechanisms and future technologies were given rules, and an indirect, extremely broad standard for data protection was created.
Broken down to its core, the GDPR’s intent is the empowerment of the individual and human participation in the new digital world. The law is intended to prevent unauthorised persons from accessing users’ data, selling it, profiling them with it without their knowledge, or using it to facilitate criminal dealings.
It enshrines the individual’s right to maintain control over his or her own interests and to participate actively in that process. The citizen is therefore given powerful rights and, at the same time, made responsible for exercising them.
In summary, then, data protection, especially the GDPR, is a mechanism that for the first time enables citizens to actively and individually participate in digital society and puts them on an equal footing with both business and government.
Now that we have established how data protection emerged in its present form, we still have to resolve where personalisation fits in and why the tension between data protection and personalisation occurs.
Because if you follow the prevailing opinion in both politics and the legal system, it quickly becomes clear that companies involved in profiling and targeted advertising are seen as the investment bankers of the digital age: ever-present, perhaps even indispensable to the current economic cycle, but also extremely complex to pick apart and therefore incomprehensible to many. The similarities don’t end there – advertising technology is also generally viewed with fear and suspicion, at least in part due to that same lack of understanding. The industry as a whole is largely judged on the actions of a minority of market participants, and many regard it as something that should be regulated into submission – or so heavily that it can no longer function at all.
We are therefore facing a dilemma: on the one hand, users seem to enjoy individualised services and see their lives continuously simplified by technology. On the other hand, we have complex regulations, which are supposed to ensure that this personalisation only takes place within a very limited framework, or only if the citizen is fully informed and consents to the entire process.
This is a challenge that currently employs a large number of motivated and knowledgeable people, with various results.
On the one hand, we have the data protection authorities (DPAs), who are acquiring more and more expertise and becoming more active in enforcing the GDPR. This is particularly visible in the supervisory authorities’ guidelines, which clarify various aspects of legal interpretation and are intended to point other actors in the right direction when implementing measures.
At the same time, however, the marketplace still faces inconsistent interpretations of the regulations by those same authorities, and sometimes contradictory or even unrealistic positions are taken.
The business community is working feverishly to implement the new regulations and to involve the user as much as possible in the process. Because, let’s not forget, every euro spent advertising to someone who responds negatively is wasted – and that applies equally when the user has not consented to receive the ad. As we established earlier, this can damage the entire economic cycle. It can therefore only be in the interest of everyone involved to establish the smoothest and clearest possible communication with the user.
IAB Europe’s Transparency & Consent Framework, now in version 2.0, deserves specific mention here. Among other things, this standard can record the user’s actual wishes regarding data processing, and thus whether permission exists for a profile to be created.
So-called Consent Management Platforms (CMPs) have of course also emerged, which deal specifically with how users can be made aware of the various data processing operations, so that they can ultimately make informed decisions about the processing of their data.
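To make this concrete, here is a minimal sketch of how a consent signal recorded under TCF 2.0 might gate personalised advertising. It assumes the TCData shape that a CMP exposes through the `__tcfapi` JavaScript API (`purpose.consents` and `vendor.consents` maps keyed by numeric IDs); the vendor IDs and sample consent record below are hypothetical, and a real integration would of course read live data from the CMP rather than a hard-coded object.

```typescript
// Minimal shape of the TCData object a TCF 2.0 CMP surfaces via __tcfapi.
interface TCData {
  gdprApplies: boolean;
  purpose: { consents: Record<number, boolean> };
  vendor: { consents: Record<number, boolean> };
}

// Personalised advertising under TCF 2.0 requires, among others,
// Purpose 3 ("Create a personalised ads profile") and Purpose 4
// ("Select personalised ads"), plus consent for the specific vendor.
function mayPersonaliseAds(tc: TCData, vendorId: number): boolean {
  if (!tc.gdprApplies) return true; // outside GDPR scope, no signal required
  return Boolean(
    tc.purpose.consents[3] &&
    tc.purpose.consents[4] &&
    tc.vendor.consents[vendorId]
  );
}

// Hypothetical consent record, as a CMP might report it after the
// user has made their choices in a consent dialogue.
const sample: TCData = {
  gdprApplies: true,
  purpose: { consents: { 1: true, 3: true, 4: true } },
  vendor: { consents: { 42: true, 99: false } },
};

console.log(mayPersonaliseAds(sample, 42)); // true: purposes 3 and 4 plus vendor 42 granted
console.log(mayPersonaliseAds(sample, 99)); // false: the user refused vendor 99
```

The point of the sketch is the asymmetry it encodes: personalisation only proceeds when the user’s recorded, informed choice explicitly allows it, which is exactly the cooperation between data protection and personalisation this article argues for.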
Here, however, we come to the biggest sticking point – the users themselves.
The user’s right to decide on the processing and use of his or her data clearly rests on an ability to easily and clearly understand how all of this actually works. But in practice, this is easier said than done.
In practice, we simply click dialogue boxes away, ignore them, or even feel harassed by them. However, the digital age can only function if users become aware of their responsibility and use their newly gained powers. Only with this clear input to business and politics can the necessary discourse and further technical development evolve.
It is understandable that, as a rule, data protection and personalisation are seen as diametrically opposed.
However, if you break it down, as I’ve tried to do here, you must conclude that the two actually complement each other perfectly.
The moment a responsible citizen shares their data actively and in an informed way in order to receive certain services or benefits, personalisation can provide exactly what they want, immediately and precisely.
Data protection and personalisation are therefore a suitable pair, at least if all those involved want it to be so and make the effort to make it possible.