Privacy Paradox, Privacy Business and Privacy Enhancing Technologies - Not Even a Penny Given for More Privacy
The potential success of any privacy-enhancing technology depends on the decisions of government and supervisory authority officials and on their perception of what fulfils the privacy-by-design requirement. This is because, owing to the Privacy Paradox, people will not equip themselves with these technologies. They expect, rather, that privacy will be provided by the government as a public good.
Keywords: Privacy by Design, Privacy Paradox, Privacy Enhancing Technologies, GDPR
We need a pragmatic approach to privacy protection. Maximalist goals and visions of a perfect future should be dropped. Instead, we need a more realistic and goal-oriented approach, one that accounts for the Privacy Paradox: people say privacy is important to them, yet they are unwilling to pay for it and expect the government to provide it instead.
Therefore, we need to better enforce the privacy-by-design requirement of the GDPR, which also means the continuous implementation of state-of-the-art privacy-enhancing technologies in business operations. We should not expect people to take care of their privacy by themselves. Nor should we expect them to make the effort to use such technologies, or to pay for them. Because of the GDPR's privacy-by-design requirement, government officials have an enormous impact in deciding which privacy-enhancing technologies will be broadly implemented.
Privacy is at risk – so what?
We all value our privacy—or do we? Privacy talk is everywhere right now. Numerous legislative actions to tackle the issue have been taken and are ongoing (the GDPR and the future ePrivacy Regulation). The risks related to unconstrained personal data processing are well described, especially in the context of the Fourth Informational Revolution and the development of IoT and AI applications. We know that GAFAM (Google, Apple, Facebook, Amazon, Microsoft) sometimes seems to know more about us than we know about ourselves. And they use this knowledge to maximize profits and monopolize digital markets.
It is, however, hard to pinpoint any particular harm in the online environment, because privacy harms are not discrete events. They stem instead from the architecture of online environments, including choice architecture, and from its long-term influence on our well-being and our ability to make informed decisions. They may also relate to our subjective feeling of security about our private information and how that information is used. All of this makes protecting privacy harder.
Still, if I ask you what you do to protect your privacy and well-being online, what would your answer be? Do you consistently refuse to let web pages track you with cookies, and do you read every consent request form for personal data processing? Do you send requests to delete your data to companies abusing it, or do you ever exercise any of your other rights under the GDPR? Do you use a VPN or a personal data management system? And most importantly, do you pay for these services? I think we know the answer.
The Privacy Paradox
The problem is that there might be an inherent trade-off between privacy and the utility of online services. Simply put, the more personal data and other information are available, the better the service design that is possible. And we want and need these services in our everyday lives. Thus, for most people, the ease with which they can use online services justifies a potential lack of privacy. Still, they expect that no harm will come to them.
Numerous business initiatives aim to protect our privacy online, but all have enjoyed rather moderate success. It cannot be said with confidence that the market value of such initiatives is extraordinary; they remain a niche. Their customer base consists of affluent, highly technology-literate people with the time and money to care about their privacy online (and of criminals as well). But nothing like mass-scale adoption of privacy-enhancing software for individual use has occurred yet. Unless privacy is provided by the government, people do not care; they expect the government to secure privacy as a public good.
This is the Privacy Paradox. What the Privacy Paradox describes is that we do care about our privacy, but we are not ready to pay for it (which also means not being ready to give up the numerous services abusing it). Therefore, we expect privacy to be a public good: something like roads, public transportation, or protection against crime, funded from our taxes, provided by the government, and protected by law.
What should we do then?
What does this mean for business and for privacy scholars (lawyers, psychologists, sociologists, engineers, computer scientists, and others)? Broad privacy-enhancing efforts should focus on raising individual awareness, which means exerting more political pressure through civic action and pushing the government to act. We should also strive to make the privacy-by-design requirement more broadly adopted, including the adoption of state-of-the-art privacy-enhancing technologies in everyday operations where privacy is put at risk.
In the GDPR, the privacy-by-design requirement (Article 25(1)) stipulates that:
“Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organizational measures, such as pseudonymization, which are designed to implement data-protection principles, such as data minimization, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.”
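To make the quoted provision concrete: pseudonymization, the technical measure the GDPR names explicitly, can be as simple as replacing a direct identifier with a keyed hash before storage. The following is a minimal illustrative sketch, not a compliance recipe; the key, field names, and record are hypothetical examples.

```python
# Illustrative sketch of pseudonymization via keyed hashing (HMAC-SHA256).
# All names and values here are hypothetical examples.
import hmac
import hashlib

# In practice the key must be stored separately from the pseudonymized data,
# so that re-identification requires access to both.
SECRET_KEY = b"hypothetical-key-kept-apart-from-the-data"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash.

    Without the key, the pseudonym cannot easily be traced back to the
    person, yet the same input always maps to the same pseudonym, so
    records can still be linked for legitimate processing purposes.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

raw_record = {"email": "jane.doe@example.com", "page_views": 42}

# Data minimization in the same spirit: store only the pseudonym and the
# data actually needed, never the raw email address.
stored_record = {
    "user": pseudonymize(raw_record["email"]),
    "page_views": raw_record["page_views"],
}
```

The point of the sketch is that such measures operate inside the controller's systems by default, which is exactly why regulators, rather than individual users, end up deciding whether they are adopted.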
Therefore, research and development on privacy-enhancing technologies should be continued and encouraged. But, at the same time, we should be conscious that awareness-building and marketing efforts need to target the government and the relevant supervisory authorities, rather than asking people to equip themselves with privacy-enhancing technologies. It is government officials who will decide what state-of-the-art privacy by design means, and thus which technologies are to be implemented by default. From this perspective, the Privacy Paradox also means that it is not the market that will decide which privacy-enhancing technologies are implemented, since there is no market-driven demand for them, but the perception of government officials.
This blog post was written by Jan Czarnocki. He is a doctoral researcher and Maria Skłodowska-Curie Fellow at the KU Leuven Centre for IT & IP Law. He studies IT law, with a focus on privacy, data protection (biometric data in particular), and digital policy within the PriMa ITN Horizon 2020 project.