Data Protection & Privacy
The Brussels Privacy Hub Working Papers are intended to circulate research in progress for comment and discussion. The Working Papers focus on all areas of data protection and privacy research and can contain empirical research on privacy issues and analytical work on privacy governance and regulation in the EU; global flows of data; reconciling law enforcement and privacy interests; privacy challenges posed by new technologies; comparative research on privacy in different regions; jurisprudential issues of privacy protection; and many others.
Paul De Hert and Christopher Kuner
N°33 From Transparency to Justification: Toward Ex Ante Accountability for AI (May 2022) by Gianclaudio Malgieri and Frank Pasquale
Abstract: At present, policymakers tend to presume that AI used by firms is legal, and only investigate and regulate when there is suspicion of wrongdoing. What if the presumption were flipped? That is, what if a firm had to demonstrate that its AI met clear requirements for security, non-discrimination, accuracy, appropriateness, and correctability before it was deployed? This paper proposes a system of “unlawfulness by default” for AI systems, an ex-ante model in which some AI developers bear the burden of proof to demonstrate that their technology is not discriminatory, not manipulative, not unfair, not inaccurate, and not illegitimate in its legal bases and purposes. The EU’s GDPR and proposed AI Act tend toward a sustainable environment for AI systems. However, they are still too lenient, and the sanction for non-conformity with the Regulation is a monetary sanction, not a prohibition. This paper proposes a pre-approval model in which some AI developers, before launching their systems onto the market, must perform a preliminary risk assessment of their technology followed by a self-certification. If the risk assessment shows that these systems are high-risk, an approval request (to a strict regulatory authority, such as a Data Protection Agency) should follow. In other words, we propose a presumption of unlawfulness for high-risk models, with AI developers bearing the burden of proof to justify why their AI is not illegitimate (and thus not unfair, not discriminatory, and not inaccurate). Such a standard may not seem administrable now, given the widespread and rapid use of AI at firms of all sizes. But such requirements could be applied, at first, to the largest firms’ most troubling practices, and only gradually (if at all) to smaller firms and less menacing practices.
Keywords: AI, Accountability, Justification, GDPR
N°32 Humans in the GDPR and AIA governance of automated and algorithmic systems. Essential pre-requisites against abdicating responsibilities (January 2022) by Guillermo Lazcoz and Paul de Hert
Abstract: The GDPR mandates humans to intervene in different ways in automated decision-making (ADM). Similar human intervention mechanisms can be found amongst the human oversight requirements in the future regulation of AI in the EU. However, Article 22 GDPR has become an unenforceable second-class right, following the fate of its direct precedent, Article 15 of the 1995 Data Protection Directive. Why, then, should European policymakers rely on mandatory human intervention as a governance mechanism for ADM systems? Our approach aims to move away from a view of human intervention as an individual right towards a procedural right that is part of the culture of accountability in the GDPR. The core idea behind making humans meaningfully intervene in ADM is to help controllers comply with regulation and to demonstrate compliance. Yet human intervention alone is not sufficient to achieve appropriate human oversight for these systems. Human intervention will not work without human governance. This is why DPIAs should play a key role before human intervention is introduced and throughout the life-cycle of the system. This approach fits better with the governance model proposed in the Artificial Intelligence Act. Human intervention is not a panacea, but we claim that it should be better understood and integrated into the regulatory ecosystem to achieve appropriate oversight over ADM systems.
Keywords: Human oversight; GDPR; Human intervention; Artificial intelligence; Accountability
N°31 The fundamental right to personal data protection in criminal investigations and proceedings: framing big data policing through the purpose limitation and data minimisation principles of the Directive (EU) 2016/680 (December 2021) by Paul De Hert and Juraj Sajfert
Abstract: The Law Enforcement Directive (EU) 2016/680 (LED) defines its basic principles, such as purpose limitation and data minimisation, differently than the General Data Protection Regulation (EU) 2016/679 (GDPR). This contribution explores the influence of those differences on new policing methods, in particular big data policing. After describing the data protection regulatory framework for law enforcement authorities in the EU, we explain our understanding of the notion of big data policing. We then critically interpret the purpose limitation and data minimisation principles in the GDPR and the LED, thereby busting some myths about the LED created by other academics. Finally, we explore the boundaries of the abovementioned basic LED principles, in an attempt to measure their success in striking the delicate balance between a high level of personal data protection and contemporary law enforcement needs.
Keywords: data protection, Law Enforcement Directive, criminal justice, big data, purpose limitation, data minimisation
N°30 Necessity knows no law in contaminated times: the rule of law under pandemic police and pandemic legislation (‘Nood breekt wet in besmette tijden: de rechtsstatelijkheid van de pandemiepolitie en pandemiewetgeving’) (November 2021) by Paul De Hert
Abstract: The COVID crisis was tackled in Belgium with an emergency procedure of ministerial decrees, relying on the 2007 Civil Security Law, a law actually intended to allow quick and temporary intervention in case of disasters such as large explosions or fires. Despite the Council of State’s findings (“this law allows curfews”), this law is both inappropriate and dangerous: without parliamentary debate, it allows far-reaching restrictions on fundamental rights for long periods of time. In the Netherlands, too, the curfew was socially and legally controversial. There, too, there were calls for new legislation on lockdowns and other police measures in the event of health crises.
The discussions in Belgium and the Netherlands are therefore similar, although the Dutch Constitution and legislation are clearer (though not clear enough) with regard to the possibility of declaring a state of emergency and combating infections on a large scale. Obviously, it makes sense, out of concern for the rule of law, to rewrite the Constitution and legislation. New safeguards for new, far-reaching infectious disease control measures for the unhealthy and healthy parts of the population… Who can be against that?
This contribution is cautious about the usefulness of exemption legislation. Looking at the health crisis, and at older crises such as the 9/11 crisis (terrorism), we pay attention to processes of power accumulation by the government, and the executive in particular, made visible by such a crisis but made possible by inconspicuous processes such as technology thinking (the head) and digitization processes (the practice). This diagnosis draws on insights from Arendt, Ellul, and Jonas about modern, often subtle coercive mechanisms for which the traditional legal-state framework is ill-equipped, although they have been frequently applied in recent years. For this reason, the traditional legal-state framework has been contaminated not only by digitization processes and the growing power of police and government, but also by the mobilization of mechanisms such as fear, peer pressure and techno-hypnosis.
After an exploration of the concepts of individual and collective emergency (sections 1 & 2), I emphasize the international obligation to shape a balanced constitutional and legal system on emergency states. Indeed, fundamental rights oppose a “necessity knows no law” police system (section 3). Smart pandemic laws and other exception laws are sector-specific, time-limited, and based on the requirements of crisis management (including cyclical management) and the rule of law (including a role for reviewing judges) (section 4). Nevertheless, the times are not suitable for building a balanced constitutional and legal system on emergency situations. In the second part of the contribution (sections 5 to 9), I explain what makes these times an unfavorable moment for codifying good practices. There are simply too many bad practices (I use the term “infections”). In that examination of legal infections, I dwell, as I said, on mechanisms of fear, peer pressure, and techno-hypnosis, and on increases in government and police power through digitization and through “ordinary” laws enabling “extraordinary” police powers. Brand new legislation on health crises (‘the Pandemic Law’) turned out to be no more than a plaster on a deeply festering legal wound, and this explains why the ‘new’ Belgian Pandemic Law is only briefly mentioned at the end as a post scriptum.
Keywords: rule of law; crisis; COVID crisis and 9/11; codifying best practices today or waiting; mechanisms of fear, peer pressure, and techno-hypnosis; increases in government and police power through digitization
N°29 Facial recognition, visual and biometric data in the US. Recent, promising developments to regulate intrusive technologies (October 2021) by Paul De Hert and Georgios Bouchagiar
Abstract: Biometric and visual surveillance has taken new forms and dimensions. While private and public actors deploy ever more specific intrusive technologies, the European Union’s approach to the processing of biometric and visual data remains rather abstract and tech-neutral. This working paper discusses various initiatives and regulations in the United States that could become a useful source of inspiration for European audiences. We identify five elements: concreteness of the law when targeting specific technologies, clarity on its scope, precision regarding certain requirements, bans on certain technologies or uses of them, and the organisation of remedies. In our view, these features could be particularly useful and help to protect biometric and visual data more effectively in the European Union.
Keywords: face recognition, visual data, biometric data, surveillance
N°28 Adding and removing elements of the proportionality and necessity test to achieve desired outcomes. Breyer and the necessity to end anonymity of cell phone users (September 2021) by Paul De Hert and Georgios Bouchagiar
Abstract: Case of Breyer v Germany Application no 50001/12 (ECtHR, 30 January 2020)
The Breyer judgment concerns the storage of subscriber data by telecommunications service providers. To the Court, the collection and storage of such data amounted to interference of a rather limited nature. Additional safeguards were provided in the relevant German laws and there was independent supervision by the data protection authorities. The German lawmaker had not exceeded the margin of appreciation. There had been no violation of Article 8 of the European Convention on Human Rights.
Keywords: Breyer, subscriber data, telecommunications service providers, right to privacy
N°27 Fashion ID and Decisively Influencing Facebook Plugins: A Fair Approach to Single and Joint Controllership (June 2021) by Paul De Hert and Georgios Bouchagiar
Abstract: Case C‑40/17 Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV, OJ C319/2
1. Consumer-protection associations can bring legal proceedings for alleged infringements of the right to the protection of personal data.
2. The website operator, who embeds a plugin that can enable the website-visitor’s browser to request content from the provider of that plugin and, to that end, to transmit personal data to the provider of the plugin, can be seen as a controller; liability is limited to the particular processing operation(s) that she actually determines.
3. Where the website operator embeds a plugin that can enable the website-visitor’s browser to request content from the provider of that plugin and, to that end, to transmit personal data to the provider of the plugin, each actor (the website operator and the provider) must pursue a legitimate interest via the particular processing operations that are codetermined.
4. Where the website operator embeds a plugin that can enable the website-visitor’s browser to request content from the provider of that plugin and, to that end, to transmit personal data to the provider of the plugin, it is the website operator who is bound by the duties to obtain consent and to inform the data subject –regarding the processing operation(s) that that website operator actually determines.
Keywords: Fashion ID, Facebook, joint controllership, like button
N°26 European Law Enforcement and US Data Companies: A Decade of Cooperation Free from Law (September 2020) by Angela Aguinaldo and Paul De Hert
Abstract: Online evidence has become indispensable in criminal matters, but due to its transnational and volatile nature, there have been issues and challenges regarding its access, transfer, and usage in criminal investigations and prosecutions. In recent years, practices have been established to overcome the hurdles of cross-border access to online evidence. One of these practices is direct cooperation between law enforcement authorities and data companies, the latter of which are mostly based in the US. While this cooperation was less blatant and apparent in its earlier years for want of a legal basis, law enforcement authorities have become less coy towards the practice more recently. The present contribution walks the reader through recent developments in codifying the practice of direct cooperation between European law enforcement authorities and US data companies. These developments evince how law enforcement authorities are willingly and wittingly overlooking protective safeguards and issues that ought to be addressed and thoroughly discussed. By sanctioning a relationship of direct cooperation, not only are state interests affected, but so are issues of trust, MLA rights, privacy and data protection. There ought to be a thorough discussion of these issues, and hopefully the lessons learned from the recent judgments of the CJEU and the German Federal Constitutional Court will be taken into consideration.
Keywords: US CLOUD Act, European Union, Council of Europe, cross-border access to online evidence, online evidence, mutual legal assistance, data protection, privacy, Schrems 1, Schrems 2, data companies, direct cooperation, unilateralism, jurisdictional expansion
N°25 The Dark Side of the MOOC? The Rise of EdTech in Times of COVID-19: A Data Protection Challenge for Universities (August 2020) by Jonas Botta, Postdoctoral Researcher at the German Research Institute for Public Administration and Reader in Human Rights Law at the Berlin School of Economics and Law
Abstract: The dramatic spread of COVID-19 is causing a profound upheaval in education. Almost overnight, there has been an unprecedented need for educational technologies (“EdTech”) to compensate for the loss of “face-to-face” teaching at schools and universities. Especially for universities, it makes sense to benefit from the numerous offers of e-learning platforms, mainly so-called Massive Open Online Courses (MOOCs). After all, the choice of topics for online courses is extremely diverse, from introductory courses in programming languages such as Java or Python to modules on the work of William Shakespeare and units on the legislative mechanisms in the EU Multi-Level System.
However, if, for instance, universities want to make EdTech such as MOOCs available to their students as soon as possible, they will not only have to deal with financial and educational but also legal challenges. The virtual seminar room gives rise to completely new questions regarding the privacy of students: Which data protection standards must EU universities observe within the scope of the General Data Protection Regulation if they want to process user data for teaching or research purposes? Which data protection provisions can be invoked by universities if they want to make online studies mandatory? Could universities possibly be liable for data protection violations by the MOOC providers? These are the questions this working paper aims to answer.
Keywords: (Joint) controllership, data processing for scientific research purposes, freely given consent, GDPR, e-learning platforms, Massive Open Online Courses (MOOCs)
N°24 Individuation: re-imagining data privacy laws to protect against digital harms (July 2020) by Anna Johnston, Principal of Salinger Privacy
Abstract: Most data protection and privacy laws turn on the identifiability of an individual as the threshold criterion for when data subjects need legal protection. However, I argue that privacy harms can also arise from individuation: the ability to disambiguate or ‘single out’ a person in the crowd, such that they could, at an individual level, be tracked, profiled, targeted, contacted, or subject to a decision or action which impacts upon them, even if that individual’s ‘identity’ is not known (or knowable). I conclude that data protection and privacy laws need a re-think and re-design in order to reflect the reality of the digital environment, and to protect people from digital harms.
First, I will show that ‘not identifiable’ is no longer an effective proxy for ‘will suffer no privacy harm’. Second, I will argue that even the GDPR’s mention of ‘singling out’ is not sufficient to encompass harms arising from individuation. Third, I will demonstrate how some post-GDPR laws and statutory instruments have taken a more expansive approach to threshold criteria, to incorporate individuation. Finally, I will outline a six-part approach which could be taken by legislators to ensure that new or reformed laws robustly protect against digital harms, while avoiding some of the pitfalls demonstrated in the drafting of the CCPA.
Keywords: Personal data, personal information, identifiability, individuation, profiling, GDPR, CCPA, data protection, privacy, AdTech
N°23 Logic and Key Points of China’s Cybersecurity Review Measures (June 2020) by Yanqing Hong, Senior Fellow, Law and Development Institute, Peking University of China, Edited by Vagelis Papakonstantinou, Brussels Privacy Hub
Abstract: China’s Cybersecurity Review Measures (“Review Measures”) were released on April 13, 2020 and will take effect on June 1, 2020. The Review Measures will replace the Network Product and Service Security Review Measures (Trial) that have been in effect since 2017. From the initial trial to the final version, the Review Measures have been gradually condensed and refined over a three-year period of practice and exploration. This document uses 5G security as an example to analyze the logic and key points of the Review Measures.
Keywords: China Cybersecurity Act, China Cybersecurity Review Measures, 5G security
N°22 The “Ethification” of Privacy and Data Protection Law in the European Union. The Case of Artificial Intelligence (May 2020) by Niels van Dijk and Simone Casiraghi
This article has been revised and published as a peer-reviewed article.
Abstract: Several European Commission initiatives have resorted to ethics as a means to protect individuals from the risks posed by emerging technologies and as a way to govern and regulate those same fields of innovation. The proliferation of invocations of “ethics” and “ethical principles/values” in legal and policy discourse, as well as the growing importance of ethical expertise, ethics committees, ethical advisory groups and boards, and ethical guidelines and principles, can be referred to as the “ethification” phenomenon. While originally limited to the fields of life and medical sciences (in particular bioethics), this increasing propagation of ethics has recently been observed in the field of data protection law, especially concerning the recent European Union (EU) initiatives on (the regulation of) Artificial Intelligence (AI). This working paper aims to explore and shed light on where, and through which means, ethics is claiming authority and autonomy from data protection law as a separate field and regulation strategy. First, it provides a topological mapping to locate where the ethics work is being produced in the EU. Second, the authors elaborate a typology of ethics based on that mapping. Third, the effects of this ethification phenomenon on data protection law and AI regulation are analyzed through the concept of boundary work, highlighting how ethics, on the one hand, traces boundaries to claim autonomy from the law and, on the other, obfuscates these boundaries when it comes to giving foundations to its practice. The aim is to elucidate the benefits and drawbacks of the ethification of data protection and privacy law, and its effects on the articulations of law, technology and ethics in democratic constitutional states.
Keywords: Ethics, Ethification; Privacy; General Data Protection Regulation; Innovation Governance; Artificial Intelligence; European Commission; Horizon 2020; Science and Technology Studies; Boundary Work
N°21 Article 8 ECHR compliant and foreseeable surveillance: the ECtHR’s expanded legality requirement copied by the CJEU. A discussion of European surveillance case law (April 2020) by Paul De Hert & Gianclaudio Malgieri
Abstract: The Strasbourg-based European Court of Human Rights has a long record of cases dealing with surveillance, starting with Klass v. Germany (1978). In Klass the Court explicitly accepted the necessity of secret surveillance performed by public authorities in European post-World War II democracies, provided certain victim and legality requirements deduced from Articles 8 and 13 of the 1950 European Convention on Human Rights (ECHR) are respected. Building on this premise, the Court proposed several important guidelines for lawful and human rights-compatible surveillance, which taken together build up to a comprehensive framework answering equally to questions about power divisions and checks on potential power abuse. Today there is a vast body of case law developed by the ECtHR and the European Union Court of Justice (hereafter: CJEU) that confirms and adapts these guidelines, often in view of addressing recent technological developments (e.g. GPS surveillance) or institutional developments (e.g. overlap between police and secret services). In this article we focus on developments with regard to the legality principle in the context of surveillance in the realm of criminal law and intelligence work by secret services. A more rigorous interpretation of the legality principle in post-Klass surveillance case law certainly qualifies as one of the most remarkable developments in the European Courts’ case law on surveillance. In particular, we show that the strict approach towards the legality requirement enshrined in Article 8 ECHR, adopted by the ECtHR in Huvig (1990) in the context of telephone surveillance, was then re-applied in all the following judgments of the Strasbourg Court and even adopted by the CJEU (from Digital Rights Ireland on) in the context of other surveillance practices.
N°20 The Proposed ePrivacy Regulation: The Commission’s and the Parliament’s Drafts at a Crossroads? (March 2020) by Elena Gil González, Paul De Hert & Vagelis Papakonstantinou
Abstract: The EU’s Digital Single Market Strategy aims to increase trust and security in digital services. A reform of the EU personal data protection regulatory framework through the introduction of the General Data Protection Regulation (GDPR) was a key step towards increasing trust in the security of digital services. Following the reform of the GDPR, the strategy also includes the review of the ePrivacy Directive (Directive 2002/58/EC). Indeed, on 10 January 2017, the European Commission presented a proposal for an ePrivacy Regulation intended to enter into force on 25 May 2018, simultaneously with the GDPR. However, this ambitious timeline has suffered delays and the proposal is currently going through the European Union legislative process. On 26 October 2017, the European Parliament voted in plenary session in favour of the amendments proposed by the Committee on Civil Liberties, Justice and Home Affairs (LIBE). This chapter highlights specific aspects of the ePrivacy Regulation draft, in its Summer 2019 state, to shed light upon certain of its most important elements. While the new Commission after the elections of June 2019 awaits appointment, we consider it important, at this stage of the law-making process, to take a photograph of developments so far, which include the Commission’s original draft and the Parliament’s response (the Council is yet to provide its final position). In this way, future comparisons with the final wording, and the reasoning behind it, will be facilitated.
N°19 Access to the Internet in the EU: a policy priority, a fundamental, a human right or a concern for eGovernment? (February 2020) by Lina Jasmontaite and Paul de Hert
Abstract: After outlining the relevant regulatory provisions governing access to the Internet in the EU (section 2) and its Member States (section 3), and after summarizing arguments supporting the introduction of the right to Internet access, the authors seek to broaden the scope of social and legal debates on Internet access in the EU. In particular, they question (a) whether the Internet is a vital element to achieve a decent standard of living in the Gigabit society (section 4); and (b) whether it deserves a place alongside the fundamental rights or human rights (section 5) and under what conditions it could be incorporated among the EU fundamental rights (section 6). The following sections of the chapter reflect on the potential scope of a right to Internet access (sections 7 and 8) and how eGovernment could facilitate the introduction of such a right (section 9). Considerations about limitations of a right to Internet access are addressed in section 10.
Access to the Internet is inherently an Internet governance matter and therefore its regulation should entail a multi-stakeholder debate. Access to the Internet would then be seen not only in a technical way as a communication service but as ‘the set of devices, services, facilities and skills that allow people to connect to and use Internet services, applications and content’. Perhaps this shift in approach could strengthen the EU’s role within the broader context of Internet governance.
The authors suggest that the EU debate on Internet access should employ a human rights-based approach to Internet access because the social benefits brought by the Internet cannot be defined by numbers. The authors conclude that acknowledgment or recognition of Internet access as a fundamental right would be valuable as it would encourage policy- and law-makers, as well as civil society, to reconsider the scope and limitations imposed on this right.
N°18 Challenging algorithmic profiling: The limits of data protection and anti-discrimination in responding to emergent discrimination (January 2020) by Dr Monique Mann and Professor Tobias Matzner
Abstract: The potential for biases being built into algorithms has been known for some time (e.g., Friedman and Nissenbaum, 1996), yet literature has only recently demonstrated the ways algorithmic profiling can result in social sorting and harm marginalized groups (e.g., Browne 2015; Eubanks 2018; Noble 2018). We contend that with increased algorithmic complexity, biases will become more sophisticated and difficult to identify, control for, or contest. Our argument has four steps: first, we show how harnessing algorithms means that data gathered at a particular place and time relating to specific persons, can be used to build group models applied in different contexts to different persons. Thus, privacy and data protection rights, with their focus on individuals (Coll, 2014; Parsons, 2015), do not protect from the discriminatory potential of algorithmic profiling. Second, we explore the idea that anti-discrimination regulation may be more promising, but acknowledge limitations. Third, we argue that in order to harness anti-discrimination regulation, it needs to confront emergent forms of discrimination or risk creating new invisibilities, including invisibility from existing safeguards. Finally, we outline suggestions to address emergent forms of discrimination and exclusionary invisibilities via intersectional and post-colonial analysis.
Keywords: Algorithms, profiling, GDPR, data protection, discrimination, intersectionality
N°17 Data Localisation: Deconstructing myths and suggesting a workable model for the future. The cases of China and the EU (August 2019) Author: Yanqing Hong, Senior Fellow, Law and Development Institute, Peking University of China; Edited by Vagelis Papakonstantinou, Brussels Privacy Hub
Abstract: Data localization is highly controversial, being ultimately connected to the topic of national sovereignty in the age of the internet. Opponents believe that it constitutes a trade barrier and undermines global connectivity. Supporters point out that states need to exercise control over data as a matter of national security. It remains doubtful whether these differences can ever be bridged, because each side’s argumentation can be traced back to basic state theory: the internet only extrapolates onto modern digital circumstances old arguments about the role of states, the rights and freedoms of individuals, global cooperation and free trade. A number of popular myths further complicate understanding. Faced with differences in political and even philosophical approaches, this paper aims to dispel misunderstandings and present a workable and realistic model for data localisation exercises based on a “reasonable limitation” principle for the local storage of data.
Keywords: Data localization, data sovereignty, cross-border data flows, the principle of reasonable limitation for the local storage of data
N°16 Big data analytics in electronic communications: A reality in need of granular regulation (even if this includes an interim period of no regulation at all) (June 2019) by Vagelis Papakonstantinou & Paul de Hert
Abstract: Over the past few years big data analytics have forcefully entered the mainstream. Admittedly, modern life would be inconceivable without the services afforded by this type of processing in the field of electronic communications. At the same time, public administrations are increasingly discovering the benefits of the big data analytics afforded to them by telecommunications operators. Nevertheless, despite public attention and high volumes of expert analyses, the majority of approaches to the challenges this type of data processing poses to personal data protection remain theoretical. Tellingly, the EDPS speaks of the “black box” of big data analytics. However, the authors were able to open, and stare into, the “black box” of big data analytics in the electronic communications field in 2017 and 2018 in the context of GDPR compliance assessments. Their analysis first attempts to set the legal scene today, answering two crucial questions on scope and applicable law, before presenting a typology for a scalable and granular approach that the authors feel is necessary but nevertheless missing from the text of the draft ePrivacy Regulation. The authors therefore conclude that processing requirements and particularities, as evidenced under the big data analytics paradigm, make necessary a much more detailed approach than the one afforded by the draft ePrivacy Regulation today. Until these needs are met, through the introduction of a new, fundamentally amended text, the authors suggest that the current regulatory framework and the mechanisms afforded by it be extended for an interim period, so as to afford legislators the necessary space and time to revise their work.
Keywords: ePrivacy Directive, ePrivacy Regulation, metadata, big data analytics
N°15 Belgium, Courts, Privacy and Data Protection. An inventory of Belgian case law from the pre-GDPR regime (1995-2015) (January 2019) by Paul De Hert
Abstract: This contribution focuses on the use made by the Belgian Constitutional Court, the Cour de Cassation and the ordinary courts of the right to privacy and the right to have personal data protected as anchored in the Belgian Constitution, the Belgian Data Protection Act and the European sources. A selection of their judgements, all dating from the era before the new EU Data Protection Regulation, are discussed along the lines of their impact on health privacy, workplace privacy, surveillance and social media privacy. Our analysis shows a great deal of European loyalty on behalf of the Belgian Constitutional Court towards European trends to favour privacy and data protection. In stark contrast stands the case law of the Cour de Cassation, mainly focused on preserving prosecutorial interests and employers’ interests to the detriment of privacy and data protection interests. In our conclusions we discuss tendencies towards cosmopolitanism and tribalism, the dramatic impact of evidence law and patterns of litigation.
Our analysis covers the data protection era where Belgian law was indirectly governed by EU Directive 95/46/EC of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (OJ L 281, 23 November 1995, 31). The Directive contributed to the roll-out of data protection and harmonized the data protection provisions in the EU Member States but suffered from implementation weaknesses and lack of recognition. A certain lack of recognition of the importance of data protection in the European (and Belgian legal) landscape disappeared with the EU General Data Protection Regulation 2016/679 (“GDPR”) (OJ L 119, 5 May 2016, 1–88) that repealed Directive 95/46/EC and came into force on 25 May 2018 with directly applicable provisions. Further studies are needed to assess the impact of the new European provisions on the work and output of the Belgian courts.
N°14 Enforcement in Indonesia Data Privacy Laws: The 2018 Facebook-Cambridge Analytica scandal as a case study (October 2018) by Anbar Jayadi
Abstract: The Facebook-Cambridge Analytica scandal is an issue-in-progress in Indonesia. Indonesia is the third country that had its Facebook users’ data allegedly ‘improperly shared’ with Cambridge Analytica. This raises concerns over how Facebook (Indonesia) handles the data of its Indonesian users and its transparency to the Indonesian public. Class-action lawsuits, parliamentary hearings, and warning letters have all been initiated. Nevertheless, to this date, the Indonesian authorities seem at a loss. The problem is twofold: coherent data protection legislation is absent (as of the writing of this paper), and law enforcement mechanisms are weak. Academic literature and reports have already discussed, in general, the existing and upcoming data protection legal framework in Indonesia. Therefore, this paper is more interested in exploring how Indonesian law enforcement works when it comes to data privacy cases, using the Facebook-Cambridge Analytica scandal as a case study. The identified issues are the failure to protect the secrecy of personal data and the exercise of the data subject’s right to sue for compensation. This paper concludes with an evaluation of forthcoming data protection challenges for Indonesia.
N°13 Big data analytics by telecommunications operators and the draft ePrivacy Regulation (September 2018) by Vagelis Papakonstantinou & Paul de Hert
Abstract: Big data analytics has been defined by the EDPS, under a common denominator approach, as the practice of combining huge volumes of diversely sourced information and analysing them, using more sophisticated algorithms to inform decisions. Notwithstanding the discussion whether personal data constitute a new asset for companies, the fact remains that organisations find new value through constant re-processing of personal data either already in their possession or coming from different third parties. Indeed, over the past few years, big data analytics have forcefully entered the mainstream. However, despite public attention and high volumes of expert analyses, which customarily focus on challenges to personal data protection by similar operations, the vast majority of approaches remains purely theoretical. The aim of this Working Paper is to map and analyse a specific field, focusing on what is actually taking place: which big data analytics operations are carried out within telecommunications organisations today.
N°12 Understanding the balancing act behind the legitimate interest of the controller ground: a pragmatic approach (August 2018) by Irene Kamara and Paul De Hert
This is a Working Paper (version: May 2017). Please refer to the final published version: Kamara, I., & De Hert, P. (2018). Understanding the Balancing Act behind the Legitimate Interest of the Controller Ground. In E. Selinger, J. Polonetsky, & O. Tene (eds.), The Cambridge Handbook of Consumer Privacy (pp. 321-352). Cambridge: Cambridge University Press. doi: 10.1017/9781316831960.019
Abstract: The General Data Protection Regulation provides new tools and concepts such as Data Protection Impact Assessments, accountability and certification, but to a large extent retains the rationale of the Data Protection Directive for a principles-driven legislation. One of the cornerstones of both the reformed and new EU data protection legislation is the grounds for lawful processing.
Much debate has taken place over consent and the conditions for a meaningful informed choice of the data subject, while other grounds have not been in the spotlight of academia and practitioners. The legitimate interest of the controller has been one of the least discussed legal grounds for lawful processing, with a few exceptions, mainly the opinion of the Article 29 Data Protection Working Party, despite its significance as an equally binding ground for processing. This contribution analyses the concept of legitimate interest of the controller of art. 6(1)(f) GDPR in relation to art. 7(f) of the Data Protection Directive 95/46/EC and the interpretations of the concept by the Court of Justice of the EU and the Article 29 Data Protection Working Party.
N°11 The Microsoft Ireland case and the cyberspace sovereignty trilemma. Post-territorial technologies and companies question territorial state sovereignty and regulatory state monopolies (July 2018) by Paul de Hert & Johannes Thumfart
A final version of this paper has been published as Paul De Hert & Johannes Thumfart , ‘The Microsoft Ireland Case, The Cloud Act and the Cyberspace Sovereignty Trilemma. Post-Territorial Technologies and Companies Question Regulatory State Monopolies’, in Walter Hötzendorfer, Christof Tschohl & Franz Kummer (eds.), International Trends in Legal Informatics. Festschrift for Erich Schweighofer, Bern: Weblaw AG, 2020, 373-418. ISBN 978-3-96698-588-8
Abstract: The Microsoft Ireland case, brought before the Supreme Court in 2018 and dropped the very same year, has attracted attention worldwide from policymakers and scholars. This contribution focuses on two important features of the case: the conflicting and often chaotic approaches to the notion of sovereignty of many of the players, and the remarkable move of a private company to trigger regulation in a world where companies, technologies, data flows and governments transgress borders with growing acceptance of the inadequacy of older territorial comprehensions of the world order.
Keywords: sovereignty, territoriality, cyberspace, Microsoft Ireland case, cyber sovereignty, Internet
N°10 “Does Technology Drive Law? The Dilemma Of Technological Exceptionalism In Cyberlaw” (July 2017) by Meg Leta Jones, JD, PhD
This working paper is a first draft of a paper that will be subject to change following dissemination and peer-review
Abstract: Seemingly plagued by newness, the law, it is often claimed, cannot keep up with new technology. Digital technologies have only reinforced the legitimacy of this now well-established idiom. The sentiment has gone unchecked for decades, even in light of social and historical research that reveals the cultural nature of technology. In the field of law and technology (cyberlaw), the theory of technological exceptionalism is used to measure whether new technologies are transformative enough to uproot existing legal foundations. This article is an attempt to disconfirm technological exceptionalism as a viable theory for cyberlaw research and policymaking by analyzing a number of information and communication technologies often labeled “exceptional,” including the printing press, the internet, photographic cameras, computers, and drones. If technologies can be exceptional – if their attributes drive social change and laws – the same linear pattern should appear across cultures where the technology is introduced: a technology enters society and allows for certain activities that place significant strains on social orders, existing law and legal concepts are applied but fall short, and necessary changes are made to account for the new technological capabilities. Because the theory of technological exceptionalism does not hold up – because the story of law and technological change is much more varied, messy, and political – it should be discarded and new theories of and approaches to law and technological change, such as the legal construction of technology, should be pursued.
N°9 “European Human Rights, Criminal Surveillance, and Intelligence Surveillance: Towards “Good Enough” Oversight, Preferably but Not Necessarily by Judges” (March 2017) by Gianclaudio Malgieri & Paul De Hert
This contribution is a Chapter in David C. Gray & Stephen Henderson (eds.), The Cambridge Handbook on Surveillance, New York: Cambridge University Press, 2017
Abstract: The two European Courts (the European Court of Human Rights, ECtHR and, to a lesser degree, the European Union Court of Justice, EUCJ) have contributed greatly to the development of a legal framework for surveillance by either law enforcement agencies in the criminal law area or by secret services. Both courts put great emphasis on a system of control ex ante and post hoc by independent supervisory authorities. A complex and controversial issue remains whether the human rights to privacy, respect of communications, and to an effective remedy (enshrined in Articles 8 and 13 of the European Convention on Human Rights (ECHR)) require judicial review as a necessary safeguard for secret surveillance or, alternatively, under which conditions parallel systems of non-judicial review can be accepted as adequate safeguards against illegitimate interference in citizens’ private life.
The European Courts have not yet established a clear doctrine in determining suitable thresholds and parameters. In particular, the ECtHR has a flexible approach in interpreting articles 8 and 13 ECHR, depending on several factors (“vital” interests at stake, political considerations, etc.). In general terms, the Court has shown a preference for judicial oversight, but in the European legal order there are several examples of alternative oversight systems assessed positively by the Court, such as quasi-judicial systems (where the independence of the supervisory body, its wide jurisdiction, its power to access data and its power to react effectively are established) or the system of oversight set up by Data Protection Authorities in the EU member states. However, in recent judgements of the ECtHR and the EUCJ we see an increasing emphasis on declaring the necessity of a “good enough” judicial (ex ante or post hoc) control over surveillance, meaning not simply a judicial control, but a system of oversight (judicial, quasi-judicial, hybrid) which can provide effective control over surveillance, supported by empirical checks in the national legal system at issue.
Keywords: Privacy, Surveillance, judicial review, European Court of Human Rights, European Convention on Human Rights
N°8 “The “Right to be Forgotten” and Search Engine Liability” (December 2016) by Hiroshi Miyashita
Abstract: This paper aims to conduct a comparative study of the right to be forgotten by analyzing the different approaches to intermediary liability. In the EU, the Google Spain case in the Court of Justice clarified the liability of search engines on the ground of the data controller’s responsibility to delist certain search results in light of the fundamental rights of privacy and data protection. On the contrary, in the U.S., search engine liability is broadly exempted under the Communications Decency Act in terms of the free speech doctrine. In Japan, intermediary liability is not completely settled, as judicial decisions in right to be forgotten cases are divided on the point of search engine liability.
The legal framework of intermediary liability varies in context from privacy to e-commerce and intellectual property. In the wake of the right to be forgotten case in the EU, it is important to streamline the different legal models of intermediary liability if one desires to determine the reach of the right to be forgotten’s effect. This paper argues that the models of search engine liability are now in flux across borders, but should be reconciled by striking an appropriate balance between privacy and free speech through the right to be forgotten cases.
Keywords: Privacy, Data Protection, Right to be Forgotten, Search Engine, Intermediary Liability
N°7 “Structure and Enforcement of Data Privacy Law in South Korea” (October 2016) by Haksoo Ko, John Leitner, Eunsoo Kim and Jong-Gu Jung (20 pages)
Abstract: South Korea’s data privacy law has evolved rapidly, in particular during the past several years, despite a short history of relevant legislation and enforcement. South Korea’s data privacy law has exceedingly stringent consent requirements. In addition to consent, there are many other statutory provisions with onerous requirements, arguably making the overall data privacy law regime in South Korea one of the strictest in the world. South Korea’s data privacy law, in particular the Personal Information Protection Act (the PIPA), has a similar structure to the EU’s data privacy law. However, the overall legal regime for data privacy and also its enforcement mechanism reveal South Korea’s unique characteristics and its weaknesses. In terms of the overall legal regime for data privacy, one interesting characteristic is that, in addition to the PIPA, an omnibus data privacy statute, there are multiple additional statutes governing data privacy issues for specific sectors or industries. In terms of the enforcement of data privacy law, a multitude of government agencies and institutions are in charge. Thus, depending on applicable statutes and other factors, different agencies or institutions could be in charge. Issues of data privacy have gained notable traction in recent years in South Korea and, perhaps reflecting this phenomenon, relevant laws and regulations have been amended frequently. A notable trend is to strengthen penalty provisions; in particular, the maximum amount of the administrative fine is now set at 3% of relevant sales revenue. It remains to be seen if heightened penalty provisions will indeed help address data privacy concerns in a meaningful manner.
Keywords: Data privacy, South Korea’s data privacy law, Personal Information Protection Act
N°6 “Permissions and Prohibitions in Data Protection Jurisdiction” (May 2016) by Mistale Taylor
Abstract: Under public international law, a State has a right to exercise jurisdiction and is expected to show restraint when applying extraterritorial jurisdiction. The EU’s Data Protection Directive is far-reaching and has notable effects beyond its territory. The General Data Protection Regulation could serve to broaden these external effects. This expansive application of prescriptive jurisdiction has caused jurisdictional tensions between, for instance, the EU and the US. EU data protection law could conceivably fall into traditional public international law permissive principles of jurisdiction, such as subjective territoriality, objective territoriality, passive personality or the effects doctrine. Whilst there appears to be a shift from territory to personality in European data protection law, territory is still necessary to trigger the application of jurisdiction. The demarcations provided by public international law could offer ways to mitigate transatlantic conflicts in jurisdiction.
Keywords: jurisdiction – data protection – public international law – extraterritoriality
N°5 “The right to privacy and personal data protection in Brazil: time for internet privacy rights?” (February 2016) by Vinícius Borges
Abstract: The Brazilian Internet Bill of Rights, called ‘Marco Civil da Internet’, instituted various principles and parameters for Internet regulation in Brazil. There is however a persistent gap in the Brazilian legal system concerning laws and infrastructure for the effective guarantee of the right to data protection online, coupled with the absence of specific conceptual precision on the notion of privacy on the Internet. In this context, this paper examines the convenience of using the innovative concept of ‘Internet privacy rights’, composed of four rights. The study concludes that the express reception of such Internet privacy rights by the laws that govern the Internet and related topics in Brazil, especially those that regulate or will regulate the protection of personal data in the country, allows the redefinition of the core of the fundamental right to privacy, where only the protection of private life, honour, intimacy and image are currently considered. Ultimately, it argues that Internet privacy rights shall be regarded as included in the core of the fundamental right to privacy in the Brazilian legal system.
Keywords: fundamental rights, Internet, Internet privacy rights, personal data protection, privacy
N°4 “The data protection regime in China” (November 2015) by Paul De Hert and Vagelis Papakonstantinou
Abstract: This in-depth analysis was commissioned by the European Parliament’s Policy Department for Citizens’ Rights and Constitutional Affairs at the request of the LIBE Committee.
One cannot talk of a proper data protection regime in China, at least not as it is perceived in the EU. The international data protection fundamentals that may be derived from all relevant regulatory instruments in force today, namely the personal data processing principles and the individual rights to information, access and rectification, are not unequivocally granted under Chinese law. An efficient enforcement mechanism, also required under European standards, is equally not provided for. China has no comprehensive data protection act but several relevant sectorial laws that, under a combined reading together with basic criminal and civil law provisions, may add up to a data protection ‘cumulative effect’. This assertion is examined and assessed in the analysis that follows. A list of realistic policy recommendations has been drawn up in order to establish whether China’s recent data protection effort is part of a persistent, yet concise, policy.
Keywords: data protection, China
N°3 “Towards efficient cooperation between supervisory authorities in the area of data privacy law” (October 2015) by Dariusz Kloza, Antonella Galetta
Abstract: As research conducted in the framework of the PHAEDRA project (Improving Practical and Helpful cooperAtion betweEn Data Protection Authorities, 2013-2015) demonstrated, numerous cross-jurisdictional cooperation initiatives in the area of data privacy have flourished in recent decades at bilateral, regional, supranational and international levels. However, it was also determined that these initiatives are still too immature to reach their final aim, i.e. the efficient protection of data privacy in matters producing implications in more than one jurisdiction. Therefore, this contribution discusses how to make such cooperation more efficient and how this goal could be achieved. A set of 23 legal and practical recommendations that might help both policy-makers and supervisory authorities overcome contemporary inefficiencies is proposed, including a modest action plan to that end. As a conclusion, a line is drawn between binding and non-binding types of cooperation.
Keywords: privacy, personal data protection, data privacy, data protection authorities, cooperation, enforcement, General Data Protection Regulation
N°2 “The new cloud computing ISO/IEC 27018 standard through the lens of the EU legislation on data protection” (November 2014) by Paul de Hert, Vagelis Papakonstantinou, Irene Kamara
Abstract: At a time when the cloud computing industry is developing rapidly, mainly due to the flexibility and the cost minimization cloud computing offers, ISO and IEC developed a new standard on cloud computing to deal with issues of protection of PII and security of information. The new standard aims to address the downsides of cloud computing and the concerns of cloud clients, mainly the lack of trust and transparency, by developing controls and recommendations for cloud service providers acting as PII processors.
The article examines the strengths and weaknesses of the new standard, its added value to the cloud computing landscape and to data protection, as well as its relation to the European Personal Data Protection framework.
Keywords: cloud computing, standardisation, ISO, personal data, security, confidentiality
N°1 “The data protection regime applying to the inter-agency cooperation and future architecture of the EU criminal justice and law enforcement area” (November 2014) by Paul De Hert and Vagelis Papakonstantinou
Abstract: This study aims, first, at identifying data protection shortcomings in the inter-agency cooperation regime in the EU criminal justice and law enforcement area and, second, at outlining, under six possible scenarios, the interplay among the data protection legal instruments in the law-making process in the field today, as well as the response each could provide to such shortcomings.
Keywords: Data Protection in the EU criminal justice and law enforcement area, Europol, Eurojust, EPPO, OLAF