This paper is part of the e-conference on “Data Protection Issues and Covid-19: Comparative Perspectives”, which consists of a daily publication at 12 p.m. (GMT+1), except on Sundays, until the summer break. A new session will start at the beginning of the academic year 2020-21. Please subscribe to blogdroiteuropeen so you don’t miss a publication. This e-conference was organised by Dr. Yseult Marique, Senior Lecturer at the University of Essex and FÖV Speyer, and Dr. Olivia Tambou, Associate Professor at the University of Paris-Dauphine, External Scientific Fellow at the Max Planck Institute of Luxembourg, and Founder-Editor of Blogdroiteuropeen. If you would like to contribute to our September session, feel free to contact us at firstname.lastname@example.org.
Tracing apps. As in many countries around the world, commercial, state, and university communities in Canada have mobilized to develop contact-tracing apps that can be used to better understand and monitor the development of the pandemic that has been plaguing us for more than three months now. As elsewhere, governments have reached out to technology in an attempt either to contain the phenomenon or to minimize the devastating effects of lockdown.
Is it effective? Before going further, we need to return to a question that is often raised: does this type of app work? Like many commentators on these questions, I tend to take a “deflationary perspective” towards technologies that sometimes dazzle us with the potential they claim to have, without their drawbacks being measured. As for the effectiveness of these many applications, to be honest, I have no idea, having no competence to weigh the pros and cons. It is true that there is sometimes a certain mirage associated with what technologies can accomplish. Many people have taken up the now-consecrated expression of Morozov’s “technological solutionism”, inviting us to be wary of the potential of new technologies, which is either overestimated or misunderstood. It is also true that the results in some countries that adopted these technologies earlier seem rather “moderate”. That said, the fact that some technologies have been useless or even harmful in the past does not mean that the next ones will be. “Technological solutionism” is therefore not a religion; at best, it is an invitation to moderation: as a matter of principle, history should not be used to dictate the future. We are therefore, undoubtedly, in a period of many doubts; doubts that should not prevent us from trying, as long as the harms associated with these technologies are kept under control. And precisely in terms of privacy, such applications can be controlled; the risks can be managed. In some respects, if there are risks, they relate to issues other than privacy.
Lessons learned. This is a new situation. First, of course, we are faced with solutions that have never really been tested. In each country, the apps remain experimental, and their effectiveness will only be measurable several months after the current crisis. Moreover, this novelty is imposed by the urgency of the situation, which does not necessarily allow the processes to be validated with the usual rigor. Despite this leap into the unknown, it is possible to identify several ethical and legal lessons. We will identify four of them, without claiming to be exhaustive.
1 – The public debate and its limits
Control of public debate. If we take the example of the COVI app and its lack of success with the Canadian government, it is striking that its developers tried to play the openness card, engaging with the media (TV, radio, web) to explain and present the application. In terms of internal procedures, particularly comprehensive technical documents (60 pages) were put online to explain the measures taken and the limits placed on data processing. Paradoxically, by working in this way, the application exposed itself and attracted various criticisms. In contrast, the competing application (COVID Shield) followed a “low profile” approach, limiting itself to discussions with state authorities. Indeed, the latter offers only very succinct, sparsely explained public documents, which reveal little about the application (even if we know that it uses a decentralized data-management system). For COVI, therefore, this open approach had a chilling effect that is likely to discourage those who would wish to follow the same path in the future. For while many argue for the importance of a public debate on the use of artificial intelligence, an organization carrying such a project runs the risk of being criticized in precisely this way.
2 – Functions of consent
Consent and communication. Many people have developed a suspicion of the online consents we conclude every day without reading, understanding, or adhering to them. The example of the major international platforms is on everyone’s mind: who really understands the Facebook contract, which we know is so complex that its protective function is undermined? With these health-control applications, however, the situation is different. First, all the Canadian applications require the express consent of individuals, even though it would no doubt be legally possible to impose them. Second, since no one is obliged to use the application, the contract is a tool of persuasion to induce the individual to join. Genuine adhesion by the end user is thus necessary; and this is undoubtedly why COVI chose the media approach. The contract is therefore not a complex text designed to protect the institution running the app; it becomes what it should be: a communication tool to set out the future of a relationship.
Right to explanation. Except that this adhesion of end users implies the ability to explain to them how the apps work. Vaguely introduced in the GDPR (Recital 71), this principle of explanation should no doubt be strengthened and introduced into the texts of Canadian laws. Notably, the federal Privacy Commissioner has intended to promote this principle (proposal 4). The very recent Quebec privacy bill (June 2020), despite the interesting avenues it envisages, does not seem to develop such a principle.
3 – The importance of the institutional public framework
Control agency. From a more legal perspective on these health-control applications, a fundamental question concerns the role of the State. Coincidentally or not, the Government of Quebec has just introduced a bill to amend its privacy laws. Among its key provisions is a clear willingness to strengthen the responsibility of stakeholders in the area of privacy (increased and reinforced obligations, more demanding consent, condemnation of certain behaviors, a drastic increase in sanctions, etc.). That said, little is said about the control bodies, in this case the Commission d’accès à l’information (CAI). Yet with regard to the monitoring of these tracing apps, what has emerged above all is the weak institutional capacity to react quickly. After years of state disinvestment, there is very little external scrutiny in Quebec able to play the role of “institutional watchdog” and validate or invalidate such apps. More precisely, the CAI may react, but only through its judicial power, which will take … about two years. Comparing this with the Quebec system, one can only admire the responsiveness of the CNIL in France, which, in less than a month, produced, with respect to a French tracing application (StopCovid), a first deliberation (April 26, 2020), another on the applicable decree (May 12, 2020), and a third on the implementation (May 25, 2020). StopCovid could therefore be launched with the documented and public validation of an agency with the requisite legitimacy. Obviously, this external viewpoint is a source of greater credibility, as the WHO recently pointed out in a note on the subject (28 May 2020).
Question of money. If we really want to ensure responsible artificial intelligence, we will have to pay for it. The governance model these apps reveal, where everything has to be built, must include a financial framework that integrates the costs associated with these controls. Internal control, of course, but this is less of a problem as long as it is already borne by the entity using the data. But also external control, which is indeed probably more reliable. In Canada, one cannot but notice the lack of resources of both the CAI and the Federal Commissioner’s Office. The CNIL in France also complains about its lack of resources. Among the available models, I was particularly convinced by the British system, where the control body, the ICO, is financed at nearly 85% by a “tax” (the Data Protection Fee), provided for in English law, which varies according to the “consumption” of data used and the size of the entities.
4 – Control of internal documentation
If there was any doubt, control over the activities of such tracing apps necessarily requires the implementation of internal procedures that specify how things are to be done. Moreover, in the last four months, substantive principles have been identified by various forums (WHO – May 28, 2020) (Joint Statement of Canadian Privacy Commissioners – May 07, 2020): security (centralized or decentralized data; geolocation or Bluetooth, etc.), proportionality, time limitation, consent, purpose. These principles are uniform and well known. Certainly, this internal-documentation solution is a universal one, found in the fields of computer security, financial security, the environment, etc. All technical fields have gone through it since the auditing phenomenon became widespread around the 1980s. Auditing becomes the “new ritual”, to use Michael Power’s expression. The problem is that, unfortunately, the link between these principles and the law is still very weak. The laws, both federal and provincial, remain far too “vague” to recognize the relevance of these principles. Here again, the most recent Quebec legislation helps to fill some of this legislative silence by clarifying the scope of these documentary obligations. It is therefore interesting to see that the law, which is national in nature, is becoming culturally “colored” under the influence of the GDPR, which four years earlier had also strengthened the documentary obligations of data holders.
In the face of the urgency of the current situation, tracing apps have been developed that may make it possible to better control both the easing of lockdown and the development of the pandemic. From a legal point of view, this extraordinary situation has shown, if any demonstration were needed, the inadequacy of both the legal and the institutional framework for managing this situation in Canada.
This blog was inspired by a reaction we published (in French) in early June following the Government of Canada’s decision not to recommend the COVI app. A more substantial text will also be published on the subject in the journal Lex electronica (in French).
Vincent Gautrais (Director of the CRDP – LR Wilson Chair in eCommerce Law – Faculty of Law – Université de Montréal)
For more information on the context of this e-conference and the other papers, see here.
Don’t miss the next paper tomorrow at 12 p.m. (GMT+1), Covid-19 and Data Protection in Japan, by Hiroshi Miyashita