This paper is part of the e-conference on « Data Protection Issues and Covid-19: Comparative Perspectives », which consists of a daily publication at 12 p.m. (GMT+1), except on Sundays, until the summer break. A new session will start at the beginning of the academic year 2020-21. Please subscribe to blogdroiteuropeen so you don’t miss a publication. This e-conference was organised by Dr. Yseult Marique, Senior Lecturer at the University of Essex and FÖV Speyer, and Dr. Olivia Tambou, Associate Professor at the University of Paris-Dauphine, External Scientific Fellow at the Max Planck Institute of Luxembourg, and Founder-Editor of Blogdroiteuropeen. If you are interested in contributing to our September session, feel free to contact us at firstname.lastname@example.org
This second post illustrates the classical tensions between security and freedoms in the unprecedented context of the enforcement of, and exit from, the lockdown due to the Covid-19 pandemic. More information on the context of the French health emergency can be found in our first post. This second post focuses on the analysis of two specific situations: the use of drones and the deployment of thermal cameras in relation to the Covid-19 pandemic. Both cases testify to the importance of regulation articulated with judicial control. Judges can deliver urgent interim orders when a quick response to a severe infringement of fundamental freedoms, such as data protection rights, is required. Data protection authorities and judges are therefore the « guardians of the temple » of data protection, especially in times of health emergency. While the essential role of both institutions is at the core of the European model of data protection, it often relies in practice on the vigilance and involvement of NGOs specialised in the defence of freedoms, whose actions trigger the control exercised by judges and data protection authorities.
1. The control of the use of drones by public authorities in the Covid-19 context
The French association La Quadrature du Net asked the Paris administrative court to order the immediate cessation of the use of drones by the Paris police for the enforcement of the lockdown in force from 18 March 2020 onwards. The administrative court decided that the collection of data by the drone’s camera was not personal data processing because it was not used to identify people. The police used the pictures only to get a general idea of people gathering in the streets or other public spaces; this was just one source of information for the public authorities when deciding on a physical intervention where the lockdown was not respected. On 18 May, the Conseil d’Etat, the French supreme administrative court, quashed this judgment: it stated that individuals could be identified by the on-board camera. This possibility of identification is the sole criterion for deciding whether a processing operation involves personal data, as provided by the EU harmonised definition (art. 4 GDPR/Regulation 2018/1725 and art. 3 Directive 2016/680). The mere fact that a person can be identified means that the data are personal data, even if the police did not use the camera for identification purposes. Therefore, the pure capture of images by an on-board camera with an optical zoom constitutes personal data processing, regardless of whether the controller used it for identification purposes. This case illustrates that public authorities, as well as some judges, still have difficulties understanding what personal data processing is.
Furthermore, this case reminds us that European data protection law includes not only the GDPR but also the Police Directive 2016/680, which applies to personal data processing by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties. The application of the Police Directive therefore requires taking into account the nature of the controller and the purpose of its personal data processing. In this case, the Paris police was the controller of the processing of the personal data captured by the drones’ cameras, and the purpose of the processing was to ensure public security. The Conseil d’Etat therefore clarified that Directive 2016/680 was applicable. In this situation, article 31 of the French Data Protection Act, which transposed Directive 2016/680, provides that such personal data processing has to be authorised by a ministerial decree after consultation of the CNIL (the French data protection authority). The lack of this formal requirement constitutes a severe infringement of the right to private life. It justified the interim order to immediately stop the use of drones by the Paris police. The purpose of the prior authorisation is to detail the framework of the processing, including measures safeguarding the rights and freedoms of individuals. According to article 35 of the French Data Protection Act, the decree shall include at least: the purpose of the intended processing, the service in charge of the right of access, the description of the categories of personal data collected, and the categories of recipients to whom the personal data will be disclosed. In addition, it may include derogations from the right of information for national security purposes, the limitations or restrictions of rights according to article 23 GDPR, and the description of any joint controller. The prior authorisation also provides some transparency, which enables public debate.
Opinions of the CNIL are made public. The CNIL has maintained an open data repository of these authorised processing operations since 25 May 2018, as provided by art. 36 of the French Data Protection Act.
2. The control of the use of thermal cameras in the Covid-19 context
The Ligue des droits de l’homme asked the Versailles administrative court to order the immediate removal of the thermal cameras installed by a town council at the entrance of its municipal premises and of its public schools. In its judgment, the administrative court of Versailles decided that the system was in line with the GDPR and with French law on video surveillance. With its interim order of 26 June, the Conseil d’Etat quashed the Versailles judgment and clarified the data protection regime applicable to thermal cameras. The supreme administrative court stated that the fixed thermal camera installed at the municipal premises did not involve personal data processing, for the following reasons. The camera simply offered instant information, through a colour code, to voluntary visitors about whether their temperature was above normal when they decided to cross the space where it was installed. The measured temperature had no consequence on whether the person could access the premises. The camera included no recording system and was not monitored by agents. For all these reasons, the judge considered that this system implied no identification of the person and did not constitute personal data processing. A different conclusion applied to the mobile cameras used in the public schools. Municipal agents handled these cameras at various places and times, not only at the entrance of the schools. In case of high temperature, the person was denied access to the school and asked to leave immediately or to be picked up by their parents. Even if the shape of the body itself could not identify the person, the processing of the picture by the municipal agent allowed at least an indirect identification. The Conseil d’Etat therefore decided that these mobile thermal cameras involved the processing of personal health data. Consequently, the Conseil d’Etat examined whether the processing of these sensitive data was lawful.
According to article 9 GDPR, the processing of sensitive data is permitted only in ten cases. None of them was applicable here. The town council argued in vain that the consent of the individuals could serve as a legal ground for the processing. Consent has to be freely given, which was not the case here because the children had to accept the recording of their temperature in order to attend school. Nor was the processing necessary for reasons of public interest in the area of public health on the basis of French law (art. 9 §2(i) GDPR). Furthermore, the application of these derogations from the prohibition on processing sensitive data should have been preceded by a data protection impact assessment carried out by the town council, according to art. 35 GDPR.
3. Concluding remarks
Several lessons can be drawn from these two cases. One could consider that going before a court could, in certain situations, be a quicker and even more effective avenue than lodging a complaint with a data protection authority. In both cases, the CNIL revealed after the rulings that it already had concerns about these intrusive processing operations. It was only after the judgment of the administrative court that the CNIL published a warning about illegal uses of thermal cameras on its website. After this warning, the RATP (the French body in charge of public transportation in Paris) stopped experimenting with cameras in underground stations to monitor body temperature and compliance with the requirement to wear a mask. However, the regulatory approach has to be complemented by effective access to judicial remedies against the controller and the processor. These parallel avenues are provided by arts. 79 and 80 GDPR and constitute a core element of the European model of data protection. The drone case clearly illustrates how a ruling can reinforce the effectiveness of the control exercised by a data protection authority. Right after the ruling, the CNIL revealed that an investigation into the use of drones by several public authorities (cities and the national police) had been ongoing since 23 April. It will have the opportunity to confront the practices of the controllers with the legal framework clarified by the judges. Finally, these cases illustrate the need for a European public debate on the various new technologies of identification and authentication by images, including facial recognition. Data protection authorities should play a key role in launching this debate. At the European level, the European Data Protection Board adopted Guidelines on the processing of personal data through video devices in January 2020. The CNIL has already issued a note clarifying the stakes of facial recognition.
Beyond soft law, the future European legal framework for trustworthy artificial intelligence should address this issue.
Olivia Tambou is Senior Lecturer at the University Paris-Dauphine, Founder and Editor of blogdroiteuropéen