Introduction
Coronavirus tracking apps have been the centre of attention. Initially seen by many governments as a central plank in the plan to contain the spread of the virus, they also gave rise to many privacy, data protection and security concerns. The approach adopted in many countries, including – eventually – the UK, was based on a decentralised model using Bluetooth beacons on smartphones that record interactions rather than locations. While much critical attention has been given to ensuring that these apps do not constitute too significant an intrusion into privacy, specifically through technical limits on the gathering of location data, less attention has been given to the fact that smartphones may be used in two other ways to track users in an attempt to limit the virus: through GPS data and through location data generated from cell-site records. It does not seem to have been considered whether the handing over of those sorts of data to public authorities triggers the surveillance powers regime: the Investigatory Powers Act 2016 (IPA). Accessing data from smartphones is not the only way individuals’ locations may be tracked, however. There is increasing use of other technologies incorporating surveillance cameras, for example CCTV or drones, especially those using automated facial recognition technology (AFR). Within the UK, these have not received much attention and, arguably, their use is less coherently regulated.
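For readers unfamiliar with the decentralised model, the following minimal sketch (in Python, with invented names and intervals – the real Google/Apple specification differs in detail) illustrates the core idea: phones exchange short-lived random identifiers over Bluetooth and store only the fact of an encounter, never a location.

```python
import os
import time

# Hypothetical sketch of the decentralised exposure-notification model:
# each phone broadcasts a short-lived random identifier and records the
# identifiers it hears, never any location. Names and the rotation
# interval are illustrative, not the actual Google/Apple specification.

ROTATION_SECONDS = 15 * 60  # identifiers rotate frequently to frustrate tracking

def new_ephemeral_id() -> bytes:
    """A fresh random identifier, unlinkable to the device or its owner."""
    return os.urandom(16)

class Phone:
    def __init__(self):
        self.current_id = new_ephemeral_id()
        self.rotated_at = time.time()
        self.heard = []  # (identifier, timestamp) pairs, kept on-device

    def broadcast(self) -> bytes:
        # Rotate the identifier periodically so observers cannot link beacons
        if time.time() - self.rotated_at > ROTATION_SECONDS:
            self.current_id = new_ephemeral_id()
            self.rotated_at = time.time()
        return self.current_id

    def record_contact(self, identifier: bytes):
        # Only the fact of proximity is stored - no GPS, no cell site
        self.heard.append((identifier, time.time()))

alice, bob = Phone(), Phone()
bob.record_contact(alice.broadcast())  # Bob's phone logs an encounter
print(len(bob.heard), "contact(s) recorded, zero locations stored")
```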
Mobile Phones
While the virus app model that Google and Apple developed prohibited those apps from accessing GPS data, many other apps have access to GPS data (though it is questionable how many users know that). Indeed, Google was able to generate ‘Community Mobility Reports’ showing how visits to, and the length of stay at, different types of places changed against a baseline derived from data the company held from earlier in the year. Such reports allow one to check, for example, whether people in a given area are complying with restrictions on movement at a general level, though it is arguable that more granular reports could be produced.
Additionally, as a necessary part of making a mobile phone work, the network generates cell-site data as the handset pings the cell site closest to it. This produces location data held by the mobile phone company. Such location data – as with other metadata – allow detailed pictures of individuals to be built up, based on where they were, with whom and for how long; they can allow real-time tracking. While this may be relevant for contact tracing, both European courts (the Court of Justice of the EU and the European Court of Human Rights) have recognised that the collection and analysis of such data can constitute a significant intrusion into users’ privacy. Of course, location data could also be used to generate the same information as the Community Mobility Reports – that is, to show the density of people at particular places.
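Purely by way of illustration – the field names and pings below are invented – the following sketch shows how raw cell-site records could be aggregated into a density report akin to the Community Mobility Reports, counting distinct devices per cell per hour rather than following any individual.

```python
from collections import Counter

# Illustrative only: reducing coarse cell-site records to an aggregate
# density report. The fields and the pings are invented for the example.

pings = [
    {"device": "a", "cell": "tower-17", "hour": 9},
    {"device": "b", "cell": "tower-17", "hour": 9},
    {"device": "a", "cell": "tower-32", "hour": 14},
]

def density_report(pings):
    """Count distinct devices per (cell, hour) - aggregate, not individual."""
    seen = {(p["device"], p["cell"], p["hour"]) for p in pings}
    return Counter((cell, hour) for _, cell, hour in seen)

print(density_report(pings))
# Counter({('tower-17', 9): 2, ('tower-32', 14): 1})
```

The point of the design choice is that only the counts leave the operator; whether that is what actually happens, or whether raw device-level data are shared, is precisely the question raised below.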
Data Protection Framework
The use of GPS data and cell-site data from mobile phones could be analysed within the framework of the General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA). Additionally, data “processed in an electronic communications network or by an electronic communications service indicating the geographical position of the terminal equipment of a user of a public electronic communications service” are location data covered by the Privacy and Electronic Communications Regulations 2003 (implementing the e-Privacy Directive). The initial response of the Information Commissioner’s Office (ICO) was that the use of communications data would be acceptable because, being anonymised, it would fall outside the DPA/GDPR, so that questions about lawful and fair processing do not arise. A similar response could be made with regard to the e-Privacy rules: location data may, under those rules, only be processed in limited circumstances – with consent or when properly anonymised (save to the extent that a derogation under Article 15 exists).
As a statement of law, this is true: anonymised data are not personal data. It raises the question, however, whether the data supplied really would be anonymous; it is notoriously difficult to ensure that data cannot be re-identified (four random spatio-temporal data points reportedly suffice), especially if the data set reveals where a device’s home location probably is, and especially if you have access to other datasets. So the answer depends on whether the mobile operators would do the analysis themselves and hand over reports akin to the Community Mobility Reports – focussing essentially on public spaces – or whether the raw data (which could allow individual devices to be differentiated) were to be shared. The question is significant because the answer appears to make the difference between data protection controls applying and there being no control over the use of the data.
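A toy example makes the re-identification point concrete. In the sketch below (Python, with invented traces), a ‘pseudonymised’ dataset of place–time points is queried with just two facts known about a target – already enough, here, to single out one device.

```python
# Toy illustration of re-identification: in a "pseudonymised" set of
# location traces, a handful of known place-time points can single out
# one device. All traces below are invented.

traces = {
    "device-1": {("cafe", 8), ("office", 9), ("gym", 18)},
    "device-2": {("cafe", 8), ("school", 9), ("park", 18)},
    "device-3": {("home", 8), ("office", 9), ("gym", 18)},
}

# An adversary knows the target was at the cafe at 8 and the office at 9
known_points = {("cafe", 8), ("office", 9)}

# Subset test: which pseudonymous traces contain every known point?
matches = [d for d, trace in traces.items() if known_points <= trace]
print(matches)  # ['device-1'] - two points already identify the device
```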
Investigatory Powers Act
The IPA already provides a mechanism whereby public authorities may analyse communications data. The IPA identifies the authorities that may exercise powers under Part 3 to obtain communications data (listed in Schedule 4 and including the police and the security and intelligence services, but also the Department of Health and the ambulance services), the statutory purposes for which those data may be obtained, and the types of data that may be obtained – and both of the latter may vary by authority. Communications data are defined incredibly broadly (anything other than content, but including content that can be inferred from the fact of communication – see s. 261 IPA) and would certainly include location data. Internet connection records (while subject to special safeguards) are also caught; in some instances this metadata may be revealing of the content of the pages visited (which may in turn suggest when users are affected by coronavirus). The list of purposes for which communications data may be obtained (in s. 60A(7) IPA) is long but includes: (d) the interests of public safety; (e) the purpose of protecting public health; and (g) the purpose of preventing death or injury or any damage to a person’s physical or mental health, or of mitigating any injury or damage to a person’s physical or mental health. Given that violations of Covid restrictions are also criminal offences, the prevention and detection of crime justifications are also relevant. Following the Watson/Tele2 ruling of the ECJ, the rules were changed so that (except in urgent cases) obtaining the data is subject to authorisation by a Judicial Commissioner, who must verify whether access to the data is proportionate. While there are codes providing further detail as to how these powers are to be exercised, the types of analysis that may be carried out on any such data are not subject to review. Nor are there public data on whether these powers have been used and, if so, how proportionality might be understood in practice. This does mean, however, that under the IPA questions of anonymisation and data protection are not determinative.
Surveillance Cameras
CCTV is remarkably common in the UK; even in the noughties the then Information Commissioner, Richard Thomas, warned that the UK was sleepwalking into a surveillance society. Since then, the number of surveillance cameras has increased, and their capabilities have grown from automated number plate recognition (ANPR) to automated facial recognition (AFR); while the majority of these cameras are static, some are mobile. This trend has accelerated with the development of body-worn video (BWV) and drones (or unmanned aerial vehicles), which have the capacity further to blur the boundary between public and private spaces. While these systems may be used to identify who was at a particular place at a given time, they can also be used to establish who else was there, as well as to track individuals’ movements. The possible use of drones to identify people with symptoms of Coronavirus goes further still; how the law regulates such technology lies outside the scope of this blog.
In the context of the pandemic, the use by the police of drones to enforce lockdown rules has been facilitated so as to allow police officers to fly drones at a higher altitude and closer to members of the public than normal regulations allow. Some police forces were using drones to remind people of social distancing rules. Some have gone further: Derbyshire Police was reported by the media to have used drones to film people in the Peak District – although the rules at that time permitted exercise as a reason to leave the house and the legislation imposed no limits on how far a person could travel to do so – and released the video footage via Twitter.
Somewhat surprisingly, the use by the police of surveillance cameras of any sort for overt operations is not covered by any specific legislation (covert surveillance is covered by the Regulation of Investigatory Powers Act 2000). Instead, as became apparent in the Catt litigation (concerning the taking of photographs of a peaceful protester and the keeping of them on file in a domestic extremists database) and in the Bridges case (concerning the deployment of AFR), the legal basis is a mixture of common law, general powers, data protection rules and codes of varying degrees of specificity. Under the Protection of Freedoms Act 2012 (PoFA), a Surveillance Camera Commissioner was established to encourage compliance with the Surveillance Camera Code (prepared by the Secretary of State under PoFA) (SC Code); certain authorities specified in PoFA (notably the police) must have regard to the code. The SC Code does not, as recognised in Bridges, provide full guidance for all types of surveillance – notably AFR and drones – though the general principles set out in the code apply to all such technologies.
The European Court of Human Rights found in Catt that the UK government had infringed Article 8 ECHR. Since the British system failed on grounds of proportionality (there being no time limit in domestic rules on how long the file on Catt could be kept), the Court did not rule on the lawfulness, in an Article 8(2) sense, of a mishmash of broad common law powers and codes. This does not mean, however – as seems to have been accepted in Bridges – that the judgment was a ringing endorsement of the lawfulness of the regime. The Court noted that ‘[i]t is of concern that the collection of data for the purposes of the database did not have a clearer and more coherent legal base’ (para 99; see also para 105), highlighting the ambiguity in the scope of the database in question. Despite the confidence of the Court of Appeal in Bridges that this patchwork of codes, legislation and common law is in principle capable of satisfying the lawfulness test, in the light of Catt there may be questions as to whether the current legal basis is sufficient.
In any event, the use of drones may run into the same problem that the Court of Appeal found with the use of AFR: the lack of constraints in the SC Code (despite the requests of the Surveillance Camera Commissioner that it be updated), specifically as regards the choice of when, why and where to deploy the technologies (a point also made by the ICO but dismissed by the court at first instance). This suggests that specific guidance on the deployment of such technologies is needed – and perhaps even guidance specific to the context of the pandemic.
The Court of Appeal found, on the facts of the case, that the intrusion for those individuals of no interest to the police but who were caught by the AFR was minimal, and that this did not change no matter how many such people were subject to surveillance. It is perhaps significant for this conclusion that the software automatically and immediately deleted the data of those whom it deemed not to match the images on the police watchlist. In terms of jurisprudence it is also significant that, in coming to this conclusion, the Court of Appeal distinguished the specific facts in Bridges from those in Watson/Tele2 and in S and Marper v UK (which concerned the retention of other forms of biometric data, and in which the ECtHR also found a violation of Article 8), and did not pay particular attention to Catt. It is, however, a worrying conclusion to be left with: that mass surveillance in a public place – especially in connection with the policing of minor offences – was not disproportionate. This line of reasoning could presumably be applied to monitoring compliance with pandemic restrictions.
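The delete-on-no-match behaviour that weighed with the Court can be pictured in a few lines. The sketch below is hypothetical – real AFR systems compare biometric templates using similarity scores, not strings – but it captures the design the Court of Appeal found significant: anything failing the watchlist comparison is discarded immediately.

```python
# Hypothetical sketch of delete-on-no-match AFR processing, as described
# in Bridges. Faces, the watchlist, and the similarity function are all
# invented stand-ins for real biometric templates and matching.

WATCHLIST = {"suspect-template-123"}
MATCH_THRESHOLD = 0.9

def similarity(template_a: str, template_b: str) -> float:
    """Placeholder for a biometric comparison - here, just exact match."""
    return 1.0 if template_a == template_b else 0.0

def process_frame(captured_templates):
    retained = []
    for template in captured_templates:
        if any(similarity(template, w) >= MATCH_THRESHOLD for w in WATCHLIST):
            retained.append(template)  # match: flagged for a human officer
        # no match: the biometric data is discarded immediately, which is
        # the feature the Court treated as limiting the intrusion
    return retained

print(process_frame(["passer-by-1", "suspect-template-123", "passer-by-2"]))
# ['suspect-template-123'] - passers-by leave no trace in this design
```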
Against this background, what of the particular use of drone footage by Derbyshire Police? The police put the footage up on Twitter (where it has had over 2 million views). In doing so, the police essentially lost control of where that data goes. If the individuals in the footage are identifiable, then the Data Protection Act applies, and the police’s actions are likely incompatible with their obligations under it.
It is also questionable whether there is a violation of Article 8. While there is nothing inherently private about going for a walk in a national park, the police did record that activity (triggering concerns about State retention of data). There are also overtones of the Peck case – although the content recorded in that case was much more sensitive. There, CCTV footage resulted in images of Peck being published and broadcast widely by mainstream media; the case ended up before the ECtHR, which found a violation of Article 8 ECHR. However, in Re an application by JR 38 for Judicial Review, a decision which pre-dated the ECtHR ruling in Catt, the UK Supreme Court held that the publication in two newspapers of CCTV images of the appellant in the course of rioting, as part of a police campaign to identify those involved and to deter future sectarian disturbances, did not constitute a violation of privacy rights. The criminal context there was more serious (extensive sectarian rioting involving 46 incidents in which 75 young people committed over 100 offences over four months), and the publication of the CCTV footage was an action of last resort. It is hard to say that socially-distanced dog-walking in a national park constitutes the same sort of threat to public order. Yet in JR 38 the majority of their Lordships held that Article 8 did not apply at all, focussing on the criminal nature of the conduct; how far that reasoning extends, and its relevance to a public health emergency, is unclear. There is a risk, however, of a gap in domestic protections.
Conclusions
The test and trace apps raise concerns about privacy – but the main point of concern there relates to the linkage to our medical records. If we are more broadly concerned about the ability to move about freely without being tracked and traced, then, while COVID-19 responses bring this issue into sharp relief, there are other regimes that allow mass surveillance – though at least some allow for oversight of the use of those powers. The problem otherwise, as Judge Koskelo emphasised in her concurring opinion in Catt, is that: “[t]he present case is […] essentially an individual manifestation of the consequences arising from shortcomings in the underlying legal framework”. These technologies have been rolled out without – so far – much comment, and certainly without the consideration that might be possible through the enactment of a legal framework, a framework which – it is to be hoped – would control their misuse.
Lorna Woods OBE is professor of internet law at the University of Essex. She has researched the use of communications networks as surveillance tools. She is a member of an advisory group to the Surveillance Camera Commissioner and also sits on the ANPR National Users Group, supporting the police use of this technology.