Digital Contact Tracing in India: A Failure of Democratic Science and Technology Policy

As the world reeled from the impact of the COVID-19 pandemic, in early April 2020, Indian Prime Minister Narendra Modi urged the people of India to download and use the ‘Aarogya Setu’ smartphone application. Translating loosely as ‘bridge to health’, the application was one element of an arsenal of digital surveillance measures adopted to assist the government’s response to the pandemic, which has also included drone surveillance, facial recognition technology and call data analysis, among other measures.

The manner in which digital surveillance infrastructure has been crafted and deployed in India’s pandemic response bears careful interrogation. Aarogya Setu, in particular, exemplifies not only the failure of technology policy and regulation in India to consider the human and constitutional rights implications of digital surveillance, but also, more broadly, the failure of democratic engagement in science and technology governance and the resulting uncritical validation and deployment of experimental digital technologies.

Bridge to Nowhere: Aarogya Setu and ‘Techno-Solutionism’ in India

Aarogya Setu has been billed as a ‘digital contact tracing app’ – a term which broadly describes a set of communication protocols that estimate the risk of COVID-19 infection or exposure using technology standards normally found in contemporary smartphones, namely Bluetooth proximity sensing and GPS location tracking.
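To make the general mechanism concrete, the sketch below illustrates, in simplified Python, how a Bluetooth-based proximity exchange of the kind such apps rely on might work: each device periodically broadcasts a rotating identifier and records the identifiers (and signal strength) of nearby devices, so that an exposure can later be flagged if a recorded identifier belongs to a user who tests positive. This is a hypothetical sketch of the general class of protocols, not a description of Aarogya Setu’s actual implementation, which has not been publicly documented; all names and thresholds (such as `generate_rotating_id` and the signal-strength cut-off) are illustrative assumptions.

```python
# Illustrative sketch of a generic Bluetooth-proximity contact-tracing exchange.
# This is NOT Aarogya Setu's actual protocol (which is not publicly documented);
# all names and thresholds here are hypothetical.

import hashlib
import os
import time
from dataclasses import dataclass, field


def generate_rotating_id(secret_key: bytes, epoch: int) -> str:
    """Derive a short-lived broadcast identifier from a device secret and a time epoch."""
    return hashlib.sha256(secret_key + epoch.to_bytes(8, "big")).hexdigest()[:16]


@dataclass
class ContactLog:
    """Stores identifiers observed over Bluetooth, with timestamp and signal strength."""
    observations: list = field(default_factory=list)  # (observed_id, timestamp, rssi)

    def record(self, observed_id: str, rssi: int) -> None:
        self.observations.append((observed_id, time.time(), rssi))

    def flag_exposures(self, infected_ids: set, rssi_threshold: int = -70) -> list:
        """Return observations close enough (by signal strength) to an infected user's IDs."""
        return [
            (oid, ts)
            for oid, ts, rssi in self.observations
            if oid in infected_ids and rssi >= rssi_threshold
        ]


# Example: a device derives its current broadcast ID and logs a nearby device.
secret = os.urandom(32)
my_id = generate_rotating_id(secret, epoch=int(time.time()) // 900)  # rotate every 15 min
log = ContactLog()
log.record(observed_id="a3f1c2d4e5b6a7c8", rssi=-60)  # hypothetical nearby broadcast
```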

The information collected by Aarogya Setu includes demographic information (name, age, address), health information (COVID-19 test status), location information, and information about recent physical associations. Aarogya Setu has been promoted as a key element of the Government of India’s response to the COVID-19 pandemic, yet its precise function and use have always remained nebulous. The two documents that detail how this information may be managed – the Aarogya Setu Privacy Policy and an executive order called the ‘Aarogya Setu Data Access and Knowledge Sharing Protocol’ – state only that information collected through the system will be used for ‘management of COVID-19 in the country’ or for ‘framing appropriate health responses’.

However, the Government has not released any public documentation detailing the app’s intentions or goals, metrics for evaluating its functioning, the workings of the information ecosystem within which it operates, or how it will interact with the country’s otherwise ‘data sparse’ primary healthcare infrastructure. Further, there is little clarity for the public on how they are expected to participate in the system without access to a smartphone, or how they should respond to the various health-related messages the system generates.

The opacity of its use has been compounded by the manner in which it was developed and deployed – through ‘voluntary’ engagements with private companies and individuals, without following norms of transparent procurement, and without trialling the system prior to its deployment. The lack of clarity on its purpose has led to personal information collected by the app being used for tele-medicine, as well as potentially for building a ‘National Health Stack’ of digital health infrastructure.

The messaging by the Government of India on Aarogya Setu has been effective by one metric – the app has been downloaded more than 140 million times. While the absolute number of ‘downloads’ is high, it represents only about one-tenth of India’s population, and the system is inherently exclusionary: only about 500 million people even have access to mobile internet, and there is no clarity on how the excluded population is expected to participate in this technological ‘management’ of COVID-19. There has also been a concerning lack of critical evaluation of the app’s efficacy or utility, and in particular a failure to engage with the context of its use within a severely under-resourced public health system and a low technology-penetration, low digital-literacy environment. Ultimately, the Government of India’s rollout of Aarogya Setu comes across as an example of ‘techno-solutionism’ or technology theatre – the side-lining of important and critical conversations about holistic approaches to the public health crisis by focusing public resources on, and privileging the (often dubious) claims of, technologies, without examining the wider social and political context in which they are deployed.

Aarogya Setu’s Many Legal and Constitutional Dilemmas

Aarogya Setu exists in a legal and regulatory vacuum, with no specific legislation or precedent to guide its implementation or limits, and with the government resorting to departmental orders of dubious legality. The absence of due legal consideration of digital surveillance by public agencies in general, and of Aarogya Setu in particular, has given rise to a number of concerns – ranging from arbitrary and unchecked executive power, to data protection, to discrimination and exclusion from access to basic government services. Much of this stems from the government’s failure to consider the impact of technology on the constitutional rights and interests of Indians.

India does not have a data protection law which provides clear and adequate guidance to public authorities engaged in data collection, processing or use. However, the Right to Privacy is protected in India as a fundamental constitutional right, which includes an individual’s right of self-determination over their personal information. An infringement of the Right to Privacy would be balanced against the legitimate claims of the government by applying a test of proportionality and necessity to the rights infringement.

Aarogya Setu’s implementation poses an unresolved dilemma for Indian privacy jurisprudence: can consent be claimed as a legitimate basis to foreclose a review of the proportionality and necessity requirements for infringements of the right to privacy? At first glance, it may seem strange not to allow individuals to ‘opt out’ of claims over their right to privacy, and to instead insist on a higher standard for the collection, processing and use of personal information. However, claims of consent to participation in Aarogya Setu (which is provided through a clickwrap agreement) must be seen in the broader context of its implementation.

At least two contextual elements are central to analysing claims of voluntariness and consent vis-à-vis the right to privacy within Aarogya Setu’s information ecosystem. First, there is an inherently unbalanced power dynamic between the government and the individual, which must be taken into account when assessing voluntariness. The EU’s data protection law, the General Data Protection Regulation (GDPR), recognises that processing activities undertaken by public agencies must rest on a legal basis other than consent alone – a position echoed by the European Data Protection Board’s guidelines on contact tracing apps. Second, the context of a public health emergency, the manner in which the app’s efficacy has been communicated, and the coercive means by which consent has been obtained must all be kept in mind. In such a scenario, where the consequences of using a digital health surveillance technology are not made clear and individuals are not provided reasonable alternatives for participating in the public healthcare system or obtaining access to goods and services, a clickwrap agreement and one-off consent are insufficient.

A second constitutional concern with the manner in which Aarogya Setu has been implemented has been its impact on the right to equality and non-discrimination under Indian law. This is linked both with the manner in which the regulations framed around Aarogya Setu have facilitated technology-based exclusion, as well as how information gathered from the app has been used in government policy and decision-making.

As noted above, Aarogya Setu was (and in some cases continues to be) mandated for access to important public goods and services, including, among other things, travel and access to employment, without detailing exceptions for those unable to access the technologies necessary for using the app – namely, a stable internet connection, a power source and a smartphone. Various Government Orders, for example, required all government and private sector employees to download and use the app, and the government departments in charge of railways and airlines similarly issued strictures making public transport accessible only to those who had downloaded it. There have also been reported cases of individuals being denied access to private services like banks and pharmacies for not having Aarogya Setu, and the app has been made a requirement for employees of many companies, particularly in food delivery and logistics. Prima facie, this creates an unreasonable classification between those with the means and ability to access these technologies and those without.

Article 14 of the Constitution of India prohibits such unreasonable classification created by government policy, and such mandates could therefore be deemed unconstitutional. In the case of private mandates for the use of Aarogya Setu (such as by employers or owners of shops and establishments), however, there is little statutory recourse against technology-enabled discrimination.

A related area of concern is the manner in which information gathered from Aarogya Setu is used in government decision-making, both at the level of the individual and at the level of government policy. The Aarogya Setu app uses an algorithmic logic to assign a ‘health code’ to each individual. A health code determination triggers many administrative decisions, including, as mentioned earlier, the decision to allow an individual to use specific modes of travel. Information collected from the app has also played a part in determining quarantine zones and restrictions in urban clusters in India, although the precise manner in which this has occurred has not been revealed. Both these forms of information processing reveal a lack of the transparency, accountability and due process standards expected of government decision-making. In the former, individuals receive no explanation or justification for why they have been assigned a particular code, even though the determination has consequential impacts on their freedom of movement or association. In the latter, there is no clarity on how this ‘big data’-based policy-making has accounted for the assumptions, biases and limitations of the information collected from the app. Given the disparate nature of access to Aarogya Setu, its data will be similarly skewed, making certain populations hyper-visible while potentially “invisibilising” others.
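By way of illustration only – the actual scoring logic has never been published – a ‘health code’ determination of this kind could, in principle, reduce to a simple rule-based classifier over self-reported symptoms and logged contacts, as in the hypothetical sketch below. The rules, field names and thresholds are invented for the example; the point is that even a rule set this simple, if undisclosed, leaves affected individuals with no way to understand or contest the outcome.

```python
# Hypothetical illustration of a rule-based 'health code' classifier.
# The actual Aarogya Setu scoring logic is not public; these rules and
# thresholds are invented purely to show why opacity matters.

from dataclasses import dataclass


@dataclass
class UserData:
    tested_positive: bool
    symptom_count: int      # self-reported symptoms
    flagged_contacts: int   # proximity events with users who later tested positive


def health_code(user: UserData) -> str:
    """Assign a colour-coded risk status from the collected data."""
    if user.tested_positive:
        return "red"       # confirmed case
    if user.flagged_contacts > 0 or user.symptom_count >= 3:
        return "orange"    # elevated risk: may trigger travel or workplace restrictions
    return "green"         # low risk


# A single flagged contact is enough to change a person's status - and with it,
# potentially their access to transport or employment - without any explanation
# being provided to them.
print(health_code(UserData(tested_positive=False, symptom_count=0, flagged_contacts=1)))  # "orange"
```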

Such information use can hide biased and discriminatory policy-making – for example, failing to provide healthcare resources to a particular community because of its ‘invisibility’ in Aarogya Setu’s data; or it can simply be arbitrary and unreasonable – for example, producing an adverse ‘health code’ decision due to a technical fault in the data processing software or algorithm. In either case, the Government’s information processing activity requires greater constitutional scrutiny.

Conclusion

Nearly six months and more than a hundred million ‘downloads’ later, the ethical, legal and constitutional dilemmas of India’s ‘digital contact tracing’ infrastructure have come no closer to resolution. An ongoing petition in the Karnataka High Court has challenged the Government of India’s use of Aarogya Setu on some of the grounds outlined above. Whatever the outcome of that judicial challenge, the experiment has left behind an unfortunate legacy of undemocratic policy-making in matters of science and technology – from the government’s communication of scientific and technological policy and tools, to its unbounded discretion and lack of introspection about the limits of Big Data and algorithmic decision-making systems in health administration.

Divij Joshi is a lawyer and researcher from India working on technology, policy and law. He is presently a Mozilla Tech Policy Fellow working on automated decision-making in India. He also edits the SpicyIP Blog.
