Giving privacy a bad name | Mireille Caruana
What is of utmost importance is transparency and accountability: the detailed description and information about in-built privacy and security safeguards must be made publicly available and analysed
Mireille Caruana is a Senior Lecturer in the Department of Media, Communications and Technology Law, Faculty of Laws, University of Malta
What is all the fuss about contact tracing apps?
Contact tracing apps are a public health initiative to combat the spread of COVID-19. Users install an app on their mobile phone that broadcasts short coded bursts of gobbledygook.
When someone tests positive for COVID-19, the app can determine which other users have been in close contact with that person and send them a notification, so that they can choose to self-isolate. In short, the app traces who you have come into contact with and alerts you if any of them later test positive.
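To make those "coded bursts of gobbledygook" concrete, here is a minimal sketch, in Python rather than any app's real code, of how rotating anonymous identifiers could be generated and exchanged. It follows the general decentralised approach put forward in public proposals such as DP-3T; the key size, the 15-minute rotation window and every function name here are illustrative assumptions, not the design of Malta's or any other national app.

```python
import hashlib
import os
import time

ROTATION_SECONDS = 15 * 60  # assumption: the broadcast token changes every 15 minutes


def new_daily_key() -> bytes:
    """Generate a fresh random secret on the phone; it never has to leave the device."""
    return os.urandom(32)


def current_token(daily_key: bytes, now: float) -> bytes:
    """Derive the short, meaningless-looking token the phone broadcasts right now.

    The token is a one-way hash of the secret key and the current time window,
    so anyone overhearing it cannot link it to a person, a place, or earlier tokens.
    """
    window = int(now // ROTATION_SECONDS)
    return hashlib.sha256(daily_key + window.to_bytes(8, "big")).digest()[:16]


# Each phone also keeps a purely local record of the tokens it has overheard nearby.
heard_tokens: set[bytes] = set()


def on_token_overheard(token: bytes) -> None:
    """Store a token received over Bluetooth, with no name or location attached."""
    heard_tokens.add(token)


my_key = new_daily_key()
broadcast_now = current_token(my_key, time.time())  # what the phone would transmit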
The system is said to be feasible only if a large percentage of the population downloads the app.
Achieving that will require an unprecedented level of trust in governmental authorities: users who download the app must be confident that they are not handing the State the power of 24-hour monitoring and surveillance without protection of their fundamental rights.
In recent weeks, Malta and other EU member states have been grilled by privacy advocates, who have expressed concern about the compatibility of such apps with the human rights to privacy and data protection.
There are also concerns that these apps fail to comply with the European Union’s General Data Protection Regulation (GDPR).
At the core of the debate is whether the interference with our human rights is justified, and whether that interference is proportionate to the aims of states faced with an unprecedented public health crisis.
Understandably, governments are looking to technology to help fight the battle against COVID-19, and contact tracing has emerged as a tool that may help them do so.
The question is: Do we design for privacy? Or do we design in a way that maximises the efficiency of tracking? The latter rightly raises serious questions about government surveillance. So how the apps are designed really matters to how much privacy we give up.
How privacy-invasive a contact tracing app is depends very much on its in-built features, the system underlying the technology, and how it is actually deployed. But whatever the design, certain legal tests must be satisfied in order to comply with the GDPR.
The privacy impact assessments for contact tracing applications must entail a case-by-case analysis: does the app collect health data that is adequate, relevant and limited to what is necessary in relation to the purpose of combating the spread of the disease? Is the collected data only to be used for the specific purpose of contact tracing? What rules have been implemented to ensure erasure of data within specified time-frames? Will the data be processed and stored only within the European Union? What privacy and data protection safeguards have been implemented? Are privacy and data protection ‘by design’ principles built into the ‘code’ of the app? For example, is the data stored locally, on users’ phones? Or is it stored centrally by some government agency?
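Those design questions, particularly whether matching happens on the phone or on a central server, can also be illustrated in code. Continuing the sketch above, and with the same caveat that this is an illustrative assumption rather than any deployed app's logic, a decentralised, "privacy by design" app publishes only the keys of users who test positive, and every other phone does the comparison locally:

```python
DAY_SECONDS = 24 * 60 * 60


def tokens_for_day(daily_key: bytes, day_start: float) -> set[bytes]:
    """Re-derive every token a published daily key would have produced during one day."""
    return {
        current_token(daily_key, day_start + i * ROTATION_SECONDS)
        for i in range(DAY_SECONDS // ROTATION_SECONDS)
    }


def exposed(published_keys: list[tuple[bytes, float]], heard: set[bytes]) -> bool:
    """Decentralised matching: the comparison runs on the user's own phone, against the
    tokens it overheard, so the record of who met whom never leaves the device.
    A centralised design would instead upload `heard` to a government server and do the
    matching there, which is exactly the trade-off the questions above are meant to probe."""
    return any(tokens_for_day(key, day_start) & heard for key, day_start in published_keys)
```

If `exposed` returns true, the app notifies the user, who can then choose to self-isolate; nothing about their contacts has been reported to anyone else.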
There is still a lot to learn about this virus and how it spreads.
Yet contact tracing apps could be a vital tool in keeping the right people at home as governments look to restart their economies.
Deployment of these tools must therefore be undertaken quickly, without all the evidence that would normally be required to justify their intrusion into our privacy.
Privacy advocates would be unreasonable to insist that an app not be deployed pending conclusive evidence of its efficacy in containing the virus.
The UN Special Rapporteur on the Right to Privacy, Joseph Cannataci, is reported by MaltaToday as stating:
“There are many confounding variables which can distort analysis, but what is going to happen if, in a year or two, we would realise that the introduction of such apps, or indeed even more invasive contact tracing using mobile phone data, did not significantly arrest the impact of COVID-19?”
But what if it did significantly arrest the impact of COVID-19? Most people would conclude that the interference with our right to privacy would have been justified. While the ringing of a precautionary bell may well be expected, it would be wrong to assume that all contact tracing apps are a disproportionate or an unjustified interference with our fundamental right to privacy.
What is of utmost importance is transparency and accountability: the detailed description and information about in-built privacy and security safeguards must be made publicly available and analysed.
The source code should be made available, and the deployment of the app should be placed on a clear legal basis. In this manner, the safeguards enacted in law and coded into the contact tracing system would be known to the public, who would then have reasonable expectations about how, and for what purposes, the data processing will be carried out, including its impact upon their rights to privacy and data protection.
It is essential that the privacy and data protection impact assessment be made public, as this would support community confidence in the app. The GDPR requires a data protection impact assessment wherever the processing of personal data is likely to pose a high risk to the rights and freedoms of natural persons; large-scale processing of health data, which a contact tracing app would certainly involve, falls squarely within that category.
Government would do well to heed advice and guidance as it considers the privacy issues through the data protection impact assessment.
The Information and Data Protection Commissioner’s Office should have an enhanced role watching the implementation closely, auditing the system, and investigating any complaints from the public.
If we let privacy stand in the way of effectively controlling a pandemic, we would be doing nothing more than giving privacy a bad name.
Yes, crises like the 9/11 terror attacks or the London Bombings have sparked surveillance legislation in the past.
But the fundamental rights to privacy and data protection are not absolute. We give up some of our privacy to feel safe and secure all the time. You agree to constant monitoring on the London Underground if it means you are safe.
This poses challenging questions about trade-offs. Governments are tasked with ensuring fundamental rights while promoting other countervailing public interests like public health and, ultimately, the right to life.
The deployment of contact tracing apps will require an unprecedented act of trust by users.
Protocols should be followed to ensure fundamental rights are protected, and legislation should be enacted to place the app’s use on a proper legal basis.
But if the app does what is promised, trusting technology will have played a fundamental part in controlling a pandemic. What we should avoid doing is enacting new surveillance laws in a time of crisis that actually outlive the crisis itself.
Thank you to Dr M.R. Leiser of Leiden Law School for his contributions to this article.