Enric Luján: “At complex times like these we must choose the lesser of two evils in terms of privacy”
We talk to privacy expert, PhD student at the University of Barcelona (UB) and member of Críptica, Enric Luján. Before we begin, he makes it clear that these are his own opinions and not those of his collective, where there are several viewpoints on personal data management in coronavirus times.
What challenges do we face in personal data management following the Covid19 health crisis?
The root problem is how a pandemic is managed democratically. I’m not very sure, but I don’t think anyone in my field knows. What democratic or privacy threshold is established during a pandemic? The General Data Protection Regulation (GDPR) of the European Union establishes that, in cases such as the present, data can be used to try to slow down the consequences in ways that we might consider abusive under normal conditions. Or at least they wouldn’t go through the ordinary protocols. The problem here is that almost anything is possible based on public interest. “Public interest” is an undetermined legal concept that has a wide range of action.
So, what do we have to be concerned about?
On one hand, I’m concerned about the lack of definition of public interest in the regulation, because it permits countries to do almost anything. But I’m also concerned about the opposite stance, based on an extremely restrictive interpretation of privacy, which consists of systematically challenging any kind of digital initiative by the public authorities that attempts to control the pandemic as a society. For me, the real debate lies in identifying which data can help improve the management of Covid19 (such as the mobility study by the INE), while firmly rejecting any digital initiative that does not actually help tackle it: data processing by the latter is a clear abuse of privacy, because any information requested must be justified by its effectiveness in stopping the virus.
What’s the midpoint, then?
The goal should be for the public authorities to define a clear digital strategy to see which data might be of use to manage the pandemic and which is of no use at all. Based on this, specific guidelines could be conveyed to the bodies responsible for developing the technological solutions.
What do you think of health self-management initiatives, such as self-diagnosis apps or tools that let you download your geolocation data to know who you’ve been in contact with, so that you can warn them if you test positive for Covid19?
I think we’ve got an ageing society. The user profile isn’t a digital activist, but more like my dad. I’m worried that we can’t get beyond a technological elite. But what do grandparents do? The problem with having focused the entire governmental response on apps and the like is that it comes up against the digital divide, especially in regions where things like smartphones are a world away. In this context, proposals like the one in South Korea would be impossible here.
But when the request is for an app to be built on open source code, you have to presume that society as a whole will not check that code. Of course, this measure is aimed at an elite, but it is an additional safeguard.
Yes, but how do we make sure that the vast majority of society is protected from the coronavirus? The only way is for the public authorities to take joint responsibility. And all this with the intrinsic risk of them ending up shirking their responsibilities. What we have at present is lockdown easing that has been programmed analogically, in timetables: just like in the Middle Ages. We need a response that is effective and that reaches most of society. How do we make this response privacy-respectful and democratic? The problem is that the most efficient action is often not the most democratic. Politics doesn’t mean doing whatever you want; it means facing contingencies that you will never tackle under ideal conditions. In an ideal world we would have had months to plan ahead and to approve everything through the appropriate formal procedures. But that wasn’t the case. I do worry about privacy, but I also think you must understand the present situation.
But the fact that the apps run on open source code is an enforceable measure, isn’t it? To make sure nobody is sharing our data with others.
The GDPR says that, in cases of emergency in the field of public health, this data can be transferred provided that it is deleted from the databases of institutions and companies once the epidemic has ended. Yes, open source code is always necessary. I imagine it made sense not to open the code initially, to try to prevent someone from discovering some kind of vulnerability, considering that the aim was to collect medical data. But at present, now that these apps have been around for weeks, it doesn’t make much sense for the code to remain closed. Just because we’re in the worst-case scenario doesn’t mean I no longer have to be alert. At complex times like these we must choose the lesser of two evils in terms of privacy.