Data privacy erosions could leave a dangerous legacy

Net Results: ‘Decisions made in haste are typically disastrous policymaking’

Right now the majority of such data is gathered, owned and sold on by private companies, with minimal oversight. Photograph: Orlando Barria/EPA

When an image circulated on Twitter last week showing colour-coded data from more than a million smart thermometers plotted on a map of the United States, it created a sensation.

Areas with fever-level readings from the thermometers, highlighted in orange and red, stood out against cooler colours for normal temperatures. Fever zones closely matched many known coronavirus clusters.

Except, dramatically, in one place: Florida. Most of the state was showing abnormally high temperature readings from the Kinsa smart thermometers, whose users can opt to upload readings automatically to the company. Stripped of personal identifiers, the data is then provided to researchers.

The image suggested an alarming – and still not officially detected – level of infection. Soon afterwards, such a rise was confirmed in medical reports to the US Centers for Disease Control and Prevention (CDC).


The incident encapsulates the possibilities and tensions inherent in employing new technologies and services – including mobile phone location data, apps and smart device sensors – to gather and exploit personal data.

In a pandemic such uses could be beneficial. If an outbreak can be spotted in real time, ahead of lagging official reports, virus-suppression measures could be launched faster and regional care resources bolstered.

Or such widespread surveillance could be abused, offering governments new forms of intense population monitoring. China's use of such technologies, rolled out at scale as it battled Covid-19, poses many questions for less authoritarian societies. Where is the balance between privacy and social benefit? And, equally important: does the use of such technologies really have a significant effect?

Private ownership

At least as worrying is that, right now, the majority of such data is gathered, owned and sold on by private companies, with minimal oversight – typically, far less scrutiny than if the same data were gathered by governments. In the US and Europe, personal data disappears into private companies of all sizes, from small start-ups to multinationals, and is moved between legal jurisdictions.

In the US, a federal data protection law is urgently needed, even as governments partner with private surveillance giants. The Wall Street Journal reports that the CDC is working with Palantir, a controversial Silicon Valley company associated with surveillance analysis technologies. Palantir and Clearview, the equally controversial facial recognition surveillance company, are talking to many US state governments.

And Verily, the company owned by Google parent Alphabet and slated to provide the online Covid-19 testing service touted by Donald Trump, is under scrutiny, as the testing data it gathers is shared with Google and other third parties. Users must also have a Google account to use the proposed national testing service (a forced opt-in that is illegal in the EU).

Europeans have some protections under the General Data Protection Regulation. Companies outside the EU are supposed to apply GDPR-compliant protections to data transferred out of Europe, but whether this is properly done under agreements such as Privacy Shield, the US-EU data transfer arrangement, continues to be challenged in EU national courts and the Court of Justice of the European Union.

But some GDPR protections can be waived in the current virus emergency. A letter this week to the journal Science, signed by leading researchers, notes that aggregated mobile phone location data gathered from global operators could be important in fighting Covid-19. On Monday the European Commission permitted such uses.

Not so fast

Governments must act fast as they face the challenges of this pandemic. But are they moving too fast?

"Now that we face a public health emergency, there are understandable questions as to whether technological solutions might be useful to slow the spread of the virus and to identify persons who are either infected or who are in noncompliance with public health directives," says Elizabeth Joh, a professor of law at the University of California, Davis.

“The problem is that decisions made in haste and with the justification of emergency are typically disastrous policymaking. If, for example, it seems useful now for emergency workers to have vital signs monitored so that they can receive appropriate care, what happens after this emergency is over? Should Americans be convinced to sign up for apps that would alert them to whether they have been in contact with an infected person (as has been done elsewhere)? Giving up our privacy as a concession to a public health emergency may seem understandable in the moment, but that may pave the way for long-term privacy erosions once the emergency has passed,” says Joh, who has particular expertise in privacy, policing and surveillance.

She notes the lack of comprehensive regulation for privacy protections in the US “even as governments at all levels adopt more surveillance technologies”.

This is equally a concern for us in Europe, especially as many of the same Silicon Valley data giants are involved. Concessions made now in the name of public health should not slide surreptitiously into a new future norm without wider debate – and clear limitations on corporate control of our data.