In the wake of the overturning of Roe v Wade, which enshrined a woman’s right to an abortion, many people in the United States, particularly women, are discovering the enormous personal exposure and vulnerability that comes when a nation doesn’t have a federal data protection and privacy law.
Millions are realising just how tracked, datafied, analysed, quantified, and marketable they are, and the real-world implications of how seemingly inconsequential data, taken in an isolated app or service, can suddenly become a tool of persecution and prosecution.
All that longtime corporate surveillance becomes not just a joke about irritating cookie permissions, but your life, exposed and for sale, available to just about anyone who might want to know about you, your personal habits, your health, your intimate body functions, your political leanings, your decisions, your actions.
That data set might include location information about movements and cross-state travels taken from devices, apps, shared ride services, ticket purchases and precisely timed geographic locations triangulated by a phone to mobile phone masts.
Add in health data gleaned from the apps you use (period, ovulation and sex trackers are only a start), personal relationship, gender, dating and lifestyle data inferred from social media and apps. Your purchases, tracked. Your political leanings and who and what you donate to. What you read. The topics you search. They’re all for sale.
A buyer — maybe an individual, an organisation, or state or law enforcement authorities — can specify all sorts of detailed, highly nuanced parameters. Perhaps women aged 13-40, likely or known to be pregnant, travelling over a state border, in the vicinity of this clinic or that advice centre, linked to other people. Add to this corporate surveillance a dash of facial recognition, or licence plate recognition, and maybe some state or federal records.
But then again, someone might not need that official data, or even a clear attachment of a name to a collection of data, because researchers have shown many times that it isn’t much of a challenge to de-anonymise data and reconnect it to a specific individual. Most of the time, in the US especially, you can’t easily or feasibly opt out of such corporate and state surveillance. Without a federal data protection and privacy law, there are few restraints on how data is gathered and used.
If you are a typical millennial — the age at which you might be most likely to seek abortion services, or support someone in that situation — you’ve been tracked since childhood. You are in innumerable, endlessly reconfigurable data sets.
But it’s not just millennials. And it’s not just Americans. Most of us have lived this data-producing, carefully parsed life for two decades.
Thanks to the General Data Protection Regulation (GDPR), which grants specific privacy and data protection rights, plus a raft of other protective European legislation in place or pending, Europeans are in a far better position, though much could be improved.
Indeed, cases brought to national data protection regulators highlight many functional, regulatory and compliance holes, and that’s before you start to enumerate the bits of proposals and legislation that go too far, or target the wrong things, or concede too much to companies (generally in the name of “innovation”, or, if you’re in the UK right now, “turbocharging the digital economy”).
Some US states, such as California, have implemented their own data protection and privacy laws. California offers many of the same protections as GDPR. Notably, this is the state in which the global technology industry as we know it was born and nurtured, and which remains its favoured home.
But the very states that have better data protections are also those that maintain strong abortion protections (as long as conservatives fail in attempts to enact a full federal criminalisation of abortion).
In the wake of the Roe decision, many well-known technology companies swiftly made it known that they would support employees in more restrictive states who might need abortion services, by giving them time off and funding for travel and healthcare.
But, they’ll still gather, analyse, categorise and sell the data of employees, and all the rest of us who use their data-guzzling technologies and services.
Companies will not sacrifice their invasive business model when a small payout — microscopically small set against their data-driven profits — can virtue-signal a stance that the same companies undercut at every turn with their business modus operandi. I can’t even find a metaphor for how ridiculously meaningless this gesture is.
No one should mistake this huge societal data problem as relevant to just this one area of health and personal autonomy. Every woman can be profiled. Every woman, all her most intimate and personal attributes and actions, is for sale. So is every man. Children, too.
Yes, the theoretical implications of surveillance capitalism and surveillance governance have been known for some time. But it’s only when, with horrific clarity, more of us see the personal ramifications for ourselves and those we care about, that we might decide it is time, at last, to do something meaningful about it.