Privacy is dead, but could this cloud have a silver lining?

How do we bias new technology’s impact towards the good?


When I was a child, my mother counselled me: “Don’t trust people with your secrets because they will surely use them against you”. Given her experience in Auschwitz during the second World War, that was sound advice at the time, but now we live in an age where technology seems to have privileged access to our intimate worlds. Search engine providers amass detailed records of our thinking processes. Smartphones record our every move and – as we’ve seen with a recent WhatsApp hack – can be turned with ease into bugging devices. Meanwhile, “smart” security cameras can now automatically pick individuals out from a crowd.

We have long since passed the point where anyone can expect anything they say or do to remain private. When put so bluntly, this is a deeply unsettling thought. But is it really news? And might there be unexpected benefits to be found in a less private world?

Often, we forget that privacy is and always has been hard to come by. Just a few decades ago, it would have been easy for the local postmaster to keep close tabs on who was corresponding with whom. Telegraph operators had to read every message they sent and received, and telephone operators could choose – or be commanded – to listen to any of the calls they connected. In small communities and villages, everyone knew each other’s business.

As cities have grown and our webs of communication have spread wider, it seems to me that we sometimes confuse privacy with anonymity. We’ve been tricked into believing that our privacy is protected because people do not know who we are. But that is not really true. Online environments have in fact made it ever easier to unearth information about other people. Many secrets can also be taken from us without us knowingly revealing them.


Consider the recent algorithm developed at Stanford University that can distinguish between pictures of gay and straight men with an accuracy of over 80 per cent. It is chilling to consider what coercive regimes, or the authorities anywhere homosexuality remains illegal or taboo, might do with this technology. In a world where our actions, both online and off, are increasingly monitored and analysed, how free are we to speak our minds and challenge authority?

These are all causes for concern. But in the clamour to condemn technologically mediated “snooping” – whether it is conducted by governments or businesses – might we sometimes be too quick to forget the great benefits that can arise from sharing our personal data and pooling our knowledge?

When you delve a little deeper into this topic, as I do in my new book Make, Think, Imagine: Engineering the Future of Civilisation, you soon see that this is an area where none of our questions have simple answers. Communications technology, like many other engineered products, is and always has been a double-edged sword. So the urgent question at the crux of my book is this: when both constructive and destructive uses of a new technology are possible, how do we bias its impact towards the good?

Much better than ditching engineering advances entirely, or stripping them of their power, is finding ways to nurture the positives, whilst minimising the risks of abuse. But how we choose to wield our double-edged swords is always up for debate, with different people and different societies reaching very different conclusions.

As chairman of the Crick Institute, one of Europe’s leading biomedical research hubs, I see firsthand the great potential of sharing some of the most personal data of all: our genome sequences and health records. Researchers think that we will never make sense of the human genome unless we pool all the relevant data we can. This is what Genomics Medicine Ireland and several other big projects around the world are trying to do. The problem is, many of these scientists are worried that fears around privacy could stymie their efforts. If that happens, we will lose out on the next generation of drugs and healthcare that will allow us all to lead longer, healthier lives.

In China, a country I visit regularly, I see a society with yet another very different approach to sharing and privacy. The government there is in the process of developing a “social credit system”. This will monitor personal data from many sources, ranging from people’s purchasing habits to their credit history and even the content of their social media feeds. Using an algorithm, it will then assign every citizen a single, publicly visible rating, designed to reflect their trustworthiness.

Many worry that this will be used to stamp out free speech and create a bland, conformist society. But the same scheme could also dramatically reduce crime, and it is meant to provide an objective means of gauging trust, which, ultimately, is the basis of all co-operation and business.

But if schemes of this kind are to make a positive impact overall, those who design and implement them should remember that trust must run in all directions. Society must be reassured that the algorithms that judge us are not inexplicable “black boxes”. This is a great concern of the Co Antrim-born philosopher Onora O’Neill, who warns that people are “being asked to place and refuse trust in complex systems, rather than in something they understand pretty well”. The end result is that we find it very difficult to place our trust intelligently. This is part of the reason privacy has been further eroded in recent years. How many of us actually read the long pages of impenetrable legal jargon presented to us before quickly tapping “Agree” and downloading new apps and updates for our smartphones and computers?

Amidst today’s widespread negativity about the corrosive effect of network technologies, there are clear signs that the world is waking up to the urgency of the issues at hand.

Regulators, too, are starting to catch up. It is encouraging that governments in Dublin, Brussels and beyond are working hard to improve their technological literacy. They should, however, avoid knee-jerk responses. If we break our communications networks, hinder innovation or outlaw “big tech” as a whole, we will surrender the good as well as the bad. The networks that cause so much indignation are the same ones that allow us to contact colleagues and loved ones. They also support democracy, connect marginalised individuals and provide people in communities from Donegal to Djibouti and Dhaka, and everywhere in between, with unprecedented access to education, entertainment and economic opportunity.

And although it is still fashionable to criticise them, there are indications that Facebook and its ilk are facing up to the huge responsibility they shouldered when they decided to accumulate so much personal data from their billions of users. It is too early to judge their effectiveness, but current attempts to remove hate speech and fake news from their platforms are moves in the right direction.

Finally, we should remember that we are not passive players in the course of innovation. Even if current fears about technology are focused on the retreat of privacy, now is the time to be open and speak our minds; we needn’t always “Agree”. Progress should always be a back-and-forth conversation, not something imposed from on high. If sense prevails, our voices will be heard by innovators, regulators and businesses alike.

Thanks to the engineering that connects us, so much is possible. We must use it to work together and think more imaginatively, to help build the world we want to live in.

John Browne was group chief executive of BP from 1995 to 2007. His new book Make, Think, Imagine: Engineering the Future of Civilisation is out now from Bloomsbury, price £30