When your face is your boarding pass, you are holidaying with Big Brother

Facial-recognition technology poses a serious threat to our human rights and civil liberties

Imagine moving through an airport without producing your passport or boarding pass at key touch points. Imagine that you can get through check-in and passport control just by showing your face. Machines will read and analyse your face, direct you to your departure gate, and organise your duty-free shopping and boarding. Instead of waiting in long queues, you move swiftly through expedited lanes. Sounds good? Then imagine the same process on arrival, as you proceed through another set of passport controls before collecting your luggage. In fact, why not use your face for a speedy car hire and hotel check-in before heading to the beach or pool? All this is possible as a result of facial-recognition technology.

Such technology has been introduced at Shannon and Dublin airports, at Heathrow and at Amsterdam Schiphol, and it is being trialled at airports in many popular holiday destinations. Speed, security and an enhanced travel experience are the main selling points of this end-to-end journey. But facial-recognition technology comes at a price. It is a biometric convenience trap that promises expediency in exchange for something very precious: our privacy and our fundamental rights. When our faces become our boarding passes or identity cards, we are holidaying with Big Brother.

Surveillance technology is becoming embedded in registered traveller schemes, which turn airports into biometric entry-exit points that vow to deliver seamless travel but track us domestically and internationally, from the moment we plan our holidays until we arrive at our destination.

Social media

Facial-recognition technology is a key component of the biometric airport. Driven by artificial intelligence, it scans our faces and compares those scans against existing image databases. Some of these databases hold static images, such as passport photos, driving-licence photos, or images held by police and immigration services. Images can also be scraped from our social-media accounts, or captured live in the street, the shopping centre, the supermarket, or at public gatherings such as demonstrations and sporting events.
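
To make the matching step concrete: a typical system reduces each face to a list of numbers (an "embedding") and declares a match when a live scan sits close enough to a stored template. The sketch below is purely illustrative: the embedding function is a placeholder, and the names and match threshold are hypothetical, not any vendor's actual system.

```python
from typing import Dict, Optional

import numpy as np


def embed_face(image: np.ndarray) -> np.ndarray:
    """Placeholder: a real system runs a trained neural network here to
    turn a face image into a fixed-length vector (an 'embedding')."""
    raise NotImplementedError("stand-in for a trained face-embedding model")


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Score how alike two embeddings are: 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(live_embedding: np.ndarray,
             database: Dict[str, np.ndarray],
             threshold: float = 0.6) -> Optional[str]:
    """Compare one live scan against every stored template (passport photos,
    scraped social-media images, and so on) and return the closest identity
    that clears the threshold. The 0.6 threshold is illustrative only."""
    best_name, best_score = None, threshold
    for name, stored in database.items():
        score = cosine_similarity(live_embedding, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means "no match found"
```

Every accuracy worry discussed below, from biased training data to false matches, ultimately comes down to how the embedding model and that single threshold behave: set the threshold too low and innocent travellers are misidentified; set it too high and genuine matches are missed.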

Proponents of facial recognition argue that airports are not private spaces, and that we cannot claim a right to privacy when we enter their premises or use their services. But this argument is misleading. It ignores how facial-recognition systems become part of transnational surveillance networks that extend well beyond airport security and that connect the dots of our biometric and digital identities.

This tracking technology poses a very serious threat to our human rights and civil liberties. It turns fleeting appearances in public spaces into a durable presence in image databases, without our full understanding, without our informed consent, and without due regard for whether its use is proportionate and necessary. It normalises pervasive surveillance practices in public spaces. Such intrusive technology creates a chilling effect that stifles or even penalises nonconformist modes of appearance or behaviour. This is particularly worrying when biometric data – our faces – are shared with destination countries that criminalise such nonconformist behaviour.

There are also long-standing worries over the gender and racial bias of facial-recognition technology. Research has shown that it works best on light-skinned and male faces, because the databases used to train the technology are populated predominantly with images of light-skinned, male faces. Such bias raises concerns over the technology's accuracy. It could lead to embarrassing hold-ups or detentions at airports if our faces are not recognised accurately. It could lead to discrimination against, and harassment of, female holidaymakers and passengers with darker skin. It could also lead to security lapses, especially when software-generated decisions are accepted as accurate with limited or no human oversight.

Law-enforcement agencies

We should also be concerned about the human rights records of the companies that develop and sell facial-recognition technologies. For example, Amazon sells facial-recognition software to law-enforcement agencies such as US Immigration and Customs Enforcement. Only recently, Microsoft pulled the plug on its MS Celeb database, which contained more than 10 million images of some 100,000 individuals, including images of well-known critics of big tech companies. And Google was previously involved with the US military's Project Maven, which sought to use machine learning to analyse drone footage.

Facial-recognition technology facilitates surveillance by stealth, conducted by software programs whose accuracy is, at best, tenuous. Unless we ban its use, holidaymakers and other travellers will find it increasingly difficult to escape the mission creep of this technology. So remember: when you pack the sun cream, Big Brother may travel with you.

Birgit Schippers is a senior lecturer in politics at St Mary's University College Belfast