Wearable technologies are nothing new. Google Glass, a consumer-oriented product, allows owners to do many of the things they could do on their tablet using a pair of mock spectacles.
Then you have devices such as Fitbit, which track your activity so you can feel better about yourself as you walk home from the pub.
Likewise there are numerous wearable technologies used to monitor patients with various health conditions.
So who’s to say this kind of technology couldn’t be used in other settings? To time an employee’s smoke break, for example, or to alert the boss when someone is spending too much time chatting at the water cooler.
We are already living in the age of the “quantified self”. So it was only a matter of time before business bought in.
Workforce-management innovation company Kronos is now offering new wearable software with “tracking and communication capabilities for manufacturing and retail companies”. Pilot companies are expected to roll out the new software this year.
Big Brother is monitoring you
This may all sound a bit Big Brother-esque, but the management approach isn’t new.
Hitachi’s Business Microscope device has been around for a few years and is designed to look like an ID badge that any company might issue to staff. Embedded within the badge, says Hitachi, are “infrared sensors, an accelerometer, a microphone sensor and a wireless communication device”.
The badges can record and transmit to the powers that be “who talks to whom, how often, where and how energetically”.
Another company, Vuzix, has developed “smart glasses” with a high-definition camera in the lens to scan barcodes and provide warehouse staff with voice and visual data on stock. Even the Walt Disney World resort uses wearable technology to track staff movements and improve collaboration and customer service.
These technologies do offer benefits, but at what cost to worker privacy? As author Bob Greene notes, “Technology always wins, but victory can come with a price.”
Last May, a Pew Research Center report suggested that by 2025 there will be a “global, immersive, invisible, ambient networked computing environment built through the continued proliferation of smart sensors, cameras, software, databases and massive data centers”.
The report also said we’d all be experiencing an “augmented reality . . . through the use of portable/wearable/implantable technologies”.
Can anything stop this movement and should we even care?
Self quantification
“Wearable technologies are indeed becoming more pervasive and privacy is certainly a concern for both end-users and organisations that collect data,” says Bashar Nuseibeh, professor of software engineering at the University of Limerick (UL) and the Open University, UK.
“There is significant research in this area, some of which we do at UL. The general area of collecting and analysing information from personal sensors in wearables is often called self quantification – and we are working with the Open University in the UK to conduct studies in hospitals with doctors and patients to better understand if and to what extent they are worried about privacy, and what we can do (from a technology perspective) to help them become more privacy aware and to help them manage the personal data that is collected by the wearable devices,” he explains.
“Our work is also collaborative with both social psychologists, to understand human behaviour, and with business experts, to understand the business value of the data being collected.”
Progress/privacy trade-off
Location tracking of staff has been around for a while. “I’ve spoken to taxi drivers using the Hailo service,” says Nuseibeh.
“That has really interesting privacy implications. Hailo HQ keeps a watchful eye on what their drivers are doing, where they are, and how much they’re charging, so they certainly can’t cheat their customers.
“But taxi drivers seem to think the increase in fares they get from being part of Hailo far outweighs any perceived privacy infringement.”
This is a common trade-off in the realm of digital communications and big-data analysis: on one hand, the social and economic benefits of applying a new technology; on the other, the privacy threat, whether real or perceived.
That perception is frequently dependent on context.
Nuseibeh’s three-year project, which monitors behaviour and communications between staff at a hospital in the UK, looks at areas such as different organisational groupings and the management of doctors and nurses on the ward.
There are both vertical and horizontal group relationships being monitored.
“People in the same cohort share information differently with people above or below them,” he says. “We’re only in the first year but have realised it’s definitely a factor worth monitoring, because from this we can examine how group dynamics change over time.
“Even looking at the same group of junior doctors, you can see how trust changes as people move up and down in rank, and with it, behaviour too.”
While the immediate response to corporate and work-based staff monitoring is usually one of shock and dismay, it isn’t a simple case of employer versus employee.
If used correctly, this type of technology could benefit workers as well as improve productivity. In the hospital setting, for example, nurses and doctors working long shifts could be monitored to assess levels of fatigue.
As with most innovations in this realm, however, nothing will be said or done until something goes wrong. “I would be somewhat concerned about the direction this type of technology is going,” says Lee Tobin, a researcher at the DigitalFIRE lab at University College Dublin.
“The legal side always lags behind until something goes wrong. Tech manufacturers make new products or services, sell it to companies, and everyone thinks it’s great – until something nasty happens or someone breaks the law.
“As far as I know, data protection wouldn’t come into it, as any device used to monitor staff would come under corporate law the same way a corporate Gmail account, for example, would.”
“As both a technologist and someone trying to encourage effective privacy management, I’m kind of in the middle,” says Nuseibeh.
“What we need to do is make engineers and designers of this kind of technology understand that what they’re creating isn’t just a technological innovation, but that it also has social and ethical implications. Then we’ll find a greater balance.”