Digital surveillance of employees has gone mainstream during the pandemic, but it’s not always effective and it erodes human dignity. Governments, companies and workers need a plan for the future
Workplaces are the venue of a new social experiment – of the sort seen in a Danish hospital in 2016. Back then, doctors were made to wear infrared badges, beacons were fixed to cabinets and walls, and then everyone was told to go about their working day. The aim was to figure out whether medical secretaries could be moved away from the doctors in a planned hospital redesign. It was optimisation through surveillance.
The outcome? At first glance, the data suggested that the separation could happen. But researchers subsequently realised that the data was imperfect and incomplete. The sensors didn’t function close to MRI scanners. And the system placed too much emphasis on movement, neglecting the important interactions that happen while seated at desks.
So the researchers subsequently developed a process of “collective sensemaking” – using input from staff to correct what the machines had got wrong.
Elsewhere, the technology that’s monitoring employees can be no less wrong, but is far less collaborative. Content moderators at Facebook can be disqualified from any bonus entitlements if they are twice caught logging into their computer even a minute or two late. In Amazon warehouses, devices track staff and tell them if they’re not working fast enough. If you fall behind, you’re told to make up for it tomorrow.
For years, we thought that our jobs would be done by machines. The far more surprising reality? We’ve ended up being managed by them.
“Bossware” or “tattletech” is software that enables employers to monitor their employees so as to, purportedly, improve productivity – but it also allows them to know exactly what workers are doing and when. This data can be used to decide who is a good worker and who is a bad one. It is often collected without people’s full understanding. And if workers feel it has been used unfairly against them, they have very few legal tools available to defend themselves.
The programs available to employers most commonly offer features such as keystroke logging – recording what we type – and instant messaging monitoring. But other programs give managers the ability to see exactly what their workers are doing, either in real-time or through regular screenshots of the employee’s computer. It’s the digital equivalent of a window: totally transparent.
And the software is becoming more imaginative, too. Managers can now, if they wish, analyse the tone of written messages or the emotions indicated by facial expressions made in front of a camera – all to judge whether someone is bored by their colleagues’ presentation, or if they might soon quit their job.
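Tone analysis of this kind often reduces to simple lexicon scoring. As an illustration only – not any vendor’s actual method, and with word lists invented for the example – a naive tone check over a message might look like this:

```python
# Naive lexicon-based tone scoring: a toy illustration of how
# "sentiment" monitoring can work, not any real product's algorithm.
POSITIVE = {"great", "thanks", "happy", "good", "excellent"}
NEGATIVE = {"bored", "frustrated", "quit", "tired", "annoyed"}

def tone_score(message: str) -> float:
    """Return a score in [-1, 1]; below zero suggests a 'negative' tone."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    hits = [1 for w in words if w in POSITIVE] + [-1 for w in words if w in NEGATIVE]
    return sum(hits) / len(hits) if hits else 0.0

print(tone_score("Thanks, the presentation was great"))  # 1.0
print(tone_score("I'm bored and frustrated"))            # -1.0
```

The fragility is obvious: sarcasm, context and differences in vocabulary all break it – which is part of why such inferences make poor grounds for judging workers.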
Demand for tattletech was growing even before the pandemic began. Half of 239 large corporations surveyed in the US in 2018 were already monitoring the content of employee emails and social media accounts, as well as how employees met and how they used their workspaces.
But the pandemic really hastened these trends. With lockdowns keeping workers in their homes, out of sight – in the traditional sense – of their managers, companies have turned more and more to surveillance tech. The CEO of Prodoscore, a California-based company that distils dozens of data points on workers into a single productivity score, has claimed a 600 per cent increase in prospective customers since the pandemic began.
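Mechanically, a single “productivity score” is typically a weighted aggregation of activity signals. A minimal sketch, with signal names and weights invented for illustration – real products such as Prodoscore do not publish their formulas:

```python
# Toy productivity score: a weighted average of normalised activity
# signals, each in [0, 1]. Signal names and weights are invented.
WEIGHTS = {"email_activity": 0.3, "calendar_usage": 0.2,
           "chat_messages": 0.2, "crm_updates": 0.3}

def productivity_score(signals: dict[str, float]) -> float:
    """Collapse many activity signals into one 0-100 score."""
    total = sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)
    return round(100 * total, 1)

print(productivity_score({"email_activity": 0.8, "calendar_usage": 0.5,
                          "chat_messages": 0.9, "crm_updates": 0.4}))  # 64.0
```

A number like this looks objective, but it inherits every bias baked into the choice of signals and weights: a worker whose job involves little email simply scores low.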
This is a massive change. We were previously used to – and complacent about – the idea that delivery drivers and warehouse workers were tracked as they went about their business. But Covid has made more workers subject to tracking software. And it’s brought that software into our homes.
There are, of course, some good reasons for this technology’s popularity. During the pandemic, tools that monitor workers have helped them to work more efficiently, manage tasks and coordinate with colleagues. In warehouses, geolocation data has helped keep workers at a safe distance from one another.
But the benefits are currently outweighed by the risks. Data, as the case of the Danish hospital shows, can easily be misinterpreted. If machine learning is used to assess collected data and decide who is a good worker and who is a bad worker, then biases will likely be coded into those decisions. And when data is misused – or collected in ways workers feel is intrusive – it can erode trust between employer and employee.
“For people who are marginalised within society, for many women, for lower wage workers, for many people of colour, surveillance feels incredibly personal because they sense that they are less trusted in the workplace,” observes Gina Neff, professor of Technology and Society at the Oxford Internet Institute and a co-author of the Danish study. This really is a matter of human dignity.
Then there is the problem of “mission creep”. People may have consented to some of these developments during the pandemic in order to help their employers through a crisis, but it is very possible not only that this data will continue to be collected in more normal times, but that it will be used for purposes different from those originally agreed.
Is there protection in the law? Some, but not enough. Under Article 8 of the European Convention on Human Rights, people have a right to respect for private and family life, home and correspondence – but legal experts judge that employers’ rights to know what their workers are doing during the working day take priority.
Better protection for workers exists in the General Data Protection Regulation (GDPR), which insists that any personal data collected about an individual is used fairly, lawfully, transparently and for a specified purpose. That means employees should know what data their employer holds about them, and what it is being used for.
But unions say this is not always the case: according to Prospect, 48 per cent of workers do not know what data their employer collects about them, and 70 per cent do not think they would be involved in decisions about implementing new technology at work.
Besides, as Cori Crider, director of Foxglove, points out, GDPR “requires an individual person to stand up for themselves and go through the bother, hassle and time of putting in necessary requests and complaints”. As protections go, it asks a lot of those whom it is meant to protect.
The truth is that most existing employment laws were created in the past century to protect workers from physical harm. They weren’t designed to deal with the rapid transformation that technology has wrought upon our working lives. A new set of laws and norms is required, based on simple principles:
- Companies must make sure they fully understand the implications of the software they purchase for monitoring staff – and must share that knowledge with their employees.
- Workers must be heavily involved in conversations about how their data is used. This will provide greater transparency and fairness – and better results, too.
- We must be given the greatest opportunity to consent to and, if necessary, challenge the use of this technology. The Institute for the Future of Work recommends that “employee contracts, collective agreements and technology agreements should include explicit agreement about use of data and algorithmic systems shaping access, terms and quality of work”.
- The government should bring forward legislation that boosts workers’ rights. And the Information Commissioner’s Office, the regulator, must ensure its guide on employment practices is updated to match the world we now live in. Tech giants that have pioneered employee-monitoring capabilities should also design their systems to adequately protect workers.
- The government must prioritise a “human-centred approach” to research. In the UK, we are gifted with existing bodies – the Centre for Data Ethics and Innovation and the AI Council being just two examples – which the government can support to form a strategy for the responsible use of these technologies.
- We must retain the right to switch off. Your boss has a right to know, within reason, where you are and what you are doing in your working day. But they shouldn’t have the right to know what you are doing at 10 o’clock on a Friday night.
All of which is to say: leave your keyboards, go to the pub, and arrange a call with your managers for Monday morning. We all need to talk about tattletech.
‘The Amazonian Era: How Algorithmic Systems are Eroding Good Work’, Institute for the Future of Work, May 2021
‘Who does the work of data?’, Møller et al, Interactions, Volume 27, May–June 2020
‘Workplace digital monitoring and surveillance: what are my rights?’ Prospect
‘COVID-19: Fast-forward to a new era of employee surveillance’, Sara Riso, Eurofound
Tánaiste signs Code of Practice on Right to Disconnect, Government of Ireland
Photograph by Piotr Malecki/Panos Pictures