Dec. 14, 2020
One of the worst jobs I have ever had was made particularly bad by the micromanaging efforts of my manager’s boss. He seemed to spend all day skulking around, peering over the shoulders of junior staff to check that whatever we were doing looked like work. If he spotted someone doing something he considered untoward (usually reading the news or, on slow days, perhaps online shopping) he would come up behind them, point at the screen, wag his finger and say: “Not work!”
Sometimes it actually was work, but there was no point in arguing. It was a frustrating and corrosive environment, and not conducive to getting things done. His measure of productivity was clearly a blunt instrument and, instead of fostering a motivated workplace, he created an atmosphere of jittery paranoia and low-level resentment.
I think of him often (much more than I would like to), especially when I read anything about workplace surveillance. This term usually arises in the context of some new technology with alarming privacy implications that allows managers to track whatever employees are doing on their computers. But the concept is not a new one; the idea that people need to be constantly observed if they are to work efficiently dates back to Taylorist theories from the early 1900s about the best way to organise factory staff.
During the pandemic, there has been a renewed sense of panic about the implications of companies monitoring their employees. Most office work has been conducted online, and surveillance methods have adapted accordingly. Companies that offer remote monitoring software have reported a surge of interest in their products. Issues have been raised about things such as where the data collected from Zoom calls is stored, and which other companies it might be shared with.
The latest outcry came last month, when it transpired that Microsoft 365, a software package released in 2019 that gives managers an overall rating of their team's productivity by measuring things such as how many emails people send and who they communicate with, also allows managers to zero in on individuals. It is possible to see how much each person participates in group chats, and how much they contribute to shared documents.
Software that measures things such as what (and how fast) people are typing and what they are looking at on their screens would (or at least should) give most people the creeps. But in focusing primarily on these methods, partly because they seem new, we can miss how ingrained the instinct to watch and measure workers is.
Surveillance isn’t created by technology, but rather facilitated by it. It has been said that Covid has accelerated these practices, but perhaps the pandemic has simply highlighted the extent to which they always went on.
Employers have long correlated workers' efficiency with their visibility, and this logic has followed through to the modern workplace. As far back as 1915, a contraption called the "modern efficiency desk" (a flat metal desk that could be installed in rows) was designed so that clerks, who had previously used wooden desks surrounded by stacks of paper, were more exposed while working, and could therefore be more easily monitored.
My old boss was an extreme example, but in any open-plan office it is normal to be watched almost constantly by your superiors. In fact, one of the selling points of this layout is that it facilitates surveillance. Hence, a common experience is trying to orientate the appearance of your productivity around what you think is being measured, rather than trying to do your work to the best standard; dragging out tasks to stay late so your boss will not think you are shirking your responsibilities by leaving early, for example.
Lots of white collar jobs (law and accountancy are two examples) make employees record how they spend their time (even down to the minute) so they can bill clients. This same system is used for non-billable time too; certain things that are presented as perks (such as having key cards, clock-in systems for flexible hours, company phones that you can also use for personal communication and in-office socialising) also have monitoring possibilities built in. Meanwhile, digital forms of communication, such as Slack chats, generate an automatic record of everything people say, even in conversations that feel casual.
Away from the white collar world, Amazon workers operate under regimes of extreme surveillance, with networks of security cameras and hourly productivity goals for moving packages. And in many call centres, information is collected on everything from the length of calls and the number of call transfers, to the time people spend on their toilet breaks. This is, of course, significantly more invasive than a programme that monitors inter-office email communication, but the purpose is much the same.
All of this measuring is done in the name of maximising productivity. But the best measure of productivity is simply the quality and quantity of a person's work. Monitoring what people are doing is not the same thing as measuring their work output. Indeed, a Harvard Business Review report from earlier this year argued that needlessly monitoring employees can erode trust. It extolled the benefits of new tracking options from a manager's perspective, but stressed that not everything that can be tracked is relevant or useful; sometimes it is just a thing that can be tracked.
We are inured to the idea that professional environments have a built-in layer of surveillance, and now that this environment has merged with the home for many workers, some of these practices have started to look more extreme. But the discussion about surveillance should not start and end with the tools employers use to monitor people working from home. We should instead be asking: how necessary is any of this?
Rachel Connolly is a London-based journalist from Belfast