Managers are increasingly reliant on computer software and algorithms when assessing the performance of their staff. This is the case both in traditional forms of employment, where there is a clear employer-employee relationship, and in gig economy arrangements, which muddy the waters by attempting (sometimes unsuccessfully) to classify staff as self-employed contractors.
One of the glaring problems of reliance on computer software to make important decisions is the possibility that it has been poorly programmed or implemented, as was the case with the “mutant algorithms” exams fiasco. When it comes to determining someone’s ability to work, a faulty algorithm can destroy livelihoods for no good reason. And sometimes it can even land people in jail.
Imprisoned by buggy software
The Horizon computer system was introduced into the Post Office in 2000. At a cost of £1 billion, it was designed to digitally process a variety of tasks including transactions, accounting and stocktaking. But over the years sub-postmasters increasingly raised concerns over bugs in the system. These software errors resulted in hundreds of sub-postmasters being accused of theft, forced to repay allegedly missing money, losing contracts and even being criminally convicted and sent to prison.
A 2013 report by independent investigators Second Sight found evidence of defects in the Horizon software. Although the Post Office admitted there were problems, it defended the system, arguing at the time that “there is absolutely no evidence of any systemic issues with the computer system”. However, a successful civil claim (Bates & Ors v Post Office Ltd) brought by over 550 sub-postmasters resulted in the Post Office settling for £57.75 million, without admitting liability.
Fast forward to 2020, when the Criminal Cases Review Commission (CCRC) decided, in light of the successful civil claim, to refer for appeal the convictions of dozens of sub-postmasters who had been convicted of theft, fraud and false accounting as a result of the faulty computer system and subsequent alleged cover-ups by Post Office management. In April 2021, the Court of Appeal quashed the convictions of 39 sub-postmasters in Hamilton & Ors v Post Office Ltd [2021] EWCA Crim 577, ruling that the failures of investigation and disclosure of problems with the Horizon system were “so egregious as to make the prosecution of any of the ‘Horizon cases’ an affront to the conscience of the court.” It concluded that:
“Defendants were prosecuted, convicted and sentenced on the basis that the Horizon data must be correct, and cash must therefore be missing, when in fact there could be no confidence as to that foundation.”
In particular, the court noted that the Post Office had failed to disclose evidence about bugs in the software to the sub-postmasters’ defence lawyers, which meant that they had not had a fair trial.
In their commentary on the case, Peter Lownds and Sapandeep Singh Maini-Thompson of 2 Hare Court conclude that:
“As algorithms and machines are increasingly trusted to make decisions, prosecutors must be more attentive to the role such evidence plays in shaping the criminal liability of ordinary persons.”
But although reliance on faulty computer software played a central part in this miscarriage of justice, the subsequent decisions taken by Post Office managers were equally, if not more, critical. Lawyer and commentator David Allen Green argues:
“The problem was not so much the Horizon software but a sequence of horrible, deliberate decisions made by human beings – about whether to bring prosecutions, to contest civil cases, and to avoid the disclosure of relevant documents.”
Sacked by an algorithm
The Amsterdam District Court recently ordered Uber to reinstate six Uber drivers and pay compensation after they were deactivated by its algorithm on the basis of suspected fraud. In gig economy work, being deactivated from an app is the equivalent of being sacked by an employer.
The Uber drivers, five of whom are British, were backed by the App Drivers and Couriers Union (ADCU) and the campaign group Worker Info Exchange. The case was brought under Article 22 of the General Data Protection Regulation (GDPR) which provides for protections against automated individual decision-making. Commenting, James Farrar, Director of Worker Info Exchange said:
“For the Uber drivers robbed of their jobs and livelihoods this has been a dystopian nightmare come true. They were publicly accused of ‘fraudulent activity’ on the back of poorly governed use of bad technology. This case is a wake-up call for lawmakers about the abuse of surveillance technology now proliferating in the gig economy.”
One of the Uber drivers, Abdifatah Abdalla, lost his private hire licence from Transport for London (TfL) essentially on the basis of being deactivated by the Uber app. Following the successful case, the City of London Magistrates’ Court ordered TfL to reinstate his licence, and reportedly criticised TfL’s “willingness to accept [the evidence] Uber provided” without conducting its own investigation. Uber is applying to have the judgment set aside.
Holding algorithms to account
The two cases outlined above highlight the growing problem of reliance on computer algorithms, and some of the legal ramifications which can ensue when there is insufficient human oversight of automated decisions or findings.
Whilst computer software can help with staff management – and forms a central pillar of the gig economy – blindly following algorithmic decisions can land organisations in hot water.
It is vital that any challenges to software findings are thoroughly investigated. And algorithms should be routinely audited to ensure they are performing correctly.
The Law and Policy Blog: The Post Office case is damning, but do not blame “computer error”.
Private Eye: Justice Lost in the Post.
Stephen Mason: Electronic Evidence, Chapter 6: The presumption that computers are ‘reliable’, in particular para 6.143.