An algorithm is simply a set of rules that determines an automated response to a problem.[1] However, when roughly 80% of the programmers behind these algorithms are men of particular racial backgrounds,[2] the perpetuation of bias becomes almost inevitable. The problem is magnified when there is little to no legislation requiring algorithms to be tested for bias.
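To make that definition concrete, here is a minimal sketch in Python of an algorithm in this sense: one fixed rule producing an automated response. The rule and its threshold are hypothetical, invented purely for illustration.

```python
# A minimal, hypothetical example of "a set of rules that determines
# an automated response": one fixed rule, applied to every applicant
# with no human judgement involved.

def loan_decision(credit_score: int) -> str:
    """Apply a single hard-coded rule to an applicant's credit score."""
    return "approve" if credit_score >= 650 else "deny"

print(loan_decision(700))  # approve
print(loan_decision(600))  # deny
```

Whoever chooses that threshold chooses the outcome for every applicant near it; the rule itself never questions the choice.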
The broad use of automated algorithms and the magnitude of their effects mean that a single programmer's bias can be amplified to an unprecedented extent.
There have been countless examples of the dangerous ramifications of these biases. For example, despite its futuristic and technological appearance, Google's innovative speech recognition feature had to be programmed by humans. The algorithm is trained on real voices and develops by recognising patterns and organising data accordingly. As men were the primary people programming and training this algorithm, the feature ended up better at recognising and responding to male voices, which formed the norms and patterns that determined its decisions.[3]
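A toy sketch can show how this happens. The numbers and the matching rule below are assumptions for illustration, not Google's actual system: the "recogniser" simply accepts a voice if its pitch is close to something it saw in training, and the training set skews heavily male.

```python
# Toy recogniser (not a real speech system): accept a voice only if it
# resembles a training example. Pitch values in Hz are illustrative.
training_pitches = [110, 118, 124, 130, 136, 142, 205]  # six male voices, one female

REJECT_DISTANCE = 20  # Hz; an arbitrary "too unfamiliar to recognise" cutoff

def recognise(pitch_hz: float) -> str:
    """Match a new voice against the nearest training example."""
    nearest = min(training_pitches, key=lambda t: abs(t - pitch_hz))
    return "recognised" if abs(nearest - pitch_hz) <= REJECT_DISTANCE else "not recognised"

print(recognise(127))  # recognised: close to several male training voices
print(recognise(240))  # not recognised: far from the lone female sample
```

Nothing in the code is malicious; the gap comes entirely from what the training data leaves out.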
Similarly, racial bias is overwhelmingly present in search functions and facial recognition technology. Computer scientist Joy Buolamwini found that the facial recognition features of some of the world's biggest tech companies were better at identifying lighter-skinned faces than darker-skinned ones.[4]
Beyond identifying faces and voices, algorithms are also instrumental in determining things like credit scores and healthcare eligibility.[5] When the biases of programmers shape algorithmic decision-making to such a life-altering extent, society finds itself facing new forms of injustice that it may not yet have the procedures or frameworks to tackle.
A recent MIT Technology Review article told several stories about individuals whose low socio-economic status was either perpetuated or produced by biased algorithms. One case revealed the devastating effects that credit scoring algorithms can have on individuals, creating a snowballing effect on poverty. Beyond this, childcare eligibility, medical aid and housing benefits, all of which disproportionately affect people of low socio-economic status, also depend on such algorithms, which are vulnerable to both fault and bias in their programming.[6]
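That snowballing dynamic can be sketched as a simple feedback loop. Every number below is invented for illustration: a lower score raises monthly costs, higher costs cause missed payments, and missed payments lower the score further.

```python
# A hypothetical feedback loop between a credit score and the costs it drives.
score = 620
for month in range(1, 7):
    monthly_cost = 500 + max(0, 650 - score)  # a worse score means higher costs
    missed_payment = monthly_cost > 520       # the budget only stretches so far
    if missed_payment:
        score -= 15                           # each miss drags the score lower
    print(f"month {month}: score {score}")    # 605, 590, 575, 560, 545, 530
```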
In one case, a woman with a disability who had been entitled to medical aid suddenly had her assistance cut without explanation. When her lawyer challenged the decision in court, it emerged that it had been made not by a person but by an automated algorithm. The healthcare representatives could not answer the questions the lawyer raised in response, and no person could be held accountable for the injustice. Similar eligibility cuts have been common, each tracing back to a decision made by an algorithm programmed by humans.[7]
The most concerning aspect of this is that while some of these algorithms were faulty or malfunctioning, others were behaving exactly the way they had been programmed to…
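A hypothetical eligibility rule makes the point. Nothing below is broken; the fields, cutoffs and hours are invented, but the behaviour, a hard threshold cutting assistance with no explanation and no human in the loop, mirrors the cases above.

```python
# A hypothetical benefits rule, working exactly as written.
def weekly_care_hours(assessment_score: int) -> int:
    """Allocate weekly care hours from an assessment score."""
    if assessment_score >= 60:
        return 40
    if assessment_score >= 40:
        return 20
    return 0  # below the cutoff, assistance is cut entirely

# A one-point change in the score, perhaps from a re-keyed form,
# silently removes all assistance, and no person made the call.
print(weekly_care_hours(40))  # 20 hours
print(weekly_care_hours(39))  # 0 hours
```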
The problems that cases like these raise are ones we have never had to face before, and because of this they are ones we do not yet have the framework to confront. Most of the public outside the IT world do not even fully understand how algorithms operate or what they affect. Further complicating this is the fact that private companies are not required to give the public access to the factors used to program their algorithms.[8]
This obscurity means that when it comes to court cases involving algorithms, lawyers like the one in the medical aid case do not have the tools to establish justice. If they are looking to confront the person responsible for an injustice, it is unclear where they should look. Is it the fault of the coder, the distributor or the system itself?
If those seeking justice have no access to the programming factors, the technical information or even the vocabulary surrounding the algorithms that make decisions for the public, their pursuit of equality may well be in vain.
Algorithms, and automation more broadly, have immense potential for good in society. Their innate lack of humanness gives them a capacity for objectivity that humans could never achieve. In many cases algorithms have worked effectively to promote diversity and reduce discrimination, even as many others have done the opposite.[9]
However, as it stands, the lack of transparency, information and legislation surrounding algorithms leaves room for fault and misuse. For algorithms to play the role we have given them in society fairly and effectively, the systems and the people around them must have the knowledge and, in many cases, the self-awareness to ensure they work for everyone.
While algorithms pose great risks, they also hold great potential; the rest of society simply needs to catch up.
______________________________________________________________________
Michael McQueen is a trends forecaster, business strategist and award-winning conference speaker.
He features regularly as a commentator on TV and radio and is the bestselling author of 8 books, the latest being "The Case for Character". For more information on Michael's speaking topics, visit michaelmcqueen.net/programs.
______________________________________________________________________
[1] Moya, G 2021, ‘Algorithmic racial and gender bias is real. The California State Legislature must act,’ The Fresno Bee, 13 January.
[2] Pozniak, H 2020, ‘The bias battle: how software can outsmart recruitment prejudices,’ The Guardian, 23 December.
[3] Analytics Insight 2020, ‘Making algorithms unbiased in today’s gender biased world,’ 26 December.
[4] Analytics Insight 2020, ‘Making algorithms unbiased in today’s gender biased world,’ 26 December.
[5] Hao, K 2020, ‘The coming war on the hidden algorithms that trap people in poverty,’ MIT Technology Review, 4 December.
[6] Hao, K 2020, ‘The coming war on the hidden algorithms that trap people in poverty,’ MIT Technology Review, 4 December.
[7] Hao, K 2020, ‘The coming war on the hidden algorithms that trap people in poverty,’ MIT Technology Review, 4 December.
[8] Hao, K 2020, ‘The coming war on the hidden algorithms that trap people in poverty,’ MIT Technology Review, 4 December.
[9] Pozniak, H 2020, ‘The bias battle: how software can outsmart recruitment prejudices,’ The Guardian, 23 December.