Bias: the hardcoded truth about technology

By Tom Moore

As the thought processes of the human mind advance, so does the technology built with them. This has been the pattern of our evolution alongside technology for decades, allowing us to travel to places only written about in science fiction and to grasp possibilities that could never have been dreamed up in the first place. But as these technologies are built ever more closely to the specifications of their designers, the faults those designers subconsciously carry travel with them. Conscious and unconscious biases around age, race, gender and disability are being hard-coded into independently “thinking” algorithmic applications that dictate and govern the lives of millions of people around the world.

In today’s society we commonly find ourselves being governed by a few lines of code uploaded to a server somewhere in the world, almost as if these commands came from some unquestionable higher being. Psychologists often refer to the Turing Test when describing this phenomenon.

The Turing Test, devised by Alan Turing in 1950, assesses the “humanness” of a computer in a blind test against a human control. To find out more, visit turing.org.uk.

With the primal instinct we humans possess for forever seeking the most efficient way of doing things, computers are very quickly becoming stand-ins for menial and sometimes tedious tasks. The common opinion is that they do a far better job of these tasks, but is this really true? In her book Weapons of Math Destruction, Cathy O’Neil describes one such human stand-in, referring to it as one of the “weapons of math destruction” (WMDs) that could one day bring our society to its knees, much as an actual chemical or nuclear bomb would.

The software she refers to is an automated job-application screening system designed by the workforce management giant Kronos. The software is quickly becoming commonplace among large companies wanting fast results at low cost. It achieves this by processing hundreds of applicants very quickly and supposedly returning the best people for the job to human hands for final analysis. The part of the process in question is the way it comes to its conclusion about who gets through and who doesn’t.
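
To give a sense of the shape of such a filter, here is a minimal, entirely hypothetical sketch in Python; the “personality_score” field, the cut-off value and the applicants themselves are invented for illustration and have nothing to do with Kronos’s actual system.

    # A hypothetical screening pipeline: score every applicant, red-light anyone
    # below a hard-coded cut-off, and hand only the survivors to a human.
    applicants = [
        {"name": "A", "qualifications": 9, "personality_score": 0.35},
        {"name": "B", "qualifications": 4, "personality_score": 0.80},
        {"name": "C", "qualifications": 7, "personality_score": 0.75},
    ]

    PERSONALITY_CUT_OFF = 0.5  # a designer's choice, baked into the code

    def screen(applicant):
        # anyone under the cut-off never reaches human eyes, however well qualified
        return applicant["personality_score"] >= PERSONALITY_CUT_OFF

    shortlist = sorted(
        (a for a in applicants if screen(a)),
        key=lambda a: a["qualifications"],
        reverse=True,
    )
    print([a["name"] for a in shortlist])  # applicant A is silently dropped

Run as written, applicant A, the best qualified of the three, never reaches human review; that is the shape of the problem O’Neil goes on to describe.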

O’Neil describes the documented events that took place when a young man by the name of Kyle Behm applied for multiple jobs that required applicants to go through the Kronos screening system. “Kyle didn’t get called back for an interview. When he inquired, they explained to him that he had been “red-lighted” by the personality test he had taken when he applied for the job.” (O’Neil, 2016) So what’s wrong with this? Kyle was a straight-A student and should have had no trouble getting these low-pay, low-requirement jobs. But shortly before taking the test, Kyle had been diagnosed with bipolar disorder.

Kyle was being discriminated against outright because of his mental illness. The test disregarded him because the data it was given pointed towards him having a disorder such as bipolar, concluding that he would be no good, even though he was perfectly qualified for the job. At the time of writing, the Behm family are in the process of suing the companies that turned Kyle away on the strength of the Kronos screening system. Kyle’s situation poses questions about our own unconscious understanding of what is right and what is wrong.

To this day, women are still discriminated against because of the caveman thinking that they are somehow at a disadvantage. This is very much borne out by the tech industry, which employs only a handful of women in comparison with its large population of men. Yet a study undertaken by California Polytechnic University found that, on average, women are actually better at coding than men, stating that “their contributions are more likely to be accepted than men’s – but only if their gender isn’t obvious.” (Misener, 2016)

The data in the study was collected from GitHub, an open-source development website with a meritocratic culture in which good ideas are adopted and bad ones are left behind. When the researchers delved into individual profiles, they found that on average women’s submissions were more widely accepted than men’s, but only among women who did not publicly advertise their gender. Those who did found that “their contributions were rejected more often” (ibid). The study illustrates that not only are there few women in the industry, but that those who are there are actively being held back. Eliminating skilled women because of their gender, disregarding their ability, is not only counterintuitive and discriminatory but also detrimental to the final outcome of a group or community.

This process of discriminating by gender and missing out on untapped potential is nothing new. In 1997 a paper was published describing how, once the top five symphony orchestras in the US started blind auditions, screening applicants from behind a curtain, the proportion of women in those orchestras rose from around 5% to 25% (Goldin and Rouse, 1997).

In a TEDx talk, Joy Buolamwini described her experience as a black woman with facial recognition software, having to don a white mask just to get the computer to recognise her face. She dismissed the clear bias, “thinking someone else would fix it”, and carried on with her studies in computer science. A year later she found herself on the other side of the world in Hong Kong on a business trip, where she once again came into contact with facial recognition software. It turned out to be running the exact same code, and once again it would not acknowledge the existence of her black face (Buolamwini, 2016).

So what is actually going on behind the screen? It is not the computer’s fault that this happens, but the fault of the programmers who design the software. For a technology like this to tell you what something is, it first has to be taught what that thing in front of it is. Another example came when Google’s image-recognition software failed to acknowledge the existence of African American faces: “When it was initially rolled out, it tagged a lot of black faces as gorillas.” (Reese, 2016)

Google was at fault firstly for not putting together a racially inclusive team of programmers, who in the early stages of development were testing the software on themselves, and secondly for not feeding the software training data sets covering a range of races, making it impossible for the application to know what it was seeing.
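
As a rough illustration of that second failure, here is a minimal, hypothetical Python sketch with no relation to Google’s real code: a toy “recogniser” trained only on examples drawn from one narrow group simply fails on samples from a group it has never seen.

    import random

    def make_samples(centre, n):
        # hypothetical two-number descriptors of a face, clustered around `centre`
        return [(centre[0] + random.gauss(0, 0.5),
                 centre[1] + random.gauss(0, 0.5)) for _ in range(n)]

    # a narrow, unrepresentative training set drawn from a single group
    training_faces = make_samples((2.0, 2.0), 200)
    centroid = (sum(x for x, _ in training_faces) / len(training_faces),
                sum(y for _, y in training_faces) / len(training_faces))

    def looks_like_a_face(sample, threshold=1.5):
        # the recogniser only accepts what sits close to its training data
        dist = ((sample[0] - centroid[0]) ** 2 + (sample[1] - centroid[1]) ** 2) ** 0.5
        return dist < threshold

    seen_group = make_samples((2.0, 2.0), 100)    # resembles the training data
    unseen_group = make_samples((6.0, 6.0), 100)  # absent from the training data

    print(sum(looks_like_a_face(s) for s in seen_group), "of 100 recognised (seen group)")
    print(sum(looks_like_a_face(s) for s in unseen_group), "of 100 recognised (unseen group)")

Broaden the training set to cover both groups and the second number climbs towards 100; that is the intuition behind demanding more varied training data.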

Unless you are a company intent on maximising efficiency to minimise cost, bias in technology is certainly not a good thing. AI systems running hard-coded software, hard-coded meaning the rules are fixed into the program itself and extremely hard to change, are currently in charge of life-changing decisions that were once made by humans themselves: dictating loan decisions, job opportunities and, in the case of the American justice system, even jail sentencing. In the first half of her book, Cathy O’Neil describes how software models are now used to suggest likely sentences for convicted criminals, based on data collected from previous trials. Precious years of human lives are now being thrown around by lines of code, but what can we do about it?

The trouble is that software systems are, at their foundation, built on a kind of bias: the building block of code, binary, is itself a yes or a no, an action or no action at all.
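
As a trivial illustration of that foundation, the short Python snippet below prints the hard yes/no bits behind a handful of characters.

    # Every character a computer stores bottoms out in bits:
    # each bit is a hard yes (1) or no (0), with nothing in between.
    for ch in "bias":
        print(ch, format(ord(ch), "08b"))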

However, the underlying problem with independently thinking machines carrying bias is that they produce outcomes that are not appropriate for the context in which the system is used. Programmers need to be aware of who they are building a system for, perhaps even more than of the final goal the system is to achieve. Instead of maximising efficiency by reusing old code and old data sets that are outdated and saturated with biases towards the original end user, they should build new architectures that embody new ways of thinking, while still resting on the pseudo-biased, yes-or-no building blocks of modern computer architecture.

Thankfully, the world is not yet doomed by the WMDs secretly pulling the strings in the background. Standards bodies such as the IEEE (Institute of Electrical and Electronics Engineers) Standards Association have recently acknowledged the threat of biased software and are producing documents such as the Ethically Aligned Design manifesto, which details design principles, data-collection methods and best practice to ensure that the technologies of tomorrow remain free from bias (IEEE, 2017). The 250-page document covers subjects such as holding the appropriate institutions accountable should biased practices persist in technology.

In conclusion, bias is currently ripping through our software-driven lives, possibly dictating the actions we take as individuals, and not necessarily for the better. A change is needed in the minds of individuals to recognise that these things are happening and that they are detrimental to large groups of people. Just as we humans have evolved our thinking to include one another within society, code now needs to be written to follow the same logic.


Bibliography

Neel, A. (2017). Woman working by a window [online image]. Available at: https://unsplash.com/photos/ute2XAFQU2I [Accessed 3 December 2017].

O’Neil, C. (2016). Weapons of Math Destruction. 1st ed. Great Britain: Allen Lane.

Svidras, J. (2017). Intel 8008 [online image]. Available at: https://unsplash.com/photos/e28-krnIVmo [Accessed 3 December 2017].

Misener, D. (2016). Study raises questions about gender bias in the world of coding. CBC News. Available at: http://www.cbc.ca/news/technology/study-raises-questions-about-gender-bias-in-the-world-of-coding-1.3450186

Neel, A. (2017). FOLLOW YOUR PASSION [online image]. Available at: https://unsplash.com/photos/QLqNalPe0RA [Accessed 3 December 2017].

Goldin, C. and Rouse, C. (1997). Orchestrating Impartiality: The Impact of “Blind” Auditions on Female Musicians. National Bureau of Economic Research. Available at: http://www.nber.org/papers/w5903

Buolamwini, J. (2016). Code4Rights, Code4All | Joy Buolamwini | TEDxBeaconStreet [online video]. 13 December 2016. Available at: https://www.youtube.com/watch?v=lbnVu3At-0o [Accessed 11 December 2017].

Noonan, J. (2017). We’re transformed by what we worship [online image]. Available at: https://unsplash.com/photos/QM_LE41VJJ4 [Accessed 3 December 2017].

Reese, H. (2016). Bias in machine learning and how to stop it. TechRepublic. Available at: https://www.techrepublic.com/article/bias-in-machine-learning-and-how-to-stop-it/ [Accessed 5 November 2017].

Pretz, K. (2017). Keeping Bias From Creeping Into Code. The Institute. Available at: http://theinstitute.ieee.org/ieee-roundup/blogs/blog/keeping-bias-from-creeping-into-code

Kolde, B. (2017). Info [online image]. Available at: https://unsplash.com/photos/lqZPleZ4ERA [Accessed 11 December 2017].

IEEE. (2017). The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. Available at: http://standards.ieee.org/develop/indconn/ec/autonomous_systems.html [Accessed 10 December 2017].