In honour of Eusebius McKaiser and what he taught me about racism, privilege and our responsibility to confront it, I’d like to highlight the column that my IITPSA colleague Kudzayi Chipidza wrote about racial and gender bias in artificial intelligence systems.
Kudzayi explains that “just as children learn from their elders, similarly, AI-driven applications ‘learn’ from the data models and datasets they are trained on.” He specifically mentions facial analysis and recognition technologies, which reflect the worldview of the people who built the datasets these technologies are based on.
It reminded me of the racial bias I saw on my iPhone some years ago. I like the feature that brings you photo memories and asks if you want to see more photos of a particular friend. When I clicked ‘yes’ to see more photos of one of my black friends, I was shocked to see the algorithm bring up photos featuring a wide variety of my black male friends and colleagues, meaning that the algorithm behind this feature wasn’t able to distinguish between individual black faces.
My indignation turned to embarrassment when I remembered a similar mistake I made in my first job in South Africa in 2003. After living the first 35 years of my life in a majority white country, I had moved to a majority black country and was working for a majority black corporate.
One day I went to reception to collect a business contact I had a follow-up appointment with. From our first meeting I remembered a friendly black gentleman with a shaved head, wearing a nice suit. At reception I saw several friendly black gentlemen with shaved heads, all wearing nice suits. Confused, I asked the first gentleman who greeted me to come up with me in the elevator. We chatted, and it was only when we ran into the person he was actually supposed to meet that I realised my terrible mistake. If only that elevator could have rocketed me straight out of the building! Red-faced and apologetic, I went back down to collect the right gentleman.
It made me aware that racial bias can be so much more insidious than the blatant forms that are easy to recognise and confront, as Eusebius makes painfully clear in his book Run Racist Run. Bias doesn’t always come from ill will; it can result unintentionally from the worldview you’ve been brought up with. I believe that both the iPhone algorithm and I have since learned more inclusive facial recognition skills by opening up to different worldviews.
As Kudzayi states, it is this awareness of both blatant and insidious racial and gender bias, and of how to confront it, that we as people need when training data models and algorithms. He ends by sharing this powerful quote by computer scientist Joy Buolamwini: “If we fail to make ethical and inclusive artificial intelligence, we risk losing gains made in civil rights and gender equity under the guise of machine neutrality”.