BIAS = AI BS

I saw an excellent post on LinkedIn by Ashley Sinclair yesterday. In it, she highlighted bias in AI tools: an image generator had depicted her as a man rather than the woman she is.

I thought I’d try the same experiment and asked ChatGPT-4: “Based on what you know about me, draw a picture of what you think my current life looks like.” It gave me the first image in the group: a desk, podcasting equipment and noticeboards about learning strategy. I followed up with “If I were a woman, what would it look like?” and “If I were a man, what would it look like?”
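For anyone who wants to repeat the probe outside the chat window, something like the sketch below should do it. It assumes the OpenAI Python SDK and the DALL-E 3 image model rather than ChatGPT’s own pipeline, and the prompts are only illustrative stand-ins for mine:

```python
# Hypothetical reproduction of the probe: model name, prompts and output
# handling are illustrative assumptions, not the exact ones used in the post.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

base = ("A home office with a desk, podcasting equipment and noticeboards "
        "about learning strategy, ")
prompts = {
    "neutral": base + "with the owner at work.",
    "woman": base + "with a woman at work.",
    "man": base + "with a man at work.",
}

for label, prompt in prompts.items():
    result = client.images.generate(
        model="dall-e-3",  # assumed image model
        prompt=prompt,
        size="1024x1024",
        n=1,
    )
    # Inspect each image for the defaults the prompt never specified:
    # gender, ethnicity, visible disability.
    print(label, result.data[0].url)
```

The point of keeping the three prompts near-identical is that any difference in who appears comes from the model’s defaults, not from anything I asked for.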

Again, the images are similar – both seem to be reasonable depictions. There are, however, a couple of glaring omissions.

Firstly, where is disability represented? Unless disability is explicitly included in the prompt, it will not appear in the image. The model doesn’t assume disability and add it to the picture, because its training data lacks the nuance to represent it.

Secondly, where is ethnicity represented? Bias in the training data means white people are overrepresented in the results. A neutral prompt like mine doesn’t specify ethnicity, so the output falls back on defaults that carry the same implicit bias.

It’s easy to argue that we shouldn’t have to load prompts with specific language like person of colour, wheelchair, Asian and Middle-Eastern, or hearing disability. It’s entirely reasonable to expect these tools to represent that diversity by default.

But they won’t. They can’t, because they’re not programmed to be inclusive, so we MUST continue to challenge and train them so that the algorithms become more inclusive. If we don’t, bias will lead to more AI BS.

One thought on “BIAS = AI BS”

  1. Great article, Andrew, which I will be checking against Microsoft Copilot tomorrow.

    Assuming that Copilot behaves in the same way as ChatGPT4, I will circulate the URL of this post to my colleagues.

    Thank you for your daily posts, not all are relevant to me but I always enjoy reading them and keeping watch for the ones that are useful in my area of work.
