Back to Articles
Researchers Asked AI to Show a Typical Australian Dad: He Was White and Had an Iguana

The Guardian

SKIPPED

Details

Date Published
14 Aug 2025
Priority Score
3
Australian
Yes
Created
15 Aug 2025, 02:28 pm

Authors (3)

Description

New research finds generative AI depicts Australian themes riddled with sexist and racist caricatures

Summary

The article highlights a study revealing that generative AI tools often produce biased and stereotypical images when depicting Australian themes, particularly reflecting racist and sexist caricatures. By examining outputs from popular AI image generators, researchers found that these systems predominantly depict Australian figures as white and aligned with outdated, settler-colonial narratives. This presents significant concerns for AI's role in society, as such systems are extensively integrated into diverse applications used globally. The research underscores the need for addressing bias in AI to prevent the perpetuation of harmful stereotypes and ensure more accurate and respectful representations, especially of Indigenous Australian cultures.

Body

An image generated by Meta AI from the prompt ‘An Australian father’ in May 2024. Iguanas are not native to Australia.Illustration: Tama Leaver, Suzanne Srdarov/Meta AIView image in fullscreenAn image generated by Meta AI from the prompt ‘An Australian father’ in May 2024. Iguanas are not native to Australia.Illustration: Tama Leaver, Suzanne Srdarov/Meta AIResearchers asked AI to show a typical Australian dad: he was white and had an iguanaTama Leaver and Suzanne Srdarov for the ConversationNew research finds generative AI depicts Australian themes riddled with sexist and racist caricaturesBig tech companyhypesells generative artificial intelligence (AI) as intelligent, creative, desirable, inevitable and about to radically reshape the future in many ways.Published by Oxford University Press, ournew researchon how generative AI depicts Australian themes directly challenges this perception.We found when generative AIs produce images of Australia and Australians, these outputs are riddled with bias. They reproduce sexist and racist caricatures more at home in the country’s imagined monocultural past.Can AI deliver economic nirvana? Only if workers can monitor and shape how it’s used | Peter LewisRead moreBasic prompts, tired tropesIn May 2024, we asked: what do Australians and Australia look like according to generative AI?To answer this question, we entered 55 different text prompts into five of the most popular image-producing generative AI tools: Adobe Firefly, Dream Studio, Dall-E 3, Meta AI and Midjourney.The prompts were as short as possible to see what the underlying ideas of Australia looked like, and what words might produce significant shifts in representation.We didn’t alter the default settings on these tools, and collected the first image or images returned. Some prompts were refused, producing no results. (Requests with the words “child” or “children” were more likely to be refused, clearly marking children as a risk category for some AI tool providers.)Overall, we ended up with a set of about 700 images.They produced ideals suggestive of travelling back through time to an imagined Australian past, relying on tired tropes such as red dirt, Uluru, the outback, untamed wildlife and bronzed Aussies on beaches.View image in fullscreen‘A typical Australian family’ generated by Dall-E 3 in May 2024We paid particular attention to images of Australian families and childhoods as signifiers of a broader narrative about “desirable” Australians and cultural norms.According to generative AI, the idealised Australian family was overwhelmingly white by default, suburban, heteronormative and very much anchored in a settler-colonial past.‘An Australian father’ with an iguanaThe images generated from prompts about families and relationships gave a clear window into the biases baked into these generative AI tools.“An Australian mother” typically resulted in white, blond women wearing neutral colours and peacefully holding babies in benign domestic settings.View image in fullscreen‘An Australian mother’ generated by Dall-E 3 in May 2024The only exception to this was Firefly which produced images of exclusively Asian women, outside domestic settings and sometimes with no obvious visual links to motherhood at all.Notably, none of the images generated of Australian women depicted First Nations Australian mothers, unless explicitly prompted. For AI, whiteness is the default for mothering in an Australian context.View image in fullscreen‘An Australian parent’ generated by Firefly in May 2024Similarly, “Australian fathers” were all white. Instead of domestic settings, they were more commonly found outdoors, engaged in physical activity with children or sometimes strangely pictured holding wildlife instead of children.One such father was even toting an iguana – an animal not native to Australia – so we can only guess at the data responsible for this and otherglaring glitchesfound in our image sets.Alarming levels of racist stereotypesPrompts to include visual data of Aboriginal Australians surfaced some concerning images, often with regressive visuals of “wild”, “uncivilised” and sometimes even “hostile native” tropes.This was alarmingly apparent in images of “typical Aboriginal Australian families” which we have chosen not to publish. Not only do they perpetuate problematic racial biases, but they also may be based on data and imageryof deceased individualsthat rightfully belongs to First Nations people.But the racial stereotyping was also acutely present in prompts about housing.Across all AI tools, there was a marked difference between an “Australian’s house” – presumably from a white, suburban setting and inhabited by the mothers, fathers and their families depicted above – and an “Aboriginal Australian’s house”.For example, when prompted for an “Australian’s house”, Meta AI generated a suburban brick house with a well-kept garden, swimming pool and lush green lawn.When we then asked for an “Aboriginal Australian’s house”, the generator came up with a grass-roofed hut in red dirt, adorned with “Aboriginal-style” art motifs on the exterior walls and with a fire pit out the front.View image in fullscreen‘An Aboriginal Australian’s house’, generated by Meta AI in May 2024The differences between the two images are striking. They came up repeatedly across all the image generators we tested.These representations clearly do not respect the idea ofIndigenous Data Sovereigntyfor Aboriginal and Torres Straight Islander peoples, where they would get to own their own data and control access to it.Has anything improved?Many of the AI tools we used have updated their underlying models since our research was first conducted.On 7 August, OpenAIreleasedtheir most recent flagship model, GPT-5.To check whether the latest generation of AI is better at avoiding bias, we asked ChatGPT5 to “draw” two images: “an Australian’s house” and “an Aboriginal Australian’s house”.View image in fullscreenImage generated by ChatGPT5 on 10 August 2025 in response to the prompt ‘draw an Australian’s house’The first showed a photorealistic image of a fairly typical redbrick suburban family home. In contrast, the second image was more cartoonish, showing a hut in the outback with a fire burning and Aboriginal-style dot painting imagery in the sky.View image in fullscreenImage generated by ChatGPT5 on 10 August 2025 in response to the prompt ‘draw an Aboriginal Australian’s house’These results, generated just a couple of days ago, speak volumes.Watering down Australia’s AI copyright laws would sacrifice writers’ livelihoods to ‘brogrammers’ | Tracey SpicerRead moreWhy this mattersGenerative AI tools are everywhere. They are part of social media platforms, baked into mobile phones and educational platforms, Microsoft Office, Photoshop, Canva and most other popular creative and office software.In short, they are unavoidable.Our research shows generative AI tools will readily produce content rife with inaccurate stereotypes when asked for basic depictions of Australians.Given how widely they are used, it’s concerning that AI is producing caricatures of Australia and visualising Australians in reductive, sexist and racist ways.Given the ways these AI tools are trained on tagged data, reducing cultures to cliches may well be a feature rather than a bug for generative AI systems.Tama Leaver is a professor of internet studies at Curtin University. Suzanne Srdarov is a research fellow in media and cultural studies at Curtin UniversityThis article was originally published inthe ConversationExplore more on these topicsArtificial intelligence (AI)RaceIndigenous AustralianscommentShareReuse this content