ABC News
Details
- Date Published
- 27 Jan 2026
- Priority Score
- 2
- Australian
- Yes
- Created
- 28 Jan 2026, 02:30 am
Authors (1)
- Max Rowley
Description
When you speak to an AI assistant, what voice talks back? Whether it's Siri, Alexa or Google Assistant, many of the default voices are passive, polite and female. But does that really matter? Do we treat a voice differently if it sounds male or female? And what does it say about us if we bark commands at an AI assistant or get frustrated with it?
Summary
The program explores the impact of gendered voices in AI assistants, asking whether the predominantly female default voices of digital assistants like Siri and Alexa shape how users interact with them. By examining the social behaviours and biases people project onto AI assistants, the discussion highlights how gender roles are perceived and reproduced in technology. The program does not directly address catastrophic AI risks or governance, but it contributes to broader conversations on AI ethics and the design choices that define human-technology interaction. Its focus on societal impact adds value to discussions of bias in AI, though it makes no direct connection to governance or safety concerns around existential AI threats.