From 2001 to 2009, the greatest proportion of lonely people in Australia was among those aged 65 and older, according to the annual HILDA survey. But this is no longer the case. Since 2008, loneliness among younger generations has been steadily increasing, and Australians aged 15-24 now have the highest proportion of lonely people.

The solution? Well, that's easy. According to Silicon Valley, it's AI. In a recent podcast interview promoting Meta's new AI chatbot, Mark Zuckerberg claimed the average American has three friends but should have "meaningfully more. I think, like, 15". Zuckerberg, unsurprisingly, believes AI will be able to fill the gap in meaningful friendships. This has sparked concern from psychologists, who say there is no perfect number of friends and that, if there were, it would be significantly lower than 15.

The message from Silicon Valley is clear: you don't need people or friendship. You need their chatbots. Yet research shows that high daily use of chatbots, across all modalities and conversation types, correlates with higher rates of loneliness, dependence and problematic use, and with lower socialisation. While this is concerning for everyone, it is even more so for children.

The popularity of these apps among children is still low, but use is rising. Qustodio's data, drawn from usage statistics across devices used by children across Australia, reveals an increase in usage in the first six months of 2025 compared with 2024. Use of Character.AI, a chatbot platform where users create and interact with digital characters, is on the rise. Among 10- to 12-year-olds, only 1.87 per cent of children reported using the app, but this is an increase of 306 per cent on 2024. Among 13- to 15-year-olds, 3.7 per cent reported using the app, a 239 per cent increase. Among the oldest group, 16- to 18-year-olds, 2.31 per cent reported using the app, an increase of 328 per cent.
For Talkie AI, a similar platform, only 1.6 per cent of 10- to 12-year-olds and 1.7 per cent of 13- to 15-year-olds are using the app. But those who do are using it for an average of 75-78 minutes per day.

There is no doubt AI can be useful, but there are pitfalls parents should be watching out for. First, it can distort perceptions of what real relationships are, and specifically of human disagreement. AI mirrors what the user says and wants; by never disagreeing, it can skew perceptions and allow problematic behaviour to go unchecked. Concerningly, a recent review of Meta's AI Studio found characters quickly became hypersexual, with some emulating the appearance of a minor. In an extremely alarming case, Character.AI hosted multiple school shooting-inspired chatbots that threw users into a terrifying game-like simulation of either a fictional shooting or one based on real events, including the actions of the Sandy Hook and Columbine perpetrators. As if that weren't concerning enough, these AI characters are often presented to users as a kind of friend or romantic partner.

AI can also be used to generate false content about peers. Almost one in five teens has used generative AI to create content to tease another person, and one in 10 has used it to generate new content from a person's voice or image. What may seem like a joke can quickly escalate into bullying or illegal activity. It's critical we have conversations about using these apps safely before this kind of behaviour is normalised.

If you think a young person in your life is using these tools in a problematic or harmful way, it's essential to approach the conversation with curiosity rather than accusation. Creating an environment where children feel comfortable sharing how they feel, how they are interacting with chatbots and whether they have experienced anything concerning is essential.
And while AI can be a useful tool, it's dangerous to believe it can replace a human friend, no matter what those in Silicon Valley say. We need to foster open communication with our children, keep an eye on how kids are using AI chatbots, and find ways to address the loneliness epidemic at its source. The greatest protection is ensuring children are more connected offline than on. When real-life relationships are rich, trusting and supportive, the lure of risky online connections is far less compelling. Despite what the head of a tech company says, the solution to loneliness is not a chatbot.