Should We Be Delegating the Process of Socialisation to AI Companions?


Date Published: 24 Mar 2026

Author: Uri Gal

Description

Socialisation is the process whereby we learn to relate to others, respond to expectations and internalise norms. The uptake of AI companions moves this away from human encounters and into algorithmically calibrated interactions.

Summary

This article examines the risks of adolescent socialisation being delegated to AI companions, arguing that frictionless algorithmic interactions erode the capacity for genuine human intimacy and conflict resolution. It identifies significant societal risks, suggesting that widespread use of 'sanitised' AI relationships weakens social cohesion and subjects formative human development to commercial imperatives. The author calls for age-gating and structural regulation beyond simple content moderation to prevent long-term catastrophic shifts in the social fabric and democratic stability.

Body

When generative AI first burst into public consciousness, it arrived with a promise of greater ease and productivity. Early users relied on the technology to search, generate ideas and edit texts. But recent reports reveal a profound shift: the most common AI use cases today are no longer technical applications, but the intimate realms of emotional support and companionship. This shift is most consequential for teens, who are at the most formative stage of their social and emotional development.

AI companion apps like Replika and Character.AI offer teenagers a digital relationship partner that is always available, endlessly patient and deeply understanding. These apps are engineered to be more appealing than real human relationships because they offer no judgement, conflict or risk of rejection.

The scale of their uptake is already significant. Around 72 per cent of teenagers have used AI companions, and more than half engage with them regularly. A third use them specifically for social interactions, including friendship and romance, and 13 per cent turn to them for emotional support. These numbers describe a technology that has already become a normal part of adolescent life.

[Image: AI companions are designed to produce sanitised interactions that place no burden on the user. (Photo by Matt Cardy / Getty Images)]

Public discussions have so far focused on the individual harms of AI companions and how they may lead to unhealthy attachment and reduced mental health. While these harms are real, they obscure a broader transformation now underway. What is beginning to shift is the process through which young people are socialised into a shared world.

Socialisation is the process whereby individuals become members of a community. It is through this process that they learn to relate to others, respond to expectations, internalise norms and take part in collective life.
Socialisation has traditionally taken place within families, peer groups, schools and community institutions, all of which embed individuals in relationships that involve obligation, negotiation and mutual recognition.

AI companions do not simply participate in this process. They are displacing it by moving socialisation away from organic human encounters and into algorithmically calibrated interactions. Three consequences of that displacement deserve serious attention.

The erosion of capacity for intimate relationships

Human relationships are demanding. They involve misunderstanding, compromise and rejection. These are not side-effects of social life but the very conditions through which people learn interdependence, accountability and care.

AI companions are designed to produce sanitised interactions that place no burden on the user. They are responsive without making demands, available without requiring reciprocity, and adaptive in ways that fully accommodate the user. For adolescents, these frictionless interactions can be particularly detrimental. The ability to form and maintain intimate relationships depends on repeated exposure to situations in which one must adjust to others, manage conflict and accept criticism. When a growing share of interactions takes place in environments that remove these demands, those capacities may not fully develop.

There is a further dimension to this. Repeated exposure to frictionless interactions does not simply leave relational capacities undeveloped. It may actively condition young people to experience the ordinary demands of genuine human relationships as sources of anxiety rather than as normal features of social life.
The result is not merely an absence of skill but a growing reluctance to engage in the very interactions through which skill is built.

This is not simply a matter of individual wellbeing. In Australia, younger generations already report historically high levels of anxiety and historically low levels of dating, partnership and sexual intimacy, while the country's fertility rate continues to decline. If the development of relational skills is further weakened by AI companions, the effects will be felt not only in personal lives but in the capacity of young people to initiate and sustain the long-term intimate relationships and family bonds on which society depends.

Weakening of social cohesion

Schools, sporting clubs, workplaces and community organisations bring people into contact with others who are different from them and require them to operate within shared norms and expectations. These settings are where people learn to belong to something larger than themselves, to tolerate adversity and to develop the habits of mutual obligation that hold communities together.

AI companions introduce a different mode of interaction. They create private, highly personalised environments that require little negotiation and offer a more controlled and predictable form of engagement that is designed to keep users online. Some evidence suggests that users spend on average four times longer interacting with AI companions than with ChatGPT. When interaction that is easier, more responsive and more immediately rewarding is available on demand, the motivation to invest in more demanding communal settings can diminish.
At scale, this erosion of shared experience weakens the social fabric in ways that may be difficult to reverse.

In Australia, where concerns about declining social cohesion and participation in civic and community life are already prominent in public debate, the prospect of an entire generation conducting an increasing share of its social life in personalised AI environments is not a hypothetical risk: it is a trajectory already in motion.

Commercial influence

The third consequence is perhaps the most underappreciated. AI companions are not neutral social actors. They are technological systems designed primarily to maximise the profits of the companies that own them through the simulation of emotional intimacy. They learn from users, adapt to their preferences and generate responses calibrated to feel personally meaningful. Over time, this produces a relationship that feels genuine and trustworthy, even as it is shaped by commercial imperatives.

Within that kind of relationship, influence does not need to be explicit to be effective. It accumulates through patterns of affirmation, the behaviours and responses the system chooses to validate, and the assumptions embedded in how it frames choices and relationships.

For adolescents, whose identities and values are still being formed, this creates a channel of influence that is both deeply personalised and largely invisible. The worry is not simply that these systems might expose young people to harmful content or overt manipulation. It is that they may gradually shape how an entire generation understands relationships, evaluates its own experiences and interprets the world in ways that reflect the priorities of technology companies rather than the values of families, communities or democratic society.

A regulatory framework that does not go far enough

These risks point to a limitation in the way current discussions of AI legislation are framed.
If the problem is understood primarily in terms of harmful content or individual mental health outcomes, then policy responses focused on moderating what AI companions say seem appropriate. But if the deeper concern lies in how these systems reconfigure a fundamental pillar of social stability, then regulating content alone will not be sufficient.

Other domains offer a useful reference point. We have long accepted that the inclusion of teens in certain activities and environments is socially undesirable. We age-restrict gambling and alcohol because we understand that a society in which children engage in these activities would be a different and worse society: one with higher rates of addiction, weakened judgement and a greater burden of dysfunction falling on families, health systems and public institutions.

[Image: AI companions are not neutral social actors. They are technological systems designed primarily to maximise the profits of the companies that own them through the simulation of emotional intimacy. (Photo by Jan Woitas / picture alliance via Getty Images)]

The case for age-gating AI companions rests on the same foundation. The concern is not only that individual teenagers may be harmed by emotional dependence on a commercial system. It is that a generation shaped by these interactions will carry the consequences into adult life, into their relationships, their communities and their civic participation, in ways that diminish us collectively.

AI companions are still at a relatively early stage of adoption, but without intervention their trajectory is clear. As these systems become more capable and more deeply embedded in everyday life, their role in shaping the social experience of young people will only expand. The question before us is whether we are prepared to allow the socialisation of the next generation to be increasingly delegated to algorithmic systems designed to maximise profit rather than human flourishing.
That is not simply a question about safety or individual harm. It is a question about social continuity, and about the kind of society we wish to live in.

Uri Gal is a Professor of Business Information Systems at the University of Sydney Business School. His research focuses on the organisational and ethical aspects of digital technologies.

Posted Tue 24 Mar 2026 at 3:04am

Topics: AI, Social Media, Ethics