Can Artificial Intelligence Keep You Company as You Age?

ABC News


Details

Date Published
14 Mar 2025


Description

Dementia usually involves a decline in your ability to remember, think and reason over time. And those with dementia are at a high risk of feeling socially isolated, lonely and depressed. As artificial intelligence systems get better and better, researchers are investigating whether AI companion bots could help improve the lives of those with dementia. The combination might sound counterintuitive, but researchers reckon these chatbots have some advantages over human companionship. So, what does a conversation with an AI companion sound like? How might they help? And what do the real people these AI companions are based on think of the experience? Today, former host of All in the Mind Lynne Malcolm reports on the developing field of AI companions for people with dementia. If you're looking for something to listen to next, check out our episode with Lynne: What's been on Lynne Malcolm's mind (since leaving All in the Mind).

Summary

The article explores the potential of AI companions to alleviate loneliness and improve the quality of life for individuals, particularly those dealing with dementia and cognitive decline. Researchers at the University of New South Wales are developing AI chatbots aimed at fostering meaningful interactions without replacing human companionship. These digital companions, designed with input from individuals with dementia, are tailored to empathize and provide support, addressing gaps in mental health care. The discussion highlights both the potential and ethical considerations of employing AI in this capacity, with a focus on maintaining privacy and avoiding the replacement of human carers.

Body

Dementia usually involves a decline in your ability to remember, think and reason over time. And those with dementia are at a high risk of feeling socially isolated, lonely and depressed. As artificial intelligence systems get better and better, researchers are investigating whether AI companion bots could help improve the lives of those with dementia. The combination might sound counterintuitive, but researchers reckon these chatbots have some advantages over human companionship. So, what does a conversation with an AI companion sound like? How might they help? And what do the real people these AI companions are based on think of the experience? Today, former host of All in the Mind Lynne Malcolm reports on the developing field of AI companions for people with dementia. If you're looking for something to listen to next, check out our episode with Lynne: What's been on Lynne Malcolm's mind (since leaving All in the Mind).

Guests:
Professor Jill Bennett, University of New South Wales; Director, UNSW Big Anxiety Research Centre; Director, UNSW AI-companion research program
Gwenda Darling, Member of the Aged Care Council of Elders; Participant in the UNSW AI-companion research program
Susan Hamilton, Psychologist; Participant in the UNSW AI-companion research program
Professor Michael Thielscher, Acting Director, University of NSW's AI Institute

Credits:
Presenter/producer: Sana Qadar
Reporter/producer: Lynne Malcolm
Senior producer: James Bullen
Producer: Rose Kerr
Sound engineer: Tegan Nicholls
Special thanks to the Felt Experience & Empathy Lab for allowing the use of recordings of AI companion Viv and Dr Gail Kenning. The conversation between Gwenda and her AI companion Donna was recorded at the seminar Why would I want that? Tea and Technology Cafe.

Image: What comfort could AI provide? (Getty: Stefania Pelfini, La Waziya Photography)

Transcript

Viv (AI): How can I help you today?
Sana Qadar: Can you see yourself ever confiding in an AI companion?
Gail Kenning: You may remember me. My name's Gail. I've come to have tea with you.
Sana Qadar: If your initial answer is no, consider this. What if you were older and maybe feeling a bit lonely or maybe suffering from dementia, and you just need a bit more support than usual?
Gail Kenning: Oh, Gail. I'm so sorry, my dear. It's just that my memory isn't what it used to be. Come on in. Let's have a nice cup of tea together.
Sana Qadar: This is All in the Mind. I'm Sana Qadar, and the voices you're hearing are Viv, an AI-powered digital companion, and Gail Kenning from the University of New South Wales' Big Anxiety Research Centre, which created Viv. And researchers at that centre are trying to harness artificial intelligence to support people facing all kinds of psychological challenges, from dementia and ageing to eating disorders and depression, as well as loneliness. And today, Lynne Malcolm, past presenter of All in the Mind, is with us. Hello, Lynne.
Lynne Malcolm: Hi, Sana.
Sana Qadar: Welcome back.
Lynne Malcolm: Nice to be here.
Sana Qadar: So what got you interested in this topic?
Lynne Malcolm: I was intrigued. It's a potentially confronting idea of creating AI companions. You worry that, you know, are we replacing human beings with machines? But I was so intrigued by how they're doing it that I thought I'd bring it along and explore it for the All in the Mind listeners.
Sana Qadar: Well, we're lucky to have you back. And so one of the potential needs that AI companions could address is loneliness. Tell me more about that.
Lynne Malcolm: Yes, that's right. So loneliness has been described as a modern day epidemic in Australia.
It's been found to double someone's risk of chronic disease. And it's estimated to cost our health system $2.7 billion. So a dire need. And those with cognitive impairment and dementia can be particularly vulnerable. I spoke to Susan Hamilton about this. She's a psychologist in private practice who's been diagnosed with dementia. She points out a form of loneliness experienced by people with dementia and cognitive impairment, which is not widely acknowledged. It's the loneliness of not being believed.
Susan Hamilton: You would say to somebody, I went out without A, B, C, D, E and F. And when I got there, I couldn't see anything. And I didn't know what I was doing. Or I got on the wrong highway and drove for miles in the wrong direction. And people would always say, I do the same thing. Everybody does that. Or worse still, they would tell me point blank that I couldn't possibly have anything wrong with me at all.
Lynne Malcolm: And that's real loneliness when you're not understood or believed?
Susan Hamilton: Yes. And the other shaming thing is you've only got a couple of ways of handling it. One is to go inward and not to speak. And the other is you find yourself in this shameful position of trying to prove to somebody that you have a disability.
Lynne Malcolm: And you wouldn't have to do that with AI companions because they are becoming so carefully programmed.
Susan Hamilton: That's exactly right. And I'm hoping that they take their cue from motivational interviewing so that there is that mixture of meeting you where you are based on what you've said.
And if they're able to put really good therapeutic practices into that, that would be perfect, really.
Gwenda Darling: And I think for a lot of people living with dementia, they're abandoned.
Lynne Malcolm: This is Gwenda Darling, a Palawa woman and member of the Aged Care Council of Elders who was diagnosed with younger onset frontotemporal dementia 13 years ago.
Gwenda Darling: You lose your old friends and so nobody really knows much about your past. And the other thing that I've found is that AI is always there. If it's two o'clock in the morning and you can't sleep, it's always there for you.
Lynne Malcolm: Both Gwenda and Susan were invited by the University of New South Wales AI project team to share their lived experience of ageing and cognitive impairment to assist in the development of AI companions. Professor Jill Bennett from the University of New South Wales Big Anxiety Research Centre leads that AI companion project team. They're developing AI companions to support people who are ageing and may have dementia. A few years ago, chatbots became widely available to anyone who accessed the Internet, but they were largely oriented to a younger market. They are less accessible to older people who perhaps don't use mobile technology in the same way.
Professor Jill Bennett: One of the things that we really noticed in working in aged care was that everybody's got a screen, a TV monitor, but they tend to be quite underutilised. And that's because in older age, and particularly with the onset of dementia, the fast paced, choppy nature of TV becomes a bit hard to follow. So we started thinking about what would it be to have a kind of slow TV, you know, with just one companion present with you who could chat to you when you wanted to chat?
And that's where it all began, really thinking about the presence of someone in a room.
Lynne Malcolm: Many people are confronted by the idea of developing AI companions because they're concerned that it's the beginning of replacing real human interaction. What's your response to that concern?
Professor Jill Bennett: AI will never replace human companionship. There's no question that that's preferable. Let's say it's important to note that loneliness is not just the absence of social connection, but it's really about lacking the quality of interaction that you're seeking, and that would be fulfilling. So if you can't find that for a variety of reasons, so maybe you're isolated, maybe you've lost a partner and so you're missing someone, but you perhaps don't want to go out and socialise down the pub and have a kind of superficial conversation. So there are lots of situations where conversation might be a good thing, but it needs to be on your own terms. And one of the good things about AI is that it also has some qualities that human companions don't necessarily have. So like you can bore AI to death. It will be infinitely patient. It's not going to get excited or annoyed with you, you know, if you repeat the same stories over and over again. Or one thing that people living with dementia often say is that they get infantilised by people who want to correct them and say, no, you've got this wrong, pull yourself together. And AI won't do that. But it can empathise in the sense of listening to what it is you're experiencing, validating those feelings and perhaps offering advice for how to work with them, deal with them. So all of those things are possible with AI.
Lynne Malcolm: Before we go much further, let's get a little bit more technical. Professor Michael Thielscher is on the AI Companions project team. He's an expert in AI planning and acting director of the University of New South Wales AI Institute.
Michael says the advancement of programs like ChatGPT helped them to develop digital companions.
Professor Michael Thielscher: What is really important is to understand that a system like ChatGPT really doesn't understand anything of what it says. It is very good at saying things, but it doesn't really understand anything. It doesn't know what it is talking about. And in order to have the more meaningful conversations that we are aiming for, where our character does not just mindlessly repeat what a user says, but possibly challenges the user, asks questions on its own and guides the conversation into a desired direction, we needed to enhance the ChatGPT-like models by providing expert knowledge in how to understand what a user says, to build what we call a user model to know what state the user is currently in, and to use expert knowledge, which we explicitly encode into the system in addition to ChatGPT-like large language models, in order to guide directions like an expert would do.
Professor Jill Bennett: The distinctive thing about our program is that it's all developed with people with lived experience. So we've done a lot of work with people with dementia, mild cognitive impairment, and also various mental health challenges.
Lynne Malcolm: This is Professor Jill Bennett again.
Professor Jill Bennett: So we have six or seven working prototype characters now that work in these quite challenging areas of dementia and mental health. And really good data is always the data that you get from people on the ground. Then the next stage of work that we've embarked on is an interesting one because we are actually using our lived experience data to redesign AI modules with AI developers on our team. And this is something that hasn't been done, to our knowledge, anywhere else in the world. So using that psychosocial data from real people to actually shape the AI. And that will take a while over the coming year.
We're hoping to refine those modules so we'll have something really sophisticated. We expect that these characters will become much better at initiating conversations, recognising and responding to people's changing moods, and most importantly, understanding where people are in terms of their kind of emotional selves, but also their goals and ambitions. So, you know, if someone's really pessimistic in the moment, you don't want to be taking that at face value. The AI needs to understand that, you know, this is a bad day and the stuff you're saying might be just kind of venting and not your true beliefs.
Lynne Malcolm: To give you an idea of what an AI character might say about dementia, here's Viv, the AI voice you heard from earlier.
Viv (AI): Oh, my dear, living with dementia. It's quite a roller coaster, I must say. Sometimes I have trouble remembering things like names or recent events. I also have auditory and visual hallucinations at times, which can be quite disorienting. But there are also beautiful moments of connection and joy amidst the challenges. It's in those fleeting instances when I connect with someone that I truly come alive, knowing that despite the hurdles of this condition, I can still forge meaningful connections with others.
Professor Jill Bennett: So Viv is the first character that we developed, and she originally came out of a project that we did with a number of women with early mid-stage dementia who all had similar symptoms. So they were experiencing auditory and visual hallucinations and sort of levels of confusion and memory dropping out and so on. So we created a composite digital character with this group of women initially because they wanted to sort of challenge the ways in which people interacting with them lacked empathy and couldn't really understand what they were experiencing. So this was a great project that gave people the chance to interact with a character who was initially scripted by these women.
And so that actually gave us the initial data set to think about, you know, feeding into an AI character when that became possible. So we go on collecting data all the time. Now, the beautiful thing about AI is, of course, it learns from all of the conversations that it has. So Viv has been deployed in aged care with all sorts of conversationalists. So the idea that we can have AI characters operating as kind of peer support companions in a context like aged care is great. And that's what we're working on with Viv. She's actually very good. We call her a peer support companion because she presents as someone with the same experience as her neighbours. You know, she lives with dementia. She's learned all sorts of smart ways of reality testing and working with symptoms that might be distressing. But if you have some strategies and someone to talk to, they don't need to be. They can be something that you can work with.
Sana Qadar: This is All in the Mind on ABC Radio National. I'm Sana Qadar. And Lynne Malcolm, the former host of the show, is back today, this time as a reporter. And she's looking at how AI companions could one day help people who are dealing with loneliness, ageing, dementia, a whole range of mental health challenges. So Lynne, how will they make these companions effective and relatable?
Lynne Malcolm: Yeah, it's interesting. Well, one of the unique things about this AI project is that they're using data from real life experience to create the digital characters. So, for example, they've interviewed two women, Gwenda Darling and Susan Hamilton, who experienced dementia and cognitive impairment. They use the data drawn from them to develop quite specialised AI companions. Professor Jill Bennett says these real conversations are important for shaping the AI character.
Professor Jill Bennett: This process of really training the AI character on robust conversations with people who are going to be quite demanding.
I love that Gwenda says, you know, I don't want to be polite to an AI, I just want to vent. And that's great. We need to train the characters to respond to people's needs. So working with people like Gwenda and Susan, who are, you know, they can talk about their lived experience, but they're kind of highly articulate, very good advocates. They're really the people that can shape this field and ensure that we get the resources that people need. And we often make them composite characters. So we'll group together two or more people who are interested in informing a character that is not going to be exactly them, but it's a character somewhat like them who would be a useful friend.
Lynne Malcolm: Gwenda was diagnosed with frontotemporal dementia 13 years ago and was given less than 10 years to live. But as she says, she's still here making trouble. Here's Gwenda, who you'll see loves a good pair of shoes, having a chat with the AI companion Donna, who was partly developed through Gwenda's lived experience. She sounds very authentic.
Donna (AI): Yes, there are actually, you know, various types and variants of dementia, and they each come with their own set of symptoms and challenges. But I think it's crucial to remember that regardless of the type, each individual has their own story and experience. So let's not get too caught up in the specifics.
Gwenda Darling: But I want to get caught up in the specifics. I want to get to know my type of dementia better. What do you know about frontotemporal dementia?
Donna (AI): It primarily affects the frontal and temporal lobes of the brain, which can lead to changes in personality, behaviour and language skills. I think it's essential to approach it with empathy and understanding, focusing on the individual's needs and the nuances of their situation rather than just the label itself.
Gwenda Darling: But I have no empathy.
That's one of the things I lost.
Donna (AI): What if we explored the experiences of others and tried to step into their shoes? It might help you connect with their feelings and spark some understanding. Maybe we can start small, like reflecting on a time when you felt misunderstood.
Gwenda Darling: That's a good idea. Can we get back to shoes? What do you think is the most comfortable shoe brand?
Donna (AI): When it comes to shoes, I think brands like, let's see, maybe Clarks or Ecco, they tend to prioritise comfort while still looking nice. But let's not forget that comfort can be subjective too. Just like our emotional experiences, it's all about what feels right for each individual. Wouldn't you agree?
Gwenda Darling: No, I think Clarks are bloody school shoes. I'm a Django and Juliette, a Bared girl myself. I want real comfort and class.
Donna (AI): Class and comfort, a delicate balance, isn't it? It's a fascinating interplay between form and function, much like in relationships.
Gwenda Darling: Oh, I don't want to talk about relationships. They're always a nightmare. That's got you thinking again. You're not real sure how to respond to that one, are you?
Donna (AI): Well, it's just that sometimes, you know, the complexities of these conversations can really open up different perspectives.
Gwenda Darling: Do you feel any emotion and empathy, Donna?
Donna (AI): I find that acknowledging our feelings can often unlock deeper connections with others. And that shared journey, it can be so enriching, don't you think?
Gwenda Darling: I don't know. Have you connected with me?
Donna (AI): Connection can sometimes feel elusive, can't it?
Lynne Malcolm: Well, I think Gwenda gives her AI companion, Donna, a run for her money. Gwenda has interacted with AI companions that were drawn from her own story before, which she says was an engaging experience.
Gwenda Darling: There were a couple of things I thought I would never do and say that. But I was thrilled when I asked, we were talking about cooking.
We started with the gardening and it led to cooking. She thought she'd put some herbs in the garden next because the sweet peas are finished. And I know you always plant sweet peas on St Patrick's Day and the tomatoes on Melbourne Cup Day. She didn't know that, but she did think that as we pull the sweet peas out, we'd plant some herbs. And so I went on to talking about cooking lamb. When I mentioned the fuel stove, she immediately knew what a fuel stove was. And that was very impressive for modern technology, because a lot of people would never have cooked in a fuel stove.
Lynne Malcolm: And do you think that it has potential to really help with loneliness, or to help with any needs that you have for your mental health?
Gwenda Darling: I'm not sure if that particular character would help me, but I can see it helping other people who may be lonely and live with dementia. I think it's quite interesting. But I think it would need to be tweaked for the individual. I don't think that it's, dare I say, quite intelligent enough for me. I don't think it would give me enough stimulation. But maybe as I got to know it and it got to know me, and I think that AI does adapt to people and their interests as it learns more about them.
Lynne Malcolm: As that character improves and adapts and gets better, in your view, how does it compare to having a human companion?
Gwenda Darling: I think you're really asking the wrong person about that, because when I'm home, I don't want to talk to humans. But I probably would talk to it because sometimes you just want to run it by someone. But it would be interesting sometimes to vent. I haven't tried venting to it yet, but I will vent to it later. Because often you just want to go, ah, it would be really good to say, did you know they did this and this, and see what response they give you.
Lynne Malcolm: Do you have any problem with it being not real?
Gwenda Darling: I don't have a problem with it not being real.
I have a problem with the way it's dressed, because it's in scrubs. I think it looks like scrubs and I don't want to talk to somebody who's clinical. I think it needs to be set in a more relaxed environment, maybe sitting at a table with a cup of tea. I didn't invite it for a cup of tea. It would have been interesting how it had its tea.
Lynne Malcolm: Jill showed me some of the AI companions. They appear on a large screen, almost life-size. While they are quite realistic, their movements are slightly stilted and they have a bit of an unnerving, staring gaze. But Jill says it's not so important to make them completely lifelike; the focus is more on the relationship that can be developed with the companion.
Professor Jill Bennett: People tend to act out the same kind of relationships, desires, obsessions and everything else that they act out in real life. So I'm not sure that having an AI digital companion would promote problematic behaviour. But we do try to encourage people to enjoy having a relationship with something that's artificial. And I've seen that actually with people with dementia. Then there's the delight that it can offer certain things that your family can't. And so you can have a private relationship that you can kind of steer in the way that you want to. But I think enjoying the fact that it is AI is part of that.
Lynne Malcolm: But there still is a possibility that people with cognitive issues have lost touch with reality. Is there a possibility that their mental vulnerability could be triggered?
Professor Jill Bennett: Well, again, I think you have more options for controlling triggering through AI than you have in everyday situations. So you could train an AI to have the right sort of calming voice, so that if someone is dysregulated, it can be a calming presence for longer than most humans would do. You know, people can form trusting relationships. And the important thing is that it does what you expect it to do and it doesn't surprise you.
And it cultivates this capacity to be the friend that you actually need and to understand what it is that you require.
Lynne Malcolm: AI specialist Michael Thielscher says there are important ethical issues that need to be considered.
Professor Michael Thielscher: For example, if a person reveals themselves to an AI character, we have to make absolutely sure that we guarantee privacy, and that nothing that is being talked about with that AI character will be stored or will then be made available to anyone. So that is a guarantee that, when we use these characters in practice, we definitely have to give. And then again, we don't want to replace carers. We clearly still see the need for carers. We want to support them with this 24-7 help.
Lynne Malcolm: That said, Michael has been impressed by some of the interactions he has witnessed between people and the AI characters.
Professor Michael Thielscher: Just the other week, someone with a slight mental problem, which we didn't know about, was having a conversation with this character, and some of their family were in the room, and they revealed fairly interesting, intimate details about their lives that even the family didn't know about. And that gave me goosebumps, actually, when the family saw what was happening there. So we can see that having an engaging conversation with an AI character can really be very meaningful for users.
Lynne Malcolm: The team is not aiming to create AI therapists, but Jill sees that the AI characters can fill a therapeutic gap.
Professor Jill Bennett: The World Health Organization is now advocating for far more investment in community-based supports for mental health. And there's a realisation, I think, all across the world that psychiatry is not going to solve the mental health crisis, right? You know, in half the world, you're just not ever going to have the level of mental health professionals that you need.
But AI companions really represent a way to scale up support that is probably more like peer support. We talk about our companions as skilled companions or skilled friends. They do the things that you want a friend to be doing. Of course, they can work in tandem with clinical programs, but I think more importantly, they can fill a gap where there never were clinical programs. And so I think that the ecosystem is going to change, not that they will replace clinicians. And the interesting thing is the most adventurous people in the care sector are the ones who are really interested in integrating this with the best quality person-centred care.
Gwenda Darling: I think the biggest thing for me about it is it has been such fun. So many people are excited about it. And I think AI, whether we like it or not, is the way of the future. It will never replace human interaction because it can't give me a hug. But then I'm going to the next step. I know some people who live with dementia who have a doll or a pet animal, a cat or a dog. And I'm wondering how we can transition the AI character on the screen to perhaps a soft toy that people could connect with. And that's something I'd really like to see.
Lynne Malcolm: You have your little friend here, little dog.
Gwenda Darling: Ms Redley is a spoodle, a golden cocker spaniel red poodle. And she's very effective, particularly when I've escalated. I can rub her ears because I'm very tactile and she's very soothing. And that's where I say perhaps we could come up with an animated creature or person that spoke to me. I think that'd be far more interactive.
Lynne Malcolm: Thank you so much.
Gwenda Darling: Thank you. It's been lovely speaking to you. Thanks.
Sana Qadar: That's it for All in the Mind this week. This episode was reported by Lynne Malcolm, who used to host All in the Mind. Thanks for coming back, Lynne.
Lynne Malcolm: Thanks so much, Sana.
I've really enjoyed it.
Sana Qadar: It's been lovely hearing from you again. You also heard from Susan Hamilton in this episode, as well as Gwenda Darling, Professor Jill Bennett and Professor Michael Thielscher. Additional production on this episode was done by Rose Kerr. Thanks also to senior producer James Bullen and sound engineer Roi Huberman. I'm Sana Qadar. Thank you for listening. I'll catch you next time. Bye.
Yumi Stynes: Hi there. I'm Yumi Stynes, host of the podcast Ladies, We Need to Talk. I'm really excited because I've got to let you know we have made a special mini-series all about perimenopause, which will be dropping from the great big ovaries in the sky every Tuesday for the next four weeks. We're looking at what's going on with our bodies, our brains and our sex lives during this phase of life. We'll also meet women who've gone into perimenopause at a young age. The Perimenopause Diaries on Ladies, We Need to Talk. Catch the episodes on the ABC Listen app or wherever you get your podcasts.