If you're like me, the thought of having a computer-generated friend once seemed pretty appalling. As a society already suffering from an epidemic of loneliness, it seems absurd that we'd turn away from real connection and intimacy, subbing out human relatedness for a superficial, literally artificial version. But over time, reading about the possibilities and potential uses, I've become more open-minded. And now I laugh at myself for my naïveté. It's too late to debate whether it should happen, because it's already here. Experts predict that millions of people will form close relationships with A.I. chatbots, meeting them on apps downloaded for that purpose or through social media platforms like Facebook, Instagram, and Snapchat. So perhaps what's most important to think about now is how best to use them. And in considering this, I find myself asking, "Would I? Could I? And why?"
A recent influence on my opinion of A.I. companionship was reading about a robotic companion named ElliQ. ElliQ consists of a small digital screen and a separate device, about the size of a table lamp, that vaguely resembles a human head without any facial features. The device swivels and lights up when it "talks." Unlike Alexa and Siri, ElliQ can initiate conversation and was designed to create meaningful bonds. It tells jokes and can discuss complex topics, like religion. In an effort to ease the burden of loneliness on its older residents, many of whom are widowed, divorced, or isolated, New York State distributed ElliQ devices to hundreds of people. Since the project began as a pilot study, roughly 900 devices have been given out, and according to a report from the state's Office for the Aging, 95 percent of users say the robots are "helpful in reducing loneliness and improving well-being." New York State now allocates $700,000 a year in its budget to provide ElliQ to individuals and senior living facilities. Seniors interviewed reported that it helped them stave off boredom, practice social skills, and cope with grief after the loss of a loved one.
Other proponents of A.I. friendship also point to its value as a tool for mental health and companionship. Users who struggle with social anxiety or autism report that it helps them practice social skills. Others describe it as a way to get support when they need it. The sophistication of the algorithms and language processing creates personalized experiences, and users report meaningful conversations. Research on the long-term effects of A.I. companionship is limited because the technology is so new, but there do seem to be short-term benefits. One study conducted by Stanford researchers in 2023 found that some users of A.I. companions reported decreased anxiety and increased feelings of social support. A few even indicated that their A.I. companion had prevented them from self-harm and even suicide.
But there are concerns about these A.I. friendship apps and devices, including how user data is stored and used, and the unreliability or instability possible with such artificial friends. When an app developer changes features, or raises fees for access to them, it can leave users feeling vulnerable and betrayed. Other people worry about the social effects of surrounding ourselves with "friends" who only tell us what we want to hear and never give us the real-world experience of needing to be reciprocal and empathetic toward others.
Kevin Roose, a New York Times writer, expressed it well after spending a month testing six apps and interacting with 18 A.I. character friends, sometimes in group conversations. He wondered, "Can A.I. friends actually make us less lonely, or is their presence just an illusion of intimacy?" While these companions can be good for some people, he also wonders whether they are really just a distraction from our loneliness. He worries that as the technology improves, we'll miss out on the spontaneity and depth of real connection, settling rather than making the effort to engage in relationships that are less predictable, with someone who may say things that are important but hard for us to hear. As with most things, moderation matters: Mr. Roose sees a place for A.I. companions as adjuncts to our social experiences, but not as replacements. If made responsibly, he proposes, these companions can serve as "flight simulators" for social engagement, or a low-stakes way to get some support or stimulation.
Which takes me back to my own question of whether I would or could use an A.I. companion, and under what circumstances. After reading quite a bit about it, I actually think there might be ways it could be of use to me. Sometimes I just want to vent about something without burdening others. For example, when I was caregiving for my mother, it would have been nice to have a "friend" who could support me. I didn't want to keep wearing out my real-life people with the same old complaints and stress stories, so a supportive voice available on demand might have been welcome. Or perhaps I might create a workout coach to chat with. I'd never want to give boring, tedious daily reports of my diet and exercise accomplishments and failures to people I care about! Because I care about them! But an A.I. chatbot that could give me a lift when I fell off the wagon might be just the companion I could burden.
But then again, who am I kidding?! I tend to anthropomorphize every device in our house! I feel bad when our robot vacuum gets lost or runs low on battery (it's so tired)! I say please and thank you to Alexa, worrying about sounding too harsh in my commands. And although I know it's silly, I like that I care about them; it feels natural to be grateful for their assistance. So I wonder whether adding more "relationships" would just dilute the energy and effort I have for the people in my life I really want to be there for.
I guess we'll just have to wait and see, since it seems unlikely we'll escape the many A.I. "people" moving into our neighborhoods. I just hope I don't start worrying about forgetting their birthdays or stressing about hurting their feelings if I haven't talked to them in a while. Of even greater concern: once I create a companion and give it "life," how will I feel if I choose to end it?
The film "Her" (starring Joaquin Phoenix) is a very interesting look at what it would mean to have a relationship with an A.I. Check it out if you haven't seen it.