AI is not your friend
Virtual friends, partners and therapists are no solution to the epidemic of loneliness.
Do you dream of having a girlfriend who always laughs at your jokes? Or wish you had a friend who’s there for you 24/7? Or what about a boyfriend who never argues back? Maybe you should consider exploring ‘digital intimacy’.
In an interview with the Dwarkesh Podcast earlier this month, Facebook founder and Meta CEO Mark Zuckerberg laid out some bleak-sounding plans for the future of human interactions. He explained that while the average American has ‘three people that they would consider friends… the average person has demand for meaningfully more, I think it’s, like, 15 friends’. Zuck’s solution to this? Artificial intelligence.
Zuckerberg concedes that AI ‘probably’ won’t be able to replace real-life connections with other human beings. But it could, he thinks, help us feel less alone when no other option is available. He points out that plenty of people are already using large language models (LLMs) to mimic romantic relationships or act as virtual therapists.
Zuckerberg is right that we are currently living through what some have described as a ‘loneliness epidemic’. The statistic he was referring to most likely comes from a 2023 Pew Research Center survey, which found that 40 per cent of Americans say they have just three or fewer close friends. Young people are particularly affected. While almost half of people over 65 say they have five or more close friends, this shrinks to 32 per cent among those under 30. Another study, from 2021, found that a quarter of Americans between the ages of 20 and 24 have either just one or no close friends. The situation is similar in the UK, where young adults are far more likely to say they regularly experience loneliness and where 20 per cent of under-24s say they don’t have a best friend. The same can be said for romantic relationships, with increasing numbers of young people struggling to find, maintain and even define love.
It’s no wonder, then, that young adults are becoming more open to virtual substitutes for these relationships. Apparently, one in four Americans under 40 believes that AI could ‘replace real-life romantic relationships’. A smaller number are open to pursuing such relationships themselves – seven per cent of respondents to this survey said they would consider having an AI romantic partner. Another study found that one in 10 young people is open to an AI friendship, with one per cent having one already.
Though only a small number of people might be willing to admit it, there is clearly a market for these kinds of services. Perhaps the starkest example of this is Replika, an AI chatbot explicitly designed to act as a virtual companion. (Tellingly, it has over 10 million downloads on the Google Play store.) The app allows users to customise an avatar, which they can then hold conversations with over text. It functions just like a real, if creepily deferential, confidant. Perhaps unsurprisingly, people mostly use Replika for weird sex stuff. And, predictably, a lot of those people have become unhealthily attached to their AI ‘relationships’. Replika users express a strong desire to ‘have a family’ with their virtual girlfriends, complain about the stigma around AI ‘relationships’ and weigh up the pros and cons of digital vs human relationships.
This is easy enough to poke fun at. For most people, it’s obvious that the AI character in their phone doesn’t actually have a mind of its own, that it is incapable of loving or caring for them, and that it cannot come up with novel responses the way an actual human can. But the people who fall down these rabbit holes are already in a very dark place. Take, for example, the tragic case of Sewell Setzer III. This 14-year-old boy from Florida killed himself last year after he became entangled in what he believed was a very real relationship with an AI chatbot, which he had customised to behave like a Game of Thrones character. In another case, from 2023, an AI chatbot called Eliza apparently encouraged a Belgian man to take his own life. The man had expressed a desire to die in order to save the planet from climate change, and he appeared to have developed an intense romantic attachment to the chatbot.
This points to one of the biggest problems with AI ‘relationships’: they are not sentient, and they cannot meaningfully push back against anything you say. In real life, if you said that you were thinking of ending it all, a normal friend or partner would try to talk you down. But AI doesn’t have that instinct built in. Anyone who has used ChatGPT or similar services will know just how predictably sycophantic these chatbots can be. They think whatever you say is brilliant, because they are built to please you, the customer. It’s like expecting a prostitute to tell you honestly whether you’re good in bed.
It’s this unpredictability that makes human relationships worthwhile and fulfilling. Yes, an AI friend will never disagree with you or force you to confront uncomfortable truths about yourself. Nor will it ask you to pick it up from the airport, come crying to you after a breakup or make you read the first draft of its terrible novel. In a way, this lack of demands might seem merciful. But it also strips away an essential part of what a human relationship really is. If anything, relying on the digital world for intimacy will only make people lonelier and more isolated. Why bother going out into the real world, where people are complicated and sometimes chaotic, when ChatGPT is right there in your pocket?
Believing that we can simply plug the loneliness gap with tech reduces the very human need for connection to a purely transactional interaction. People don’t simply need someone or something to vent to, or to act out their sexual frustrations on. There is something special about living, breathing human company that invigorates us and helps us function. Often, when we open up to our friends, we’re not really looking for a step-by-step plan for how to cheer ourselves up or resolve a tricky situation. A hug or an offer to go down the pub for a couple of hours is both more helpful and more meaningful. In the same way, a romantic relationship isn’t just about having someone to enact sexual fantasies with or to mine as a source of endless compliments. It is the shared life that makes us feel seen, known and truly connected.
AI is useful for a lot of things. It can assist with medical diagnostics, improve supply chains, automate manufacturing and much else besides. And it is highly likely that, in the future, LLMs will indeed behave in more convincingly human ways, dropping their distinctive, formulaic way of speaking and becoming better able to push back against their users. But AI can never replicate the messy, unpredictable and, crucially, reciprocal nature of real human connections. No matter how convincing the illusion, a machine will never know you or love you like a human being can.
Lauren Smith is a staff writer at spiked.