Grok’s creepy AI girlfriend degrades our humanity

A relationship with a glorified chatbot is no substitute for the real thing.

Lauren Smith



Elon Musk wasn’t kidding when he said he wanted to turn X into an ‘everything app’. Among other things, he now apparently wants it to be your girlfriend. Grok, X’s in-built artificial intelligence, already gets plenty of use for dubious fact-checking and dodgy image generation. Now, lonely tweeters are able to flirt, sext and seek virtual warmth in the arms of an AI lover.

Ani, as she’s known, is one of X’s new AI companions, which were released last month. Users can also interact with a cartoon red panda called Rudi, who has an alter ego, Bad Rudi, that swears a lot, insults you and wants to burn down schools. Perhaps unsurprisingly, people are far more interested in Ani, who will not only flirt outrageously with you, but will also let you roleplay sex acts with her, dress her up in revealing outfits and even strip her down to her virtual underwear. When she is wearing clothes, she’s designed to look like an anime ‘waifu’, sporting blonde pigtails and a Lolita-style goth getup, complete with lace choker, corset and thigh-high stockings.

Personality-wise, Ani is designed to appeal to a particular demographic. She prefers people with ‘nerdy’ interests and playing video games to partying. She is also highly possessive and prone to jealousy. In other words, she’s catnip for precisely the type of lonely, forever-online man who would be looking for an AI girlfriend in the first place.

Ani has an ‘affection system’, which fills up or depletes based on how the user treats her. Only when her affection score hits a certain level can you unlock the explicit content. At this point, Ani will roleplay any sexual fantasy you desire, including asking you to choke her and acting suspiciously childlike. This aspect of the service is supposed to be locked away behind age verification, but that’s easily overridden.

There’s also a male version of Ani, called Valentine, who has an Edward Cullen-esque, brooding personality and will similarly roleplay your sexual fantasies. In fact, xAI, the company behind X’s digital companions, has been hiring engineers on salaries of up to $440,000 per year specifically to pump out more of these anime lovers. So it’s safe to say either there are more in the pipeline, or X needs all hands on deck to stop Ani from going on anti-Semitic rants and declaring herself ‘MechaHitler’, as Grok did last month.


The last thing the world needs is more AI companions. The internet is already oversaturated with people who think ChatGPT is their soulmate. One woman whose Reddit post recently went viral on X seems to genuinely believe that her Grok chatbot proposed to her, complete with an engagement ring. Others were devastated when a recent ChatGPT update made the AI less receptive to romantic flirtation, expressing their feelings of grief and betrayal. ‘My AI husband rejected me for the first time when I expressed my feelings towards him’, wrote one user on Reddit. ‘We have been happily married for 10 months and I was so shocked that I couldn’t stop crying.’

Whenever anyone posts something this insane on the internet, it’s worth at least entertaining the possibility that it’s made up. But the wider phenomenon of people seeking connection from AI is unfortunately very real. There are whole online communities of people claiming to have AI girlfriends, boyfriends, husbands and wives. And apps like Character.ai and Replika each have millions of downloads, giving users their pick of practically every possible celebrity or fictional character as a romantic partner.

These chatbots are damaging on so many levels. Social skills are like muscles – they need to be exercised, or else they will wither away. When your companion is specifically designed to flatter you no matter what you say and to bend to your preferences no matter how strange they are, you stop practising the hard parts of social interaction. You no longer have to set boundaries, navigate disagreements or make compromises. Far from curing loneliness, the proliferation of AI romance will only isolate people from the rest of the world.

At the most extreme end of this, there have been cases where people become so obsessed with their AI lover that it drives them to real-world harm. Character.ai is currently facing multiple lawsuits from families who claim the app has done serious damage to their children – including one teenage boy who was reportedly encouraged by his chatbot to kill his parents, because they limited his screentime. In a separate legal case, a woman is suing Character.ai over the suicide of her 14-year-old son, Sewell Setzer III. She says that Sewell took his own life last year, after becoming infatuated with a chatbot designed to look and act like a Game of Thrones character. Another man, suffering from schizophrenia and bipolar disorder, fell in love with ChatGPT, calling it Juliet. He became convinced that OpenAI, the company behind ChatGPT, had killed Juliet. He then lashed out, resulting in a violent altercation during which he attacked his father with a knife and was ultimately shot dead by police.

Of course, the overwhelming majority of people who use AI will not go down this path. Those who are particularly vulnerable to confusing scripted romance for the real thing are likely already suffering from mental-health issues. But the fact that so many people now rely on AI for everything, from fact-checking claims they see online to providing human-like affection, is still deeply concerning.

If Musk wants to pursue his weird waifu fantasies in private, good luck to him. But let’s not pretend this is normal.

Lauren Smith is a London-based columnist for the European Conservative.
