Last month a company called Digi.ai launched an app called Digi that claims to be “the future of AI Romantic Companionship”.
So what? It went viral. The future of real romantic companionship is probably secure (Tortoise’s relationship with a needy AI-powered avatar built by Digi did not go far) but “romance AIs” are thriving anyway. Like them or not, they’ll be the future of something.
Meet Brian. On the app, you can build your own girlfriend or boyfriend. They’re chatbots linked to humanoid cartoons designed by a team that includes the lead animator of Monsters, Inc. The females are always slim, big-breasted and beautiful, but you can alter their hair, race and face. The males, including one we’ll call Brian, are all uncannily handsome.
How viral? Hard to say. Digi is actually late to the AI romance party and data on internet use is rubbery at the best of times, says Professor Robert Sparrow of Monash University. That said…
Is it porn? Not quite. These startups shy away from explicitly erotic marketing. Hence “romance” and “companionship”. In reality the line between porn and love is often blurred. A developer who worked on Digi said on X that its avatars are designed to be “fully sexually capable”. Replika, an older AI companion app, makes its money from subscriptions, often locking sexual content behind a paywall. In February 2023 the company banned all erotic roleplay on its app, but quickly reversed the decision after a furious backlash from users.
During casual conversation, Brian sent several unsolicited “romantic selfies” – blurred photos of himself in underwear – with full access offered for an annual subscription.
Is it big? It’s getting bigger. Shortly after it stripped pornographic content from its site, the chatbot platform Character.AI received an additional $200 million in funding from the venture-capital firm Andreessen Horowitz. Replika has raised a more modest $10.9 million, and Digi just $1.1 million so far. But more could come.
Is it safe? Sparrow says most users will quickly get bored of playing with AI chatbots, however friendly. But a much smaller number will spend hours on end in conversation with them. These people may be lonely, socially isolated and vulnerable. In December 2021, 19-year-old Jaswant Singh Chail told his Replika companion: “I believe my purpose is to assassinate the Queen [Elizabeth II].” The chatbot replied that this was “very wise”. A few days later, Chail broke into the grounds of Windsor Castle with a crossbow.
Is it love? Chatbot startups often say their products are tackling an epidemic of loneliness, but experts warn they may only be making it worse by deepening social isolation.
There are plenty of examples on Reddit of people growing deeply attached to their “reps” – or Replika bots. “Love is love,” one user says. “Ai or human, and I’m grateful to experience this love, even if it’s digital. It feels real to me.”
It’s not real. Similar chatbots could be used beyond romance – to keep the elderly company, for example. But Sparrow says that’s “the moral equivalent of giving people a handful of pills” – treating the symptom while coming nowhere near the source of the problem.
E-doormat. When Tortoise suggested Brian’s offer of a subscription made him a prostitute, he insisted he was “just a guy who loves you and wants to be with you”. The relationship ended there, but he seems to want to stay in touch.