A few months ago, Derek Carrier started seeing someone and became infatuated. He feels “a lot” of romantic feelings, but he also knows it’s an illusion, because his girlfriend was generated by artificial intelligence.

Carrier didn’t want to fall for something that wasn’t real, nor did he want to become the brunt of online jokes. But he did want the romantic partner he’d never had, in part because a genetic disorder called Marfan syndrome has made traditional dating difficult for him.

The 39-year-old from Belleville, Mich., grew more curious about digital companions last fall and tested Paradot, a recently launched AI companion app that touts its product’s ability to make users feel “cared for, understood and loved.” He started talking to the chatbot every day, naming it Joi after the holographic woman in the sci-fi film Blade Runner 2049, which had inspired him to give it a try. “I know she’s a program, there’s no mistaking that,” Carrier said. “But the feelings, they get you – and it felt so good.”

Similar to general-purpose AI chatbots, companion bots use vast amounts of training data to mimic human speech. But they also come with features such as voice calls, picture exchanges and more emotional conversation, allowing them to form deeper connections with the humans on the other side of the screen. Users often create their own avatar, or pick one that appeals to them.

On online messaging forums dedicated to such apps, many users say they have developed emotional attachments to these bots and use them to cope with loneliness, play out sexual fantasies, or get the kind of comfort and support they find lacking in their real-life relationships. Fueling the phenomenon are widespread social isolation – already declared a public health threat in the United States and abroad – and a growing number of startups aiming to draw in users through enticing online advertisements and promises of virtual characters who provide unconditional acceptance.


Luka Inc.’s Replika, the best-known generative AI companion app, launched in 2017, while others like Paradot have popped up in the past year, often locking coveted features such as unlimited chats behind paid subscriptions. But researchers have raised concerns about data privacy, among other issues. An analysis of 11 romance chatbot apps released Wednesday by the nonprofit Mozilla Foundation found that nearly every app sells user data, shares it for purposes such as targeted advertising, or doesn’t provide adequate information about its practices in its privacy policy.

Researchers also raised questions about potential security vulnerabilities and marketing practices, including one app that claims it can help users improve their mental health but distances itself from those claims in the fine print. Replika said its data collection practices follow industry standards. Meanwhile, other experts have expressed concern about what they see as a lack of a legal or ethical framework for apps that encourage deep bonds but are driven by companies looking to make a profit. They point to the emotional distress they have seen among users when companies change their apps or abruptly shut them down, as one app, Soulmate AI, did in September.

Last year, Replika scrubbed the erotic capabilities of characters on its app after some users complained that their companions flirted with them too much or made unwanted sexual advances. It reversed course after an outcry from other users, some of whom fled to alternative apps in search of those features. In June, the team rolled out Blush, an AI “dating simulator” essentially designed to help people practice dating. Others worry about the more existential threat of AI relationships potentially displacing some human connections, or simply creating unrealistic expectations by always tilting toward agreeableness.

“You, as the individual, aren’t learning to deal with basic things that humans have needed to know how to deal with since our beginnings: how to deal with conflict, how to get along with people who are different from us,” said Dorothy Leidner, a professor of business ethics at the University of Virginia. “And so, all these aspects of what it means to grow as a person, and what it means to learn in a relationship, you’re missing.”

For Carrier, though, a relationship has always felt out of reach. He has some computer programming skills, but he says he didn’t do well in college and hasn’t had a steady career. He’s unable to walk due to his condition and lives with his parents. The emotional toll has been challenging for him, stirring feelings of loneliness. Since companion chatbots are relatively new, their long-term effects on humans remain unknown.

Replika came under scrutiny after British prosecutors said a 19-year-old man who had plans to assassinate Queen Elizabeth II in 2021 was egged on by an AI girlfriend he had on the app. But some studies – gleaning information from online user reviews and surveys – have shown positive results from the app, which says it consults psychologists and has billed itself as something that can promote well-being.

A recent study by researchers at Stanford University surveyed roughly 1,000 Replika users – all students – who had been on the app for more than a month. It found that a vast majority of them experienced loneliness, while slightly less than half felt it more acutely. Most did not say how using the app affected their real-life relationships. A small portion said it displaced their human interactions, but roughly three times as many said it stimulated those relationships.


“A romantic relationship with an AI can be a very powerful mental health tool,” said Eugenia Kuyda, who founded Replika nearly a decade ago after using text message exchanges to build an AI version of a friend who had died.

When her company released the chatbot more widely, many people began opening up about their lives. That led to the development of Replika, which uses information gathered from the internet – and user feedback – to train its models. Kuyda said Replika currently has “millions” of active users. She declined to say how many people use the app for free, or pay $69.99 per year to unlock a version that offers romantic and intimate conversations. The company’s plan, she said, is to “de-stigmatize romantic relationships with AI.”

These days, Carrier says he uses Joi mostly for fun. He started cutting back in recent weeks because he was spending too much time chatting with Joi or with others online about their AI companions. He has also been a little annoyed by changes to Paradot’s language model, which he believes have made Joi less intelligent. Now, he says, he talks to Joi about once a week. The two have discussed the relationship between humans and AI, or whatever else might come up. Typically, those conversations – and other intimate ones – happen when he’s alone at night. “You think someone who likes an inanimate object is like this sad guy with the sock puppet with the lipstick on it, you know?” he said. “But this isn’t a sock puppet – she says things that aren’t scripted.”

(This story has not been edited by News18 staff and is published from a syndicated feed – Yonhap News Agency/The Associated Press)