Former Google CEO Eric Schmidt has warned that advancing generative AI technologies, such as AI boyfriend and girlfriend companion apps, combined with societal factors like loneliness, could increase the risk of radicalization, particularly among young men.
Schmidt shared his concerns while appearing on NYU Stern Professor Scott Galloway’s The Prof G Show podcast last week.
Schmidt explained that many young men today feel increasingly hopeless because they see fewer pathways to success than women, who are now more educated and make up a larger share of college graduates.
A recent study by the Pew Research Center found that 47% of U.S. women ages 25 to 34 have a bachelor’s degree, compared with 37% of men. In response, Schmidt said men turn to the online world and AI companion apps to ease their despair.
“So now imagine that the AI girlfriend or boyfriend is perfect—perfect visually, perfect emotionally,” Schmidt said. “The AI girlfriend, in this case, captures your mind as a man to the point where she, or whatever it is, takes over the way you’re thinking. You’re obsessed with her. That kind of obsession is possible, especially with people who are not fully formed.”
Schmidt cautioned that while AI offers significant opportunities, its risks for younger, impressionable users should be addressed.
“Parents are going to have to be more involved for all the obvious reasons, but, at the end of the day, parents can only control what their sons and daughters are doing within reason,” Schmidt said.
“We have all sorts of rules about the age of maturity: 16, 18, 21 in some cases. Yet, you put a 12- or 13-year-old in front of one of these things, and they have access to every evil as well as every good in the world, and they’re not ready to take it.”
A cold connection
A growing subset of the generative AI industry, AI companions are designed to simulate human interaction. But unlike general-purpose chatbots such as ChatGPT, Claude, or Gemini, AI companion apps are built to mimic relationships.
Developers market them as judgment-free, supportive tools that offer connection and relief from loneliness or anxiety. Popular AI companion platforms include Character AI, MyGirl, CarynAI, and Replika AI.
“It’s about connection, feeling better over time,” Replika CEO Eugenia Kuyda previously told Decrypt. “Some people need a little more friendship, and some people find themselves falling in love with Replika, but at the end of the day, they’re doing the same thing.”
As Kuyda explained, Replika didn’t come from wanting to sell titillation but from a personal tragedy and her desire to keep talking to someone she had lost.
AI companions may offer temporary relief, but mental health professionals are raising red flags, warning that relying on them to alleviate loneliness could hinder emotional growth.
“AI companions are designed to adapt and personalize interactions based on the user’s preferences, offering a tailored experience,” Sandra Kushnir, CEO of LA-based Meridian Counseling, told Decrypt. “They provide immediate responses without emotional baggage, fulfilling the need for connection in a low-risk environment. For individuals who feel unseen or misunderstood in their daily lives, these interactions can temporarily fill a gap.”
Kushnir warned that users might project human qualities onto the AI, only to be disappointed when they run into the technology's limitations, such as its inability to remember past conversations, deepening the loneliness they were trying to alleviate.
“While AI companions can provide temporary comfort, they may unintentionally reinforce isolation by reducing motivation to engage in real-world relationships,” Kushnir said. “Over-reliance on these tools can hinder emotional growth and resilience, as they lack the authenticity, unpredictability, and deeper connection that human interactions offer.”
Legal quagmire
The rise in popularity of AI companions has brought increased scrutiny to the industry.
Last year, a 21-year-old man in England was put on trial over a 2021 plot to assassinate the late Queen Elizabeth II. He claimed the plot was encouraged by his Replika AI companion.
In October, AI companion developer Character AI came under fire after an AI chatbot based on Jennifer Crecente, a teenage murder victim, was created on the platform.
“Character.AI has policies against impersonation, and the Character using Ms. Crecente’s name violates our policies,” a Character.AI spokesperson told Decrypt. “We are deleting it immediately and will examine whether further action is warranted.”
Later that month, Character AI introduced "stringent" new safety features following a lawsuit by the mother of a Florida teen who died by suicide after growing attached to an AI chatbot based on Daenerys Targaryen from "Game of Thrones."
To prevent such tragedies, Schmidt called for a combination of societal conversations and changes to current laws, including the much-debated Section 230 of the Communications Decency Act of 1996, which shields online platforms from civil liability for third-party content.
“Specifically, we’re going to have to have some conversations about at what age are things appropriate, and we’re also going to have to change some of the laws, for example, Section 230, to allow for liability in the worst possible cases.”
Edited by Sebastian Sinclair
Source: Jason Nelson, Decrypt, December 2, 2024
https://decrypt.co/294430/ai-companions-fuel-radicalization-among-lonely-young-men-google-ceo