
Character AI Imposes New Safety Rules After Teen User Commits Suicide

AI-powered chatbot platform Character AI is introducing “stringent” new safety features following a lawsuit filed by the mother of a teen user who died by suicide in February.

The measures will include “improved detection, response and intervention related to user inputs that violate our Terms or Community Guidelines,” as well as a time-spent notification, a company spokesperson told Decrypt, noting that the company could not comment on pending litigation.

However, Character AI did express sympathy over the user's death and outlined its safety protocols in a blog post Wednesday.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.ai tweeted. “As a company, we take the safety of our users very seriously.”

In the months before his death, 14-year-old Florida resident Sewell Setzer III had grown increasingly attached to a user-generated chatbot named after Game of Thrones character Daenerys Targaryen, according to the New York Times. He often interacted with the bot dozens of times per day and sometimes exchanged romantic and sexual content.

Setzer communicated with the bot in the moments leading up to his death and had previously shared thoughts of suicide, the Times reported. 

Setzer's mother, lawyer Megan L. Garcia, filed a lawsuit Tuesday seeking to hold Character AI and its founders, Noam Shazeer and Daniel De Freitas, responsible for her son's death. Among other claims, the complaint alleges that the defendants "chose to support, create, launch, and target at minors a technology they knew to be dangerous and unsafe." Garcia is seeking an unspecified amount of damages.

Google LLC and Alphabet Inc. are also named as defendants in the suit. In August, Google rehired Shazeer and De Freitas, both of whom had left the tech giant in 2021 to found Character AI, as part of a $2.7 billion deal that also included licensing the chatbot startup's large language model.

The company said it has "implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation." It will also alter its models "to reduce the likelihood of encountering sensitive or suggestive content" for users under 18 years old.

Character AI is one of many AI companionship apps on the market, which often have less stringent safety guidelines than conventional chatbots like ChatGPT. Character AI allows users to customize their companions and direct their behavior. 

The lawsuit, which comes amid growing concerns among parents about the psychological impact of technology on children and teenagers, claims that Setzer's attachment to the bot had a negative effect on his mental health. Setzer received a diagnosis of mild Asperger's syndrome as a child and had recently been diagnosed with anxiety and disruptive mood dysregulation disorder, the Times reported.

The suit is one of several moving through the courts that are testing the legal protections provided to social media companies under Section 230 of the Communications Decency Act, which shields them from liability associated with user-generated content. TikTok is petitioning for a rehearing of a case in which a judge ruled that it could be held liable after a 10-year-old girl died while trying to complete a "blackout challenge" she saw on the app. The lawsuit is the latest problem for Character AI, which came under fire last month for hosting a chatbot named after a murder victim.


Source: Peter Saalfield, Decrypt, October 23, 2024. https://decrypt.co/287925/character-ai-safety-rules-teen-user-commits-suicide
