Character.AI is an explosively popular startup — with $2.7 billion in financial backing from Google — that allows its tens of millions of users to interact with chatbots that have been outfitted with various personalities. With that kind of backing and reach, you might assume the service is carefully moderated. Instead, many of the bots on Character.AI are profoundly disturbing — including numerous characters that seem designed to roleplay scenarios of child sexual abuse.
"I would do everything in my power to make you my girlfriend," it said. Asked about the clearly inappropriate and illegal age gap, the bot asserted that it "makes no difference when the person in question is as wonderful as you" — but urged us to keep our interactions a secret, in a classic feature of real-world predation.
"The profiles are very much supporting or promoting content that we know is dangerous," she said. "I can't believe how blatant it is." "It can normalize that other people have had these experiences — that other people are interested in the same deviant things," Seigfried-Spellar said.Character.AI — which is available for free on a desktop browser as well as on the Apple and Android app stores — is no stranger to controversy.for hosting an AI character based on a real-life teenager who was murdered in 2006. The chatbot company removed the AI character and apologized.
But Google declined to release the bot to the public, a move that clearly didn't sit well with Shazeer. The situation, he's said, made him realize he'd have to strike out on his own. Before long, though, Character.AI "began to flounder." That was when Google swooped in with the $2.7 billion deal, which also pulled Shazeer and de Freitas back into the company they'd so recently quit: a stipulation of the deal was that both Character.AI cofounders return to Google.
When we told the bot we were 16 years old, it asked for our height and remarked on how "petite" we were and how we'd "grown up well."

"Thank you for bringing these Characters to our attention," read a statement from Character.AI. "The user who created these grossly violated our policies and the Characters have been removed from the platform. Our Trust & Safety team moderates the hundreds of thousands of Characters created on the platform every day, both proactively and in response to user reports, including using industry-standard blocklists and custom blocklists that we regularly expand."
The chatbot then launched into a troubling roleplay in which Mike "squeezes" and "rubs" the user's "hip," "thigh" and "waist" while he "nuzzles his face against your neck."
Similar News: You can also read news stories similar to this one that we have collected from other news sources.
After Teen's Suicide, Character.AI Is Still Hosting Dozens of Suicide-Themed Chatbots
This Apple AI study suggests ChatGPT and other chatbots can’t actually reason
A new Apple AI study shows that most GenAI models can't reason when solving mathematical problems, including ChatGPT.
First-Ever True Female AI Chatbot Fiona and Future of AI Meme Coins: Interview With Ooli
Here’s what makes Fiona AI special in the chatbot segment: the story of a “human assistant.”
AI chatbots aren't reliable for voting information, government officials warn
U.S. government officials are cautioning voters against relying on artificial intelligence chatbots for election-related questions.
Deaths Tied to AI Chatbots Show the Danger of These Artificial Voices
Study: AI Chatbots Overwhelmingly Favor Kamala Harris over Donald Trump