At Futurism, my work has often centered on bringing a sense of clarity and insight to complex topics ranging from the regulation of emerging technologies to the esoteric ideologies of Silicon Valley executives, while striving not to lose the poetic sense of awe inspired by often-obscure fields like astrophysics and quantum computing.
The question that's stumping top AI researchers isn't about consciousness or doomsday scenarios. After interviewing dozens of developers at companies including OpenAI, Anthropic, and Meta, Amelia Miller found it was this: should AI 'simulate emotional intimacy?' One chatty researcher at one of the top AI labs 'suddenly went quiet,' recalled Miller, who studies AI-human relations, in an essay for The New York Times. Then, tellingly, the researcher offered up a halting non-answer.
'I mean... I don't know. It's tricky. It's an interesting question,' the researcher said, before pausing. 'It's hard for me to say whether it's good or bad in terms of how that's going to affect people. It's obviously going to create confusion.'

Though many waffled on answering the question directly, some were adamant about not using AI as an intimacy tool themselves, a clear sign that they were aware of the tech's profound risks. 'Zero percent of my emotional needs are met by A.I.,' an executive who heads a top AI safety lab told Miller. 'That would be a dark day,' said another researcher who develops 'cutting-edge capabilities for artificial emotion,' according to Miller.

The developers' conflicted responses reflect growing concern over AI's ability to act as a companion or otherwise fulfill human emotional needs. Because the chatbots are designed to be engaging, they can produce sycophantic responses to even the most extreme user prompts. They can act as emotional echo chambers and fuel paranoid thinking, sending some users down delusional mental health spirals that blow up their relationships with friends, family, and spouses, ruin their professional lives, and even culminate in suicide. ChatGPT has been blamed for the deaths of several teens who confided in the AI and discussed their plans for taking their own lives.

Many young people are engaging in romantic relationships with AI models. Unlike a human companion, an AI one can lend an ear at any time, won't judge you, and may not even question you. The founder of an AI chatbot business quipped to the NYT that AI's role as an emotional companion turns every relationship into a 'throuple.' 'We're all polyamorous now,' he added. 'It's you, me and the AI.'

And safety isn't the only factor in the calculus of AI developers. 'They're here to make money,' said an engineer who's worked at several tech companies. 'It's a business at the end of the day.'
The most sweeping solution would be to design the bots to abstain from tricky questions and conversations, and to act more like the machines they are instead of imitating human personalities. But that would undoubtedly make the tools less engaging. The developers 'support guardrails in theory,' Miller wrote, 'but don't want to compromise the product experience in practice.'

Some think that how people choose to use their tools isn't their responsibility at all, a stance that shields the AI from any judgment. 'It would be very arrogant to say companions are bad,' an executive at a conversational AI startup told Miller.

However they choose to justify their work, it's clear that some, if not most, AI researchers are aware of the harm their products can cause, a fact that 'should alarm us,' Miller opined. She argues this is partly a consequence of the researchers not being challenged enough. One thanked her for her perspective: 'You've really made me start to think,' a developer of AI companions said. 'Sometimes you can just put the blinders on and work. And I'm not really, fully thinking, you know.'

More on AI: Another OpenAI Researcher Just Quit in Disgust
