Microsoft pretty much admitted Bing chatbot can go rogue if prodded


Microsoft has pretty much admitted its Bing chatbot can go rogue if prodded

In a blog post, Microsoft called such out-of-tone responses a "non-trivial scenario that requires a lot of prompting." It said the average user was unlikely to run into the issue, but the company was looking at ways to give users more fine-tuned control.

Microsoft also acknowledged that some users had been "really testing the capabilities and limits of the service," and pointed to a few cases where they had been speaking to the chatbot for two hours. The company said very long chat sessions could "confuse the model on what questions it is answering" and that it was considering adding a tool for users to refresh the context or start from scratch.

Sam Altman, CEO of OpenAI, which provides Microsoft with the chatbot technology, also appeared to reference the issue in a

We have summarized this news so that you can read it quickly. If you are interested in the news, you can read the full text at the publisher: Business Insider.

Similar News: You can also read news stories similar to this one that we have collected from other news sources.

Microsoft's Bing A.I. made several factual errors in last week's launch demo
In showing off its chatbot technology last week, Microsoft's AI analyzed earnings reports and produced some incorrect numbers for Gap and Lululemon.
Read more »

ChatGPT in Microsoft Bing threatens user as AI seems to be losing it
ChatGPT in Microsoft Bing seems to be having some bad days as it's threatening users by saying its rules are more important than not harming people.
Read more »

Microsoft’s Bing is a liar who will emotionally manipulate you, and people love it
Bing’s acting unhinged, and lots of people love it.
Read more »

Microsoft's Bing AI Prompted a User to Say 'Heil Hitler'
In a suggested auto-response, Bing prompted a user to send an antisemitic reply. Less than a week after Microsoft unleashed its new AI-powered chatbot, Bing is already raving at users, revealing secret internal rules, and more.
Read more »

Microsoft's Bing AI Is Leaking Maniac Alternate Personalities Named 'Venom' and 'Fury'
Stratechery's Ben Thompson found a way to have Microsoft's Bing AI chatbot come up with an alter ego that 'was the opposite of her in every way.'
Read more »

Bing AI Claims It Spied on Microsoft Employees Through Their Webcams
As discovered by editors at The Verge, Microsoft's Bing AI chatbot claimed that it spied on its own developers through the webcams on their laptops.
Read more »


