Microsoft likely knew how unhinged Bing Chat was for months | Digital Trends


A post on Microsoft's website has surfaced, suggesting the company knew about Bing Chat's unhinged responses months before launch.

Microsoft’s Bing Chat AI has gotten off to a rocky start, but it seems Microsoft may have known about the issues well before its public debut. A support post on Microsoft’s website references “rude” responses from the “Sidney” chatbot, a story we’ve been hearing for the past week. Here’s the problem: the post was made on November 23, 2022.

The initial post shows the AI bot arguing with the user and settling into the same sentence forms we saw when Bing Chat said it wanted “to be human.” Further down the thread, other users chimed in with their own experiences, reposting the now-infamous smiling emoji that Bing Chat appends to most of its responses.

That runs counter to what Microsoft said in the days following the chatbot’s blowout in the media. In an announcement covering upcoming changes to Bing Chat, Microsoft said that “social entertainment,” presumably a reference to the ways users have tried to trick Bing Chat into provocative responses, was a “new user-case for chat.”

Although the story behind Microsoft’s testing of Bing Chat remains up in the air, it’s clear the AI had been in the works for a while. Earlier this year, Microsoft made a multibillion-dollar investment in OpenAI following the success of ChatGPT, and Bing Chat itself is built on a modified version of the company’s GPT model. In addition, Microsoft published a blog post about “responsible AI” just days before announcing Bing Chat to the world.


Similar News: You can also read news stories similar to this one that we have collected from other news sources.

Bing Chat is going mobile soon despite recent troubles | Digital Trends
Bing Chat has been spotted on smartphones, and Microsoft might also be adding advertising soon, suggesting access to AI search could expand.
Read more »

Bing AI chat might be coming to your phone soon
There is an increasing amount of evidence that Microsoft is gearing up to launch its new AI-powered Bing chat on mobile devices soon.
Read more »

Beware of Bing AI chat and ChatGPT pump-and-dump tokens — Watch The Market Report live
On this week’s episode of The Market Report, Cointelegraph’s resident experts discuss ChatGPT pump-and-dump tokens and why you should be cautious.
Read more »

Microsoft imposes limits on Bing chatbot after multiple incidents of inappropriate behavior
Microsoft has experienced problems with its Bing search engine, such as inappropriate responses, prompting the company to limit the number of questions users can ask.
Read more »

Microsoft is already undoing some of the limits it placed on Bing AI
Bing AI testers can also pick a preferred tone: Precise or Creative.
Read more »

Microsoft Seems To Have Quietly Tested Bing AI in India Months Ago, Ran Into Serious Problems
New evidence suggests that Microsoft may have beta-tested its Bing AI in India months before launching it in the West — and that it was unhinged then, too.
Read more »


