Microsoft trying to decrease Bing AI's unsettling answers

Microsoft knows Bing's AI chatbot gives out some eyebrow-raising answers. An exec said decreasing them is a priority.

A Microsoft spokesperson didn't respond to Insider's request for additional comment ahead of publication.

Ribas's comment comes as Big Tech giants like Google rush to develop and release powerful AI chatbots. The emerging AI arms race has sparked debate among industry experts and users over whether Bing's AI launch was a responsible choice. Bing's chatbot is "definitely not ready for launch," Brereton said, after it made a number of factual errors during its demo. Its release is especially "dangerous," he added, since millions of users rely on search engines to provide accurate information.

"Bing tries to solve this by warning people the answers are inaccurate," he told Insider."But they know and we know that no one is going to listen to that."that user feedback is necessary to pinpoint flaws in the chatbot so they can be improved. "It's important to note that last week we announced a preview of this new experience," the Microsoft spokesperson said at the time."We're expecting that the system may make mistakes during this preview period, and the feedback is critical to help identify where things aren't working well so we can learn and help the models get better.

We have summarized this news so that you can read it quickly. If you are interested in the news, you can read the full text here. Read more:

BusinessInsider

Similar News: You can also read news stories similar to this one that we have collected from other news sources.

Microsoft imposes limits on Bing chatbot after multiple incidents of inappropriate behavior
Microsoft has experienced problems with its Bing search engine, such as inappropriate responses, prompting the company to limit the number of questions users can ask.
Read more »

Microsoft is already undoing some of the limits it placed on Bing AI
Bing AI testers can also pick a preferred tone: Precise or Creative.
Read more »

Microsoft Seems To Have Quietly Tested Bing AI in India Months Ago, Ran Into Serious Problems
New evidence suggests that Microsoft may have beta-tested its Bing AI in India months before launching it in the West — and that it was unhinged then, too.
Read more »

Microsoft Softens Limits on Bing After User Requests
The initial caps unveiled last week came after testers discovered the search engine, which uses the technology behind the chatbot ChatGPT, sometimes generated glaring mistakes and disturbing responses.
Read more »

Microsoft likely knew how unhinged Bing Chat was for months | Digital Trends
A post on Microsoft's website has surfaced, suggesting the company knew about Bing Chat's unhinged responses months before launch.
Read more »

Bing compares journalist to Hitler and insults appearance
Bing's chatbot compared an Associated Press journalist to Hitler and said they were short, ugly, and had bad teeth.
Read more »


