AI-powered Bing Chat gains three distinct personalities

Benj Edwards / Ars Technica

On Wednesday, Microsoft employee Mike Davidson announced that the company has introduced three distinct personality styles for its experimental AI-powered Bing Chat bot: Creative, Balanced, and Precise. Microsoft has been testing the feature since February 24 with a limited set of users. Switching between modes produces results with a different balance between precision and creativity.

Bing Chat is an AI-powered assistant based on an advanced large language model (LLM) developed by OpenAI. Bing Chat's main advantage is that it can search the web and incorporate the results into its answers.

Microsoft announced Bing Chat on February 7, and soon after it went live, adversarial attacks regularly drove an early version of Bing Chat to simulated breakdowns, and users discovered the bot could be coaxed into threatening them. Not long after that, Microsoft dramatically dialed back Bing Chat's outbursts by imposing strict limits on how long conversations could last.

Since then, the company has experimented with ways to bring back some of Bing Chat's sassy personality for those who wanted it while also allowing other users to seek more measured responses. This resulted in the new three-choice "conversational style" interface.

In our experiments with all three modes, we noticed that "Creative" mode produced shorter, more off-the-wall suggestions that weren't always safe or practical. "Precise" mode erred on the side of caution, sometimes suggesting nothing if it saw no safe way to achieve a result. In between, "Balanced" mode often produced the longest responses, with the most detailed search results and citations from websites in its answers.


With large language models, unexpected errors (hallucinations) often increase as "creativity" increases, which usually means the AI model deviates more from the information it learned from its dataset. AI researchers often call this property "temperature," but Bing team members say there is more at work with the new conversational styles.
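To give a rough sense of what temperature means in practice, here is a minimal, generic sampling sketch (not Microsoft's or OpenAI's actual implementation): the model's raw token scores are divided by the temperature before being turned into probabilities, so low values make the model pick its most likely token almost every time, while high values spread probability onto less likely, more surprising tokens.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=np.random.default_rng(0)):
    # Divide logits by the temperature: values below 1.0 sharpen the
    # distribution (safer, closer to the training data), values above 1.0
    # flatten it (more varied, but more prone to off-the-wall output).
    scaled = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)
    scaled -= scaled.max()                      # for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return rng.choice(len(probs), p=probs)

# Toy example: three candidate tokens with fixed scores.
logits = [2.0, 1.0, 0.1]
print(sample_next_token(logits, temperature=0.2))  # nearly always token 0
print(sample_next_token(logits, temperature=1.5))  # more surprising picks
```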

According to Microsoft employee Mikhail Parakhin, switching modes in Bing Chat changes fundamental aspects of the bot's behavior, including swapping between different AI models that have received additional training from human responses to their outputs. Different modes also use different initial prompts, meaning that Microsoft swaps the personality-defining prompt, like the one revealed in the prompt injection attack we wrote about in February.
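Microsoft has not published how the modes are wired up internally, but the general pattern Parakhin describes, pairing each style with its own underlying model, sampling settings, and initial (system) prompt, might look roughly like the hypothetical sketch below; every name and value here is illustrative, not Bing Chat's actual configuration.

```python
# Hypothetical illustration only: model names, temperatures, and prompts
# are made up to show the pattern, not Bing Chat's real configuration.
MODES = {
    "creative": {
        "model": "chat-model-a",          # placeholder model name
        "temperature": 0.9,
        "system_prompt": "You are an imaginative, playful assistant.",
    },
    "balanced": {
        "model": "chat-model-a",
        "temperature": 0.6,
        "system_prompt": "You are a helpful assistant that cites web sources.",
    },
    "precise": {
        "model": "chat-model-b",          # possibly a differently tuned model
        "temperature": 0.2,
        "system_prompt": "Answer cautiously; say nothing if unsure.",
    },
}

def build_request(mode: str, user_message: str) -> dict:
    """Assemble a chat request for the chosen conversational style."""
    cfg = MODES[mode]
    return {
        "model": cfg["model"],
        "temperature": cfg["temperature"],
        "messages": [
            {"role": "system", "content": cfg["system_prompt"]},
            {"role": "user", "content": user_message},
        ],
    }

print(build_request("precise", "How do I remove a stuck bolt?"))
```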

While Bing Chat is still available only to those who have signed up on a waiting list, Microsoft continues to refine Bing Chat and other AI-powered Bing search features as it prepares to roll them out more broadly. Microsoft recently announced plans to integrate the technology into Windows 11.