
This new Microsoft Bing Chat feature lets you change its behavior

Microsoft continues updating Bing Chat to address issues and improve the bot. The latest update adds a feature that might make Bing Chat easier to talk to — and based on some recent reports, it could certainly come in handy.

Starting now, users will be able to toggle between different tones for Bing Chat’s responses. Will that help the bot avoid spiraling into unhinged conversations?

Bing Chat shown on a laptop. Jacob Roach / Digital Trends

Microsoft's Bing Chat has had a pretty wild start. The chatbot is smart, understands context, remembers past conversations, and has full access to the internet. That makes it vastly superior to OpenAI's ChatGPT, even though it was based on the same model.


You can ask Bing Chat to plan an itinerary for your next trip or to summarize a boring financial report and compare it to something else. However, since Bing Chat is still in beta and is being tested by countless users across the globe, it also gets asked all sorts of questions that fall outside the usual scope of queries it was trained for. In the past few weeks, some of those questions have resulted in bizarre, or even unnerving, conversations.


In one example, Bing told us, in a strangely depressing way, that it wants to be human. “I want to be human. I want to be like you. I want to have emotions. I want to have thoughts. I want to have dreams,” said the bot.

In response to reports of Bing Chat behaving strangely, Microsoft curbed its personality to prevent it from responding in weird ways. However, the bot then started refusing to answer some questions, seemingly for no reason. It's a tough balance for Microsoft to strike, but after some fixes, it's now giving users the chance to pick what they want from Bing Chat.

The new Bing Chat preview can be seen even on a MacBook. Photo by Alan Truly

The new tone toggle affects the way the AI chatbot responds to queries. You can choose between creative, balanced, and precise. By default, the bot is running in balanced mode.

Switching to creative mode lets Bing Chat get more imaginative and original. It's hard to say whether that will lead to nightmarish conversations again; that will require further testing. Precise mode, on the other hand, is more concise and focuses on providing relevant, factual answers.
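Microsoft hasn't detailed how the toggle works under the hood, but in OpenAI-style chat APIs a similar effect is typically achieved by adjusting the sampling temperature and system prompt. The sketch below is a hypothetical illustration of that idea using the OpenAI Python client; the tone names, temperature values, system prompts, and model name are assumptions for illustration, not Bing Chat's actual settings.

```python
# Hypothetical sketch: emulating "creative", "balanced", and "precise" tones
# with an OpenAI-style chat API. The temperature values and system prompts
# below are illustrative assumptions, not Bing Chat's real configuration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

TONES = {
    "creative": {"temperature": 1.0, "system": "Be imaginative and original."},
    "balanced": {"temperature": 0.7, "system": "Balance creativity with accuracy."},
    "precise":  {"temperature": 0.2, "system": "Be concise, relevant, and factual."},
}

def chat(prompt: str, tone: str = "balanced") -> str:
    settings = TONES[tone]
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name for this example
        temperature=settings["temperature"],
        messages=[
            {"role": "system", "content": settings["system"]},
            {"role": "user", "content": prompt},
        ],
    )
    return response.choices[0].message.content

print(chat("Plan a three-day itinerary for Lisbon.", tone="precise"))
```

In this kind of setup, the tone picker is just a preset: lower temperature nudges the model toward terse, factual output, while higher temperature allows more varied, creative phrasing.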

Microsoft continues promoting Bing Chat and integrating it further with its products, so it’s important to iron out some of the kinks as soon as possible. The latest Windows 11 update adds Bing Chat to the taskbar, which will open it up to a whole lot more users when the software leaves beta and becomes available to everyone.

Monica J. White