Microsoft: AI is no replacement for human expertise

Microsoft Copilot allows you to ask an AI assistant questions within Office apps.
Microsoft

Microsoft has updated its terms of service, which go into effect at the end of September, to clarify that its Copilot AI services should not be used as a replacement for advice from actual humans.

AI-based agents are popping up across industries, with chatbots increasingly handling customer service calls, health and wellness applications, and even doling out legal advice. However, Microsoft is once again reminding its customers that its chatbots' responses should not be taken as gospel. “AI services are not designed, intended, or to be used as substitutes for professional advice,” the updated Service Agreement reads.

The company specifically referred to its health bots as an example. The bots “are not designed or intended as substitutes for professional medical advice or for use in the diagnosis, cure, mitigation, prevention, or treatment of disease or other conditions,” the new terms explain. “Microsoft is not responsible for any decision you make based on information you receive from health bots.”

The revised Service Agreement also detailed additional AI practices that are explicitly no longer allowed. Users, for example, cannot use its AI services for extracting data. “Unless explicitly permitted, you may not use web scraping, web harvesting, or web data extraction methods to extract data from the AI services,” the agreement reads. The company is also banning reverse engineering attempts to reveal the model’s weights or use its data “to create, train, or improve (directly or indirectly) any other AI service.”

“You may not use the AI services to discover any underlying components of the models, algorithms, and systems,” the new terms read. “For example, you may not try to determine and remove the weights of models or extract any parts of the AI services from your device.”

Microsoft has long been vocal about the potential dangers of generative AI’s misuse. With these new terms of service, Microsoft looks to be staking out legal cover for itself as its AI products gain ubiquity.

Andrew Tarantola
Former Digital Trends Contributor
Andrew Tarantola is a journalist with more than a decade reporting on emerging technologies ranging from robotics and machine…
Microsoft is killing this popular Word feature and replacing it with AI
Microsoft word document.

In a Microsoft Support blog post, the software giant announced the end of a helpful feature called Smart Lookup, available in Word. It appears to be an attempt to push users toward Microsoft's Copilot AI. The feature had been around since 2016, giving users definitions, relevant links, and synonyms directly inside Word. Now, it's gone for good.

Now, if you right-click on a word and choose Search from the context menu, you will see only an empty search panel. Some users see a message saying, "Sorry, something went wrong. Please try again," while others see a blank space that never stops loading. Microsoft has even removed Smart Lookup from the standalone Office 2024 suite.

Read more
Microsoft already has its legal crosshairs set on DeepSeek
DeepSeek AI running on an iPhone.

The home page chat interface of DeepSeek AI. Nadeem Sarwar / Digital Trends

Microsoft, a primary investor in OpenAI, is now exploring whether the Chinese company DeepSeek used nefarious methods to train its reasoning models. According to Bloomberg Law, the company now believes DeepSeek violated OpenAI's terms of service by using its application programming interface (API) to train its recently announced R1 model.

Read more
Microsoft’s Copilot app has a new icon, and it’s causing problems
Copilot on a laptop on a desk.

Bad news if your PC has a low-resolution display: Microsoft's new Copilot app icon is almost impossible to decipher on it, according to Windows Central. Microsoft's new logo includes a bit of text embedded in the icon, which, depending on the resolution of your screen, might be impossible to read.

The poor design has not gone unnoticed online. Users can barely make out the icon when it's pinned to the taskbar, where the lower pixel density makes its embedded text even harder to read. If you have a Surface Laptop Go, which has a very low-resolution display, there is a good chance you had no idea it said "M365." When you first saw it, you may have read it as MJEG, M366, or M355.

Read more