MacMusic  |  PcMusic  |  440 Software  |  440 Forums  |  440TV  |  Zicos
copilot
Search

Will potential security gaps derail Microsoft’s Copilot?

Wednesday September 11, 2024. 12:00 PM , from ComputerWorld
Microsoft has bet big on Copilot, the generative AI (genAI) assistant it’s integrating into nearly its entire product line, notably Microsoft 365. The company believes businesses of all sizes will buy into the productivity gains the tool might deliver — and in so doing, deliver billions of dollars to Microsoft’s bottom line. 

As evidence, Microsoft CEO Satya Nadella recently said sales of Copilot for Microsoft 365 were up 60% in the last quarter alone and that the number of Copilot for Microsoft 365 customers with more than 10,000 seats doubled in that same period.

Copilot for Microsoft 365, he said, “is on a growth rate that’s faster than any other previous generation of software we launched” for the office suite.

That’s certainly good news for Microsoft. But there may be bumps in the road ahead. Researchers and analysts are warning about a variety of serious Copilot security problems, especially for enterprises that use it in concert with Microsoft 365. Gartner has weighed in with its own security warnings and at least one researcher says companies are already shying away from buying Copilot because of it. 

Are the security woes overstated, or does Microsoft have a real problem on its hands? 

Here’s what companies need to keep in mind as they eye deployment.

Data access: Copilot for Microsoft 365’s main problem

Many of the potentially serious security issues with Copilot start with what kind of access the genAI tool is given to corporate data, and how that access can be misused by hackers, or even by people within a company. 

Ivan Fioravanti, co-founder and CTO for CoreView, which focuses on Microsoft 365 management configuration and security, notes in a blog post that when a company installs Copilot for Microsoft 365, it gets the same permissions model for data access already in place for Microsoft 365. That model, he says, is designed to ensure “only authorized users can interact with sensitive information.”

However, there are security gaps enterprises could easily miss. Fioravanti warns that risky Copilot configuration settings could be enabled by default. These settings can give Copilot “access to sensitive data without appropriate safeguards in place. Default settings could allow Copilot to interact with external plugins and access web content, introducing new attack surfaces.”

Beyond that, Copilot reporting tools are insufficient “to identify potential areas of concern, such as users accessing sensitive data inappropriately,” he warns. “…Copilot can inadvertently expose existing security gaps, making it easier for users to discover and share information they shouldn’t have access to.” 

Overall, he concludes, “Copilot’s broad access to data across Microsoft 365 creates additional security risks. If a user’s account is compromised, an attacker could leverage Copilot to extract confidential information. The AI models powering Copilot also present potential vulnerabilities that could be exploited.”

Fioravanti is not alone in his concerns. Gartner points out many of the same issues. And it adds one more: Copilot “has inherent risks of being susceptible to prompt injection attacks, generating undesired output including hallucinations, toxic content or copyright-protected materials.”

Prompt injection attacks, can do even worse than that. A hacker who gains access to Copilot for Microsoft 365 could use prompts to find private company data and steal it. A prompt injection attack could also create and execute malicious code without detection. And it could allow hackers to impersonate someone in a company with high-level access to sensitive information and use that information for nefarious purposes.

Are companies already shying away?

There’s some evidence companies may well already be avoiding Copilot for Microsoft 365 because of potential security woes.

Jack Berkowitz, chief data officer of the security firm Securiti, warns that in large companies with complex permissions rules, employees often have access to information that should be off limits because of “conflicting authorizations or conflicting access to data.” Copilot makes it easier for hackers to get that information because they only need prompts to find it.

Berkowitz argues that security problems associated with Copilot are causing companies to pause their use of the product. “A few weeks ago,” he says, “we hosted a little dinner in New York, and we just asked this question of 20-plus CDOs [chief data officers] in New York City of the biggest companies, ‘Hey, is this an issue?’ 

“And the resounding response was, ‘Yeah, it’s a real mess.'”

According to Berkowitz, half of the businesses at the dinner had cut back or halted a Copilot for Microsoft 365 implementation because of security fears.

What’s next?

Copilot for Microsoft 365 is so new that it’s difficult at this point to tell whether the security concerns are overstated or understated and whether they’re actually making companies leery of adopting it. 

But Microsoft needs to address the issue quickly. The US House of Representatives has banned congressional staffers from using it because the Office of Cybersecurity has said it poses a risk of leaking sensitive government data. (That mirrors the warning from Gartner.) 

As I’ve written before, Microsoft doesn’t have a stellar record when it comes to security. This time around, it has to get security right. The company is the world leader in AI. Security woes could change that. 

Microsoft is working on an AI suite aimed at government agencies, and is apparently taking security into account. That’s a good initial step. But it’s only a small one. If it doesn’t quickly fix Copilot’s AI security issues, it could find itself a laggard in field, not the world leader. 
https://www.computerworld.com/article/3511345/will-potential-security-gaps-derail-microsofts-copilot...

Related News

News copyright owned by their original publishers | Copyright © 2004 - 2024 Zicos / 440Network
Current Date
Sep, Thu 19 - 04:00 CEST