How Do Security and Privacy Work with Microsoft 365 Copilot?

Announced last week, Microsoft 365 Copilot will be released Nov. 1 as an add-on for customers with Microsoft 365 E3, E5, Business Standard or Business Premium licenses.

We previously discussed what Copilot is and what it can do for you and your organization. But to summarize, it’s a generative AI tool that will help you create content in the Microsoft 365 apps you use, such as Word, Excel, PowerPoint, Outlook and OneNote.

For example, you can ask Copilot to create a sales report in Word based on notes and figures in a OneNote file. Or you can ask Copilot to analyze data in an Excel sheet to identify trends or create visualizations.

The goal with Copilot is to help you create first drafts so you can spend your time fine-tuning the content rather than on the busywork of putting it all together.

Because generative AI software like Copilot is so technologically advanced, it’s natural to wonder how it will handle security and keep personal information private.

Will Copilot Create Privacy Concerns Internally?

The short answer is no, but there are caveats.

Copilot will have access to your company’s data. But how does Copilot know if the employee asking for information is allowed to know the answer?

Copilot users only have access to data they’ve been granted permission to see in Microsoft 365. You may want to review your access permissions before implementing Copilot. We encourage you to follow a “just enough access” approach, meaning your employees have exactly the permissions they need to fulfill their day-to-day duties, no more and no less.

For example, if your permissions are not properly set up, an employee could ask Copilot to find documents related to employee salaries or Social Security numbers, data that only your HR department or management team should be able to access. Your employees might not know how to find that data manually, but Copilot can.

With proper permissions, employees can only search and see what they are allowed to see.
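
If you want a concrete starting point for that permissions review, the snippet below is a minimal sketch of how an administrator might list each user’s group memberships through the Microsoft Graph API before rolling out Copilot. It assumes an Entra ID (Azure AD) app registration with the Directory.Read.All application permission; the tenant ID, client ID, secret and user addresses shown are placeholders, not real values.

```python
# Minimal sketch: audit users' group memberships via Microsoft Graph
# before rolling out Copilot. Assumes an Entra ID (Azure AD) app
# registration with the Directory.Read.All application permission.
import msal
import requests

TENANT_ID = "<your-tenant-id>"        # placeholders, not real values
CLIENT_ID = "<your-app-client-id>"
CLIENT_SECRET = "<your-app-secret>"

app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://graph.microsoft.com/.default"]
)["access_token"]

def list_group_memberships(user_principal_name: str) -> list[str]:
    """Return the display names of every group the user belongs to."""
    url = f"https://graph.microsoft.com/v1.0/users/{user_principal_name}/memberOf"
    headers = {"Authorization": f"Bearer {token}"}
    names = []
    while url:  # Graph paginates results; follow @odata.nextLink
        resp = requests.get(url, headers=headers)
        resp.raise_for_status()
        data = resp.json()
        names += [g.get("displayName", "?") for g in data["value"]]
        url = data.get("@odata.nextLink")
    return names

# Example user addresses are hypothetical.
for upn in ["alice@contoso.com", "bob@contoso.com"]:
    print(upn, "->", list_group_memberships(upn))
```

Memberships that look broader than a person’s role requires are a good place to start tightening access before Copilot goes live.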

Will Copilot Create Security Concerns Externally?

The short answer is probably not, but new technology comes with new security problems. Microsoft is attempting to capture the AI market by leveraging the trust it has built in the security of its platforms.

Microsoft has implemented the same security measures for Copilot that it has used in Microsoft 365 for years:

  • Copilot automatically follows your organization’s Microsoft 365 security, compliance and privacy policies, so your data is always kept safe and under your control.
  • Data stored in your organization’s Microsoft 365 tenant is encrypted and only available to the users within that tenant.
  • The Copilot large language models (LLMs) that process your data exist outside your tenant but are still protected by the Microsoft 365 service boundary. This means Microsoft cannot view your data, and your data is not used to train the LLMs.

Any new piece of technology will have new security problems that need to be addressed, but privacy and security have always been priorities for Microsoft. The company has taken steps to help you ensure data isn’t leaked externally or between users and groups internally.

Want to Learn More About AI Assistant Tools?

Do you want to upgrade your Microsoft license to incorporate Microsoft 365 Copilot? Contact us here to see how we can help keep your business running smoothly while increasing productivity, security and profitability.
