Generative AI Security: Preventing Microsoft Copilot Data Exposure
In this post I'm going to focus specifically on data security and how your team can ensure a safe Copilot rollout.
Microsoft relies heavily on sensitivity labels to enforce DLP policies, apply encryption, and broadly prevent data leaks.
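As a concrete illustration, here is a minimal sketch of reading the sensitivity label applied to a single SharePoint or OneDrive file through the Microsoft Graph `extractSensitivityLabels` action. The access token, drive ID, and item ID are placeholders, and the token is assumed to carry Files.Read.All permission; this is not a full labeling or DLP check, just a way to see which label (if any) the downstream policies would key on.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"    # placeholder: token with Files.Read.All
DRIVE_ID = "<drive-id>"     # placeholder: target SharePoint/OneDrive drive
ITEM_ID = "<item-id>"       # placeholder: file to inspect

headers = {"Authorization": f"Bearer {TOKEN}"}

# Ask Graph to extract the sensitivity label(s) currently applied to the file.
resp = requests.post(
    f"{GRAPH}/drives/{DRIVE_ID}/items/{ITEM_ID}/extractSensitivityLabels",
    headers=headers,
)
resp.raise_for_status()

labels = resp.json().get("labels", [])
if not labels:
    print("No sensitivity label applied; label-keyed DLP and encryption rules won't fire.")
for label in labels:
    print(f"label id: {label.get('sensitivityLabelId')}, "
          f"assigned via: {label.get('assignmentMethod')}")
```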
Microsoft paints a rosy picture of labeling and blocking as the ultimate safety net for your data.
The efficacy of label-based data protection will surely degrade as AI generates orders of magnitude more data, all of which would need accurate, automatically updated labels.
It's critical to have a sense of your data security posture before your Copilot rollout.
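One way to get that sense is to audit how broadly files are already shared before Copilot can surface them. The sketch below, assuming a Graph access token with Files.Read.All and a known drive ID (both placeholders), walks a document library and flags files exposed through org-wide or anonymous sharing links.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token>"   # placeholder: token with Files.Read.All
DRIVE_ID = "<drive-id>"    # placeholder: document library to audit

headers = {"Authorization": f"Bearer {TOKEN}"}

def walk(item_id="root"):
    """Recursively yield driveItems under the given folder, following paging links."""
    url = f"{GRAPH}/drives/{DRIVE_ID}/items/{item_id}/children"
    while url:
        page = requests.get(url, headers=headers)
        page.raise_for_status()
        data = page.json()
        for item in data.get("value", []):
            yield item
            if "folder" in item:
                yield from walk(item["id"])
        url = data.get("@odata.nextLink")

def overshared_scopes(item):
    """Return sharing-link scopes that expose the item beyond named users."""
    resp = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=headers,
    )
    resp.raise_for_status()
    return [
        p["link"]["scope"]
        for p in resp.json().get("value", [])
        if p.get("link", {}).get("scope") in ("organization", "anonymous")
    ]

for item in walk():
    if "file" in item:
        scopes = overshared_scopes(item)
        if scopes:
            print(f"{item['name']}: shared via {', '.join(scopes)} link")
```

Anything this flags is content Copilot could summarize or reproduce for any user who can reach it, which is why tightening permissions belongs before the rollout, not after.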
Varonis protects thousands of Microsoft 365 customers with our Data Security Platform, which provides a real-time view of risk and the ability to automatically enforce least privilege.
Related news
- Microsoft Ignite 2024 Unveils Groundbreaking AI, Security, and Teams Innovations (source)
- Businesses turn to private AI for enhanced security and data management (source)
- Microsoft revised the controversial Copilot+ Recall feature (source)
- Microsoft overhauls security for publishing Edge extensions (source)
- Microsoft Edge begins testing Copilot Vision (source)
- Microsoft Issues Security Update Fixing 118 Flaws, Two Actively Exploited in the Wild (source)
- Week in review: Microsoft fixes two exploited zero-days, SOC teams are losing trust in security tools (source)
- CIOs want a platform that combines AI, networking, and security (source)
- Generative AI in Security: Risks and Mitigation Strategies (source)
- Unlocking the value of AI-powered identity security (source)