Copilot spills the beans, summarizing emails it's not supposed to read

Data Loss Prevention? Yeah, about that... The bot couldn't keep its prying eyes away. Microsoft 365 Copilot Chat has been summarizing emails labeled “confidential” even when data loss prevention (DLP) policies should have blocked it, The Register reports. Microsoft says the issue has been ongoing since late January.
Microsoft confirmed that the bug caused its Copilot AI chatbot to read and summarize paying customers' confidential emails, bypassing the DLP policies that organizations rely on to keep sensitive content away from the AI assistant. The company has since deployed a fix, but the incident highlights the hazards of using AI assistants in the workplace.