Microsoft Copilot accessed 3 million sensitive records per company in six months

Concentric AI found Copilot accessed nearly 3 million confidential records per organization in six months—more than half of all externally shared files. The traceability challenge: documenting which data informed each AI-generated output.

Concentric AI's 2025 Data Risk Report tracked Copilot access patterns across four industries (technology, healthcare, government, and financial services) and found that each organization saw Copilot touch nearly 3 million confidential records in just the first half of 2025, as reported by TechRadar's Wayne Williams. That number matters because it shows Copilot isn't just answering questions about your calendar; it's pulling from more than half of all externally shared files.

The report found that 57% of organization-wide shared data already contained privileged information, climbing to 70% in financial services and healthcare. On top of that, 2 million critical records per company were shared with no restrictions, and more than 400,000 records were shared with personal accounts. Organizations also averaged more than 3,000 Copilot interactions during the survey period. For product counsel, this creates a traceability problem: when Copilot ingests millions of records to answer a single query, can you actually document which data informed the output?
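One place to start building that audit trail is Microsoft's unified audit log, which records Copilot interactions. Below is a minimal Python sketch that pulls those records from the Office 365 Management Activity API; the tenant and app credentials are placeholders, and the `CopilotEventData` / `AccessedResources` field names are assumptions drawn from published audit schemas, so verify them against a raw record from your own tenant before relying on them.

```python
# Sketch: pull Copilot interaction audit records to reconstruct which
# files informed an AI-generated answer.
# Assumptions (verify in your tenant): an Entra app registration with
# ActivityFeed.Read permission, an active Audit.General subscription,
# and Copilot events surfacing with Operation == "CopilotInteraction".
import msal
import requests

TENANT_ID = "your-tenant-id"       # placeholder
CLIENT_ID = "your-app-client-id"   # placeholder
CLIENT_SECRET = "your-app-secret"  # placeholder

# Acquire an app-only token for the Management Activity API.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://manage.office.com/.default"]
)
headers = {"Authorization": f"Bearer {token['access_token']}"}

base = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"

# List available content blobs for the Audit.General feed (requires a
# subscription started once via POST {base}/subscriptions/start).
blobs = requests.get(
    f"{base}/subscriptions/content",
    params={"contentType": "Audit.General"},
    headers=headers,
).json()

for blob in blobs:
    for record in requests.get(blob["contentUri"], headers=headers).json():
        if record.get("Operation") == "CopilotInteraction":
            # "CopilotEventData" / "AccessedResources" are assumed field
            # names; inspect a raw record from your tenant to confirm.
            event = record.get("CopilotEventData", {})
            print(record.get("UserId"), record.get("CreationTime"))
            for resource in event.get("AccessedResources", []):
                print("  informed by:", resource)
```

Even a sketch like this makes the scale of the problem concrete: at 3,000-plus interactions per organization, each potentially touching thousands of files, documentation has to be automated or it won't happen.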

The issue isn't whether Copilot respects permissions—it does—but whether your permission structure was ever designed for an AI that can read everything a user can access in seconds.
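Auditing that permission structure can start with something as simple as enumerating sharing links. The Python sketch below uses Microsoft Graph's permissions endpoint to flag items exposed by organization-wide or anonymous links; `DRIVE_ID` and the bearer token are placeholders, and a real audit would page through every drive and site rather than a single folder.

```python
# Sketch: flag drive items whose sharing links expose them to the whole
# tenant or anonymously, which is the scope Copilot inherits when it
# searches on a user's behalf.
# Assumes a Graph access token with Files.Read.All; DRIVE_ID and the
# token value are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
DRIVE_ID = "your-drive-id"                                  # placeholder
headers = {"Authorization": "Bearer <graph-access-token>"}  # placeholder

# List items at the drive root (a full audit would recurse and paginate).
items = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=headers
).json().get("value", [])

for item in items:
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=headers,
    ).json().get("value", [])
    for perm in perms:
        link = perm.get("link", {})
        # "anonymous" links are readable by anyone with the URL;
        # "organization" links by anyone in the tenant, and therefore
        # by Copilot acting for any user in the tenant.
        if link.get("scope") in ("anonymous", "organization"):
            print(f"{item['name']}: {link['scope']} link, "
                  f"roles={perm.get('roles')}")
```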

Source: TechRadar, "As data sharing risks escalate, Copilot found to be accessing millions of confidential business records"