
Microsoft Copilot: the SharePoint permissions audit you should run first

Author: Kootechnikel Solutions · 7 min read


The headline finding

Across 2024, large enterprises deploying Microsoft Copilot discovered that years of accumulated SharePoint, OneDrive, and Teams over-permissioning suddenly became queryable. Files that "nobody could find" before — salary spreadsheets, M&A drafts, HR documents, board minutes — surfaced in plain English to anyone whose ACL technically allowed it.

The statistic that has crystallized in MSP and channel research since: 60% of organizations that deploy Copilot without a pre-deployment permission audit experience a data exposure incident within 90 days. A 2026 CISO survey put it differently: enterprise-AI permissions exposure remains the #1 enterprise AI risk two years after the launch of Microsoft 365 Copilot.

The reason this keeps happening is operational, not technical. Copilot inherits SharePoint permissions exactly as they exist in your tenant on the day you deploy it. If your tenant has a decade of accumulated over-permissioning — and most do — Copilot does not create that problem. It just makes it queryable in plain English by anyone with a license.

What over-permissioning actually looks like

A typical mid-market Microsoft 365 tenant accumulates permission drift in five recognizable patterns. Knowing them helps you find them.

Pattern 1: "Everyone except external users." A SharePoint library was originally shared with "Everyone except external users" because the IT admin needed every employee to be able to find a specific document. Over the next three years, the library became the dumping ground for HR templates, executive memos, and a finance spreadsheet that the CFO uploaded "just temporarily." Nobody ever updated the permission scope.

Pattern 2: The departed-employee group. A security group called "All Sales" was created in 2018. Sales employees who have left the company since are still in it because nobody runs the off-boarding script. Files shared with "All Sales" are now readable by 14 people who have not worked at the company in years. (Most of those accounts are also still active because nobody disabled them either.)

Pattern 3: The link that escaped. A user shared a document via "Anyone with the link" because they needed to send it to an external consultant quickly. The consultant forwarded the link. The link still works. Copilot indexes the document because the link grants access.

Pattern 4: The shared mailbox surplus. A shared mailbox called "info@" was set up to handle inbound vendor inquiries. Five admins are members. Two of them left the company. Two are now in different roles. The fifth is the only one who actually uses it. Every email anyone has ever sent to info@ is in their joint inbox. Copilot reads it.

Pattern 5: The OneDrive sprawl. Individual OneDrive accounts contain documents that were shared one-to-one with colleagues over time. The "Shared with me" view of any user's OneDrive is a five-year accumulation of files they were briefly given access to. Most users do not remember what is in there. Copilot does.
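Several of these patterns reduce to simple checks over a permission inventory. A minimal sketch in Python, assuming you have already exported group memberships, share links, and an active-employee roster into plain data structures (all names and file paths here are hypothetical):

```python
# Hypothetical tenant exports (in practice: admin reports or Graph API dumps)
active_employees = {"alice", "bob", "carol"}
groups = {"All Sales": {"alice", "dave", "erin"}}        # dave and erin have left
share_links = [
    {"file": "Q3-forecast.xlsx", "scope": "anyone"},      # Pattern 3 candidate
    {"file": "hr-template.docx", "scope": "organization"},
]

def departed_members(groups, active):
    """Pattern 2: group members who are no longer active employees."""
    return {name: sorted(m - active) for name, m in groups.items() if m - active}

def open_links(links):
    """Pattern 3: 'Anyone with the link' shares that are still live."""
    return [link["file"] for link in links if link["scope"] == "anyone"]

print(departed_members(groups, active_employees))  # {'All Sales': ['dave', 'erin']}
print(open_links(share_links))                     # ['Q3-forecast.xlsx']
```

The point of the sketch is the shape of the audit, not the tooling: each drift pattern is a set difference or a filter over data your tenant can already export.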


The pre-deployment audit, in five steps

The good news is that all of this is fixable in advance. Microsoft has shipped tooling that makes the audit tractable for tenants up to about 10,000 seats. Above that, partner-channel tooling (Varonis, Concentric AI, deeper Microsoft Purview deployments) is the established path.

Step 1: Run the SharePoint Advanced Management permission report

Microsoft introduced this report specifically for the Copilot rollout problem. It identifies sites where:

  • More than 100 users have access via direct or group membership.
  • Files have been shared via "Anyone with the link" or "People in your organization."
  • Sensitivity labels are missing or inconsistent.
  • Site ownership is unclear or assigned to a departed employee.

The report is your starting position. It will be longer than you expect.
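The report's four criteria can be reproduced as a triage filter over a site inventory. A hedged sketch, using invented field names to stand in for whatever your export actually contains:

```python
# Hypothetical site inventory rows mirroring the report's four criteria
sites = [
    {"name": "Finance-Archive", "user_count": 340, "open_links": 2,
     "label": None, "owner_active": False},
    {"name": "Eng-Wiki", "user_count": 45, "open_links": 0,
     "label": "Internal", "owner_active": True},
]

def flag_site(site):
    """Return the report criteria a site trips (empty list = clean)."""
    reasons = []
    if site["user_count"] > 100:
        reasons.append("broad access (>100 users)")
    if site["open_links"] > 0:
        reasons.append("open or org-wide share links")
    if not site["label"]:
        reasons.append("missing sensitivity label")
    if not site["owner_active"]:
        reasons.append("ownership unclear or departed")
    return reasons

for s in sites:
    print(s["name"], flag_site(s))
```

Running the filter tells you where the long tail starts: most tenants find their "clean" sites are a minority.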

Step 2: Map sensitive content categories to actual files

You probably have a written policy that classifies sensitive content (Confidential, Internal, Public — or some equivalent four-tier scheme). You probably do not have those labels actually applied to most files.

Run a Microsoft Purview discovery scan against your tenant. The scan finds files matching sensitive-data patterns (PII, payroll, contracts, M&A keywords) regardless of whether they are properly labeled. Cross-reference against the permission report from Step 1. The intersection — sensitive files in over-shared sites — is your priority remediation list.
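The intersection logic itself is trivial once both exports exist. A sketch, assuming the Step 1 report yields a set of over-shared site names and the discovery scan yields file paths tagged with a sensitive-data category (all values hypothetical):

```python
# Hypothetical outputs of Step 1 (over-shared sites) and Step 2 (sensitive files)
overshared_sites = {"HR-Shared", "Finance-Archive"}
sensitive_files = [
    {"path": "HR-Shared/payroll-2024.xlsx", "category": "payroll"},
    {"path": "Eng-Wiki/arch-diagram.png", "category": None},
    {"path": "Finance-Archive/ma-draft.docx", "category": "M&A"},
]

def priority_list(files, sites):
    """Sensitive files that live in over-shared sites: remediate these first."""
    return [f["path"] for f in files
            if f["category"] and f["path"].split("/")[0] in sites]

print(priority_list(sensitive_files, overshared_sites))
# ['HR-Shared/payroll-2024.xlsx', 'Finance-Archive/ma-draft.docx']
```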

Step 3: Remediate the highest-blast-radius issues first

You will not fix everything before launch. You will not need to. Prioritize by blast radius:

  1. Sensitive files shared via "Anyone with the link." These are publicly accessible to anyone in the world who has the link. Revoke the link sharing immediately. If the file still needs external access, replace the open link with a domain-restricted, expiring share.

  2. Sites where 100+ users have direct access to sensitive content. Convert from direct user assignment to a security group, restrict the group membership to actual current need, and document the owner.

  3. Departed-employee accounts still active in security groups. Run an off-boarding sweep. If you are using Microsoft Entra ID, the lifecycle workflows feature automates this for new departures going forward.

  4. OneDrive sprawl from key personnel. Ask each executive to do a 30-minute "Shared with me" cleanup. They will find things they did not know they had access to.
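If the remediation queue lives in a spreadsheet or ticket system, the blast-radius ordering above is just a sort key. A minimal sketch with an invented severity table:

```python
# Hypothetical severity ranking matching the four priorities above (1 = first)
SEVERITY = {"anyone_link": 1, "broad_direct_access": 2,
            "departed_accounts": 3, "onedrive_sprawl": 4}

findings = [
    {"site": "Eng-Wiki", "issue": "onedrive_sprawl"},
    {"site": "Finance-Archive", "issue": "anyone_link"},
    {"site": "HR-Shared", "issue": "broad_direct_access"},
]

queue = sorted(findings, key=lambda f: SEVERITY[f["issue"]])
print([f["site"] for f in queue])
# ['Finance-Archive', 'HR-Shared', 'Eng-Wiki']
```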

Step 4: Apply sensitivity labels to high-priority content

Before Copilot launches, the highly sensitive content categories — executive comp, M&A drafts, board minutes, ongoing legal matters — should be labeled with restrictive sensitivity labels. Copilot respects sensitivity labels. A document labeled "Confidential / Restricted" with policy preventing access by "All Employees" is invisible to Copilot for users outside the restricted group, even if the underlying SharePoint ACL has not been cleaned up.

This is a compensating control, not a permanent fix. The right long-term posture is to clean the ACLs. The label-based control buys you time to do it without delaying the rollout.
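The compensating-control behavior can be modeled in a few lines. This is a sketch of the access decision as described above, not Microsoft's implementation; the policy structure and group names are invented for illustration:

```python
# Hypothetical label policy: label -> groups allowed to see the content
label_policy = {
    "Confidential/Restricted": {"Exec-Comp-Committee"},
    "Internal": {"All Employees"},
}

def copilot_can_surface(label, user_groups, policy):
    """Model of the label gate: labeled content is visible only if the
    label's policy includes one of the user's groups; unlabeled content
    falls through to the (possibly dirty) SharePoint ACL."""
    if label is None:
        return True  # no label: the underlying ACL alone decides
    allowed = policy.get(label, set())
    return bool(allowed & user_groups)

print(copilot_can_surface("Confidential/Restricted",
                          {"All Employees"}, label_policy))   # False
print(copilot_can_surface("Internal",
                          {"All Employees"}, label_policy))   # True
```

The `label is None` branch is why this is only a stopgap: every unlabeled file still rides on whatever the ACL happens to say.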

Step 5: Pilot Copilot with the right team — and watch what surfaces

Pick a pilot team in a department where the data they have legitimate access to is well-understood (engineering, IT, support — generally not finance or HR). Give them Copilot. Have them run common prompts ("Summarize what we discussed about the Q3 product launch"; "Pull together notes on the Acme account"; "Find the latest version of the architecture diagram"). Watch what comes back.

You will discover surfaces of access you did not realize they had. Use those discoveries to drive the remaining permission cleanup. Then expand the pilot to the next team.

What goes wrong if you skip this

A common pattern in 2024 and 2025 was: enterprise buys 500 Copilot licenses on a CFO-driven productivity initiative. IT activates them. Within the first month, an HR coordinator pastes "What are the salaries on the executive team?" into Copilot Chat as a half-joke. Copilot, dutifully, returns the answer because the executive comp spreadsheet was uploaded to a SharePoint library three years ago that was shared with "Everyone except external users."

The HR coordinator is not malicious. They probably tell three other people because they cannot believe it worked. By the end of the week, twelve people in the company know the executive team's compensation. By the end of the month, the CIO is in the CEO's office explaining how this happened.

The Copilot rollout is paused. The licenses sit unused for the next quarter. The executive team blames AI. The actual problem — which existed in the SharePoint tenant for three years before Copilot — never gets fixed.

This is the failure mode that costs you the AI deployment AND leaves the underlying permission sprawl in place.


The Microsoft CSP angle

If you are a Microsoft Cloud Solution Provider (we are, see /microsoft), the audit pattern above is the standard Copilot pre-deployment workflow. We typically run it as part of the free 90-minute IT health check we offer prospective clients — the audit produces a SharePoint permission gap report regardless of whether you adopt Copilot or not. It is also the deliverable cyber insurance carriers are starting to ask about specifically on M365 renewals.

The deeper Microsoft positioning page is at /microsoft. The full Copilot deep-dive lives at /ai/copilot. The 6-point AI governance framework that this audit fits inside is at /ai/governance.

The 60% data exposure statistic on Copilot rollouts is preventable. It is not even hard to prevent. It just has to happen before you turn the licenses on.

Related Topics

Microsoft 365 · Copilot · SharePoint · Governance · Permissions