
Is Microsoft Copilot HIPAA Compliant? What Healthcare Practices Need to Know

Metrovolo HQ

The Short Answer: It Depends on Which Copilot

There is no single product called "Microsoft Copilot." The name now refers to a family of AI products with different capabilities, different data handling practices, and different HIPAA coverage. Before evaluating whether Copilot is safe for your practice, you need to know which one you are actually talking about.

Microsoft 365 Copilot is the AI assistant embedded in Word, Excel, Outlook, Teams, and other Microsoft 365 applications. It operates within your organization's Microsoft 365 tenant and can be covered by a Business Associate Agreement when configured correctly.

Microsoft Copilot Chat (formerly Bing Chat Enterprise) is a web-based chat interface with commercial data protection. It operates within your tenant boundary when you are signed in with an organizational account.

Copilot for Security is a standalone security operations tool. Microsoft added it to its BAA coverage in mid-2024.

Copilot Studio allows organizations to build custom AI agents. It can support HIPAA-aligned use, but only when restricted to HIPAA-eligible platform components with strict data loss prevention policies in place.

Copilot Health is a consumer feature launched in early 2026 that aggregates personal health data from wearable devices, lab results, and hospital records. It is explicitly not HIPAA compliant. Microsoft's VP of Health has stated publicly that HIPAA is not required for direct-to-consumer experiences. This product has no BAA and no regulatory backstop governing how it handles the health data users provide.

The distinction matters. A practice administrator who searches "is Copilot HIPAA compliant" and reads that Microsoft offers a BAA may assume that any Copilot product is safe to use with patient data. That assumption is wrong. Only specific Copilot products, deployed within a correctly configured enterprise tenant, fall under HIPAA coverage.

What HIPAA Requires for AI Tools

Any AI tool that processes protected health information must be covered by a Business Associate Agreement, must meet the Security Rule's technical safeguards — encryption at rest and in transit, access controls, and audit logging — and must be subject to administrative oversight. These requirements apply regardless of the vendor's size or reputation. We cover the full framework in our post on whether ChatGPT is HIPAA compliant, and the same requirements apply to Microsoft's products.

The critical point for this discussion: a BAA enables compliant use. It does not create it. The BAA is a contract between the practice and Microsoft that establishes Microsoft's obligations for safeguarding PHI. Whether the practice's actual deployment meets HIPAA's requirements depends entirely on how the practice configures and operates the tools covered by that agreement.

What Microsoft's BAA Actually Covers

Microsoft will sign a BAA for healthcare organizations using Microsoft 365. The BAA covers HIPAA-eligible services within your tenant — Exchange, SharePoint, OneDrive, Teams, and the compliance features built into those platforms. When Microsoft 365 Copilot operates within these services, it inherits their BAA coverage.

There are important boundaries.

Web search queries are explicitly excluded from HIPAA coverage. Microsoft's own documentation states that HIPAA compliance does not apply to web search queries, as they are not covered by the Data Protection Addendum or the BAA. This is significant because Copilot can invoke web search when responding to prompts. If a clinician asks Copilot a question and the response involves a web search that includes PHI from the prompt, that data has left the BAA boundary. The practice must disable web search in Copilot to prevent this — which also limits Copilot's ability to answer questions requiring current information.

Copilot Studio requires careful scoping. Custom agents built with Copilot Studio are only covered when they rely exclusively on HIPAA-eligible platform components, store PHI only in approved enterprise data stores, and enforce strict connector governance and data loss prevention policies. An agent that connects to a non-HIPAA-eligible service breaks coverage.

Third-party models may be excluded. Microsoft's documentation notes that certain third-party models used within the Copilot ecosystem — including Anthropic models — are currently excluded from data boundary commitments. The practice must verify which models are in use and whether they fall within the BAA's scope.

Coverage is at the service level, not the product level. The practice cannot assume that because "Copilot" is covered, every Copilot feature is covered. Coverage depends on which underlying Microsoft 365 services a particular Copilot interaction uses, and whether those specific services are designated as HIPAA-eligible under the executed BAA.

The Configuration Burden

This is where the practical reality diverges from the technical possibility.

Making Microsoft 365 Copilot HIPAA compliant is not a one-time setup. It requires ongoing configuration, monitoring, and enforcement across multiple systems. For a practice considering this path, here is what is actually involved:

Licensing. HIPAA-compliant use of Copilot requires Microsoft 365 E5 or equivalent security and compliance add-ons. E5 licensing provides the data loss prevention, sensitivity labels, audit logging, and advanced compliance features that HIPAA demands. E3 or Business Premium licensing does not include the full set of controls needed for defensible HIPAA compliance.

Data loss prevention policies. The practice must configure DLP policies that identify and protect PHI across all Microsoft 365 services that Copilot can access. These policies must be tested, monitored, and updated as the practice's workflows change.
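
To make concrete what a DLP rule actually does, here is a toy sketch of pattern-based content inspection in Python. It is purely illustrative: real policies are defined in Microsoft Purview using built-in sensitive information types with confidence levels and proximity rules, not scripts, and the MRN format below is a made-up example.

```python
import re
from pathlib import Path

# Illustrative patterns only. Purview DLP uses built-in sensitive
# information types, not bare regexes; the MRN format is hypothetical.
PHI_PATTERNS = {
    "US SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "MRN (hypothetical format)": re.compile(r"\bMRN[- ]?\d{7}\b"),
}

def scan_file(path: Path) -> list[str]:
    """Return the names of any PHI-like patterns found in a text file."""
    text = path.read_text(errors="ignore")
    return [name for name, rx in PHI_PATTERNS.items() if rx.search(text)]

if __name__ == "__main__":
    # Scan a local export folder. In a real deployment this inspection
    # happens inside the Microsoft 365 service boundary, not client-side.
    for f in Path("./exports").glob("**/*.txt"):
        if hits := scan_file(f):
            print(f"{f}: possible PHI -> {', '.join(hits)}")
```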

Sensitivity labels. All documents, emails, and data stores containing PHI must be classified with sensitivity labels that Copilot respects. If PHI exists in an unlabeled SharePoint site, Copilot may surface it to users who should not have access.
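
As a rough sketch of how a practice might audit for unlabeled content, the snippet below walks the root folder of a SharePoint document library through the Microsoft Graph API and calls the extractSensitivityLabels action on each file. The drive ID and token are hypothetical placeholders, the scan does not recurse into subfolders, and the availability of this Graph action in your tenant should be verified against current documentation before relying on it.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access-token>"}  # app token with Files.Read.All
DRIVE_ID = "<document-library-drive-id>"              # hypothetical placeholder

def iter_files(drive_id: str):
    """Yield every file in the drive's root folder, following pagination."""
    url = f"{GRAPH}/drives/{drive_id}/root/children"
    while url:
        page = requests.get(url, headers=HEADERS)
        page.raise_for_status()
        data = page.json()
        for item in data.get("value", []):
            if "file" in item:  # skip folders; a real audit would recurse
                yield item
        url = data.get("@odata.nextLink")  # present only when more pages exist

for item in iter_files(DRIVE_ID):
    # extractSensitivityLabels reports the labels applied to a file;
    # an empty list means the file has no sensitivity label at all.
    resp = requests.post(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/extractSensitivityLabels",
        headers=HEADERS,
    )
    resp.raise_for_status()
    if not resp.json().get("labels"):
        print(f"UNLABELED: {item['name']}")
```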

Web search disabled. As noted above, web search must be disabled to maintain HIPAA coverage. This is a tenant-level or policy-level setting that must be actively configured and monitored.

Access controls. Copilot inherits the permissions model of the underlying Microsoft 365 tenant. If the practice's SharePoint permissions are overly broad — a common situation in small organizations that never invested in granular access controls — Copilot will surface PHI to anyone with access to the relevant sites. The practice must audit and tighten permissions before deploying Copilot, not after.
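
To illustrate the kind of audit this implies, here is a minimal sketch that lists the sharing permissions on each file in a document library via Microsoft Graph and flags links scoped to the whole organization or to anonymous users. The drive ID and token are placeholders, and a real audit would recurse through folders and cover every site, not just one library.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access-token>"}  # app token with Files.Read.All
DRIVE_ID = "<document-library-drive-id>"              # hypothetical placeholder

# Sharing-link scopes that are almost always too broad for PHI.
BROAD_SCOPES = {"anonymous", "organization"}

def children(drive_id: str):
    """Yield items in the drive's root folder (a real audit would recurse)."""
    url = f"{GRAPH}/drives/{drive_id}/root/children"
    while url:
        resp = requests.get(url, headers=HEADERS)
        resp.raise_for_status()
        data = resp.json()
        yield from data.get("value", [])
        url = data.get("@odata.nextLink")

for item in children(DRIVE_ID):
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=HEADERS,
    )
    perms.raise_for_status()
    for perm in perms.json().get("value", []):
        scope = (perm.get("link") or {}).get("scope")
        if scope in BROAD_SCOPES:
            print(f"{item['name']}: '{scope}' sharing link, roles={perm.get('roles')}")
```

Even a crude pass like this often surfaces years of accumulated organization-wide links before Copilot ever sees them.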

Audit logging. HIPAA requires that access to PHI be logged and auditable. The practice must ensure that Copilot interactions are captured in Microsoft's audit logs and that someone is reviewing them.
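
As a sketch of what "someone is reviewing them" could look like, the snippet below uses Microsoft Graph's asynchronous audit log query API to pull a week of Copilot interaction records. The record type value "copilotInteraction", the date range, the polling cadence, and the required Graph permission are assumptions to confirm against current Microsoft documentation.

```python
import time
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
HEADERS = {"Authorization": "Bearer <access-token>"}  # token with the audit-log query permission

# Create an audit search scoped to Copilot interaction records.
# NOTE: "copilotInteraction" as a record type value is an assumption;
# confirm the exact enum name in current Microsoft Graph documentation.
search = requests.post(
    f"{GRAPH}/security/auditLog/queries",
    headers=HEADERS,
    json={
        "displayName": "Weekly Copilot interaction review",
        "filterStartDateTime": "2026-02-01T00:00:00Z",
        "filterEndDateTime": "2026-02-08T00:00:00Z",
        "recordTypeFilters": ["copilotInteraction"],
    },
)
search.raise_for_status()
query_id = search.json()["id"]

# The search runs server-side; poll until it completes.
while True:
    status = requests.get(f"{GRAPH}/security/auditLog/queries/{query_id}", headers=HEADERS)
    status.raise_for_status()
    state = status.json().get("status")
    if state == "succeeded":
        break
    if state == "failed":
        raise RuntimeError("Audit log query failed")
    time.sleep(30)

# Pull the matching records and surface who did what, and when.
records = requests.get(f"{GRAPH}/security/auditLog/queries/{query_id}/records", headers=HEADERS)
records.raise_for_status()
for rec in records.json().get("value", []):
    print(rec.get("createdDateTime"), rec.get("userPrincipalName"), rec.get("operation"))
```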

Staff training. Clinicians and administrative staff must understand what they can and cannot input into Copilot, which Copilot features are covered and which are not, and how to verify they are using the enterprise version rather than a consumer version on a personal device.

For a large health system with a dedicated IT security team, these requirements are manageable. They are table stakes for enterprise technology governance.

For the independent physician-owned practices that make up the majority of healthcare — groups of 5 to 30 providers without a full-time IT staff member, let alone a compliance engineer — this configuration burden is the equivalent of being told "yes, you can use it, you just need to build an enterprise compliance program first." The gap between having a Microsoft 365 license and having a HIPAA-compliant Copilot deployment is not trivial. It requires expertise most small practices do not have in-house.

The Infrastructure Question

Even with everything configured correctly, PHI is processed on Microsoft's infrastructure. Microsoft is a serious company with real security. This is not a consumer startup handling health data carelessly. But for healthcare practices, the question is worth asking directly: does the practice want its compliance position to depend on a third party's infrastructure, terms of service, and internal data handling practices?

Microsoft's terms can change. The scope of the BAA can be modified. New Copilot features may introduce data pathways that were not part of the original compliance assessment. The practice's ongoing HIPAA compliance requires continuous monitoring of what Microsoft is doing — not just what the practice has configured.

This is the same fundamental concern we discuss in our analysis of ChatGPT Enterprise. "Enterprise security features" add layers of contractual and technical protection on top of someone else's infrastructure. For practices that want the simplest defensible answer to "where is our patient data processed?", the answer is clearer when that data never leaves infrastructure the practice controls.

The Alternative: Private AI on Infrastructure You Control

Private AI deployed on infrastructure the practice controls eliminates the configuration burden entirely. There is no web search carve-out to manage because there is no web search. There is no question about which services are in scope because all processing happens within the practice's environment. There is no dependency on a third party's terms of service because the practice owns the infrastructure.

The capabilities are the same ones that make Copilot attractive in the first place: clinical documentation, prior authorization drafting, medical coding assistance, patient communication, record summarization, and administrative workflow automation. The difference is that every interaction happens within a security perimeter the practice controls, with a BAA that covers the entire deployment — not a subset of services within a larger platform.

Your staff does not need to understand which version of Copilot they are using, whether web search is enabled, or whether the document they are working with has the correct sensitivity label applied. They open the tool, do their work, and every interaction stays within the practice's environment. For an independent practice without dedicated IT staff, the operational simplicity is as important as the compliance posture.

Moving Forward

Microsoft Copilot is a legitimate option for healthcare organizations with the IT resources to configure, monitor, and maintain a HIPAA-compliant deployment. Large health systems and hospital networks with dedicated compliance teams can evaluate the BAA terms, implement the required controls, and manage the ongoing oversight that HIPAA demands.

For small and mid-size independent practices, the picture is different. The licensing costs, configuration complexity, and ongoing monitoring requirements create a burden that most practices are not equipped to manage. The gap between what an AI policy says and what actually happens in practice is already one of the biggest compliance challenges in healthcare. Adding a complex enterprise configuration requirement on top of that gap does not close it.

Private AI on infrastructure the practice controls offers a simpler compliance posture for the organizations that need simplicity most. Your staff gets AI capabilities that rival what Copilot provides. Your practice gets a defensible answer to every compliance question about where patient data is processed.

Learn how Metrovolo deploys HIPAA-compliant private AI for healthcare practices, or book a demo to see how it works for your practice.
