Carolinas healthcare providers are asking the same question in 2026: can we use Microsoft 365 Copilot with patient data? The short answer is yes, with the right setup. The complete answer is more specific, and getting the specifics wrong is not a minor paperwork issue.
North Carolina is home to some of the largest health systems in the Southeast: Atrium Health in Charlotte, WakeMed and UNC Health in the Triangle, ECU Health (formerly Vidant) in Greenville, and Novant Health stretching from Winston-Salem to Charlotte and the coast. The smaller practices, specialty groups, and independent health systems across both Carolinas are watching the enterprise AI deployments at these large systems and asking when they can do the same thing.
The answer is not about Copilot’s capabilities. It is about your Microsoft 365 tenant configuration, your Business Associate Agreement, and a few specific decisions that determine whether AI output can legally touch protected health information.
HIPAA does not prohibit using AI with protected health information. It requires that any vendor accessing, processing, or storing PHI on your behalf sign a Business Associate Agreement with your organization and meet the administrative, physical, and technical safeguards required by the Security Rule.
The AI tool itself is not the compliance question. The question is whether the vendor is a business associate and whether you have a signed BAA in place before PHI touches the system.
Microsoft will sign a BAA for Microsoft 365. This happens under its standard licensing agreement for healthcare customers. The BAA covers Exchange Online, SharePoint Online, Teams, and the Microsoft 365 Copilot features that operate within the tenant boundary. It does not cover every Microsoft service by default. Bing-connected features, capabilities that route data outside the tenant for processing, and any add-on that sends data to a third-party model are separate questions that require their own review.
The tenant configuration HIPAA compliance actually requires
Getting Copilot into a HIPAA-compliant state is not a default install. Several settings need to be explicitly configured.
Web grounding must be turned off for PHI contexts. Copilot can be configured to include Bing search results when answering prompts. When that feature is active, any prompt that includes patient context could send that data outside your tenant boundary. For healthcare use cases involving PHI, disable web grounding in the Copilot admin settings. Copilot's responses will then be limited to content inside your Microsoft 365 tenant, which is exactly what you want in a clinical environment.
Data loss prevention policies need to cover Copilot interactions. Microsoft Purview DLP policies can be extended to govern Copilot prompts and responses. This means the same rules that prevent a staff member from emailing a spreadsheet with patient identifiers can also flag or block Copilot prompts that include structured PHI. Build these policies before clinical staff begin prompting Copilot with patient data, not after.
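To make the policy intent concrete, here is a minimal sketch of the kind of pattern matching a DLP rule performs on structured identifiers in a prompt. This is illustrative only: Purview's sensitive information types layer checksums, keyword proximity, and confidence scoring on top of patterns like these, and the eight-digit MRN format below is a hypothetical assumption, not a standard.

```python
import re

# Illustrative patterns only. Purview sensitive information types add
# keyword proximity and confidence scoring; the MRN format here is a
# hypothetical eight-digit scheme, not any particular EHR's format.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "mrn": re.compile(r"\bMRN[:\s]*\d{8}\b", re.IGNORECASE),
    "dob": re.compile(r"\b(0[1-9]|1[0-2])/(0[1-9]|[12]\d|3[01])/(19|20)\d{2}\b"),
}

def flag_structured_phi(prompt: str) -> list[str]:
    """Return the names of PHI patterns found in a draft Copilot prompt."""
    return [name for name, pattern in PHI_PATTERNS.items() if pattern.search(prompt)]

if __name__ == "__main__":
    draft = "Summarize the last visit for MRN: 00412345, DOB 03/14/1962."
    hits = flag_structured_phi(draft)
    if hits:
        print(f"Prompt contains structured identifiers: {', '.join(hits)}")
```

The real policies are built in the Purview compliance portal, not in code, but the mental model is the same: define what structured PHI looks like, then decide whether matching prompts are flagged, blocked, or audited.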
Audit logging must be enabled with appropriate retention. The HIPAA Security Rule requires audit controls: records of who accessed what and when. Microsoft 365 Audit Standard includes Copilot interaction logs. Confirm that audit logging is active and that your retention period aligns with your HIPAA audit policy. For most practices, 90 days is the operational minimum. Many set six years to match the HIPAA document retention standard.
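For practices that want to verify those logs programmatically rather than trust a checkbox, the Office 365 Management Activity API exposes unified audit log content. The sketch below assumes Copilot interactions surface in the Audit.General content type under the operation name CopilotInteraction (verify against your own tenant's events, since schemas evolve), that an Audit.General subscription has already been started for the tenant, and that a bearer token has been acquired separately.

```python
import os
import requests

# Assumes an Azure AD app registration with ActivityFeed.Read permission on
# the Office 365 Management APIs, an Audit.General subscription already
# started via the /subscriptions/start endpoint, and a bearer token acquired
# separately (e.g. via an MSAL client-credentials flow).
TENANT = os.environ["TENANT_ID"]
TOKEN = os.environ["MGMT_API_TOKEN"]
BASE = f"https://manage.office.com/api/v1.0/{TENANT}/activity/feed"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def copilot_audit_events(start: str, end: str) -> list[dict]:
    """Pull Audit.General content blobs and keep Copilot interaction records.

    The API limits each query to a 24-hour window, so start/end should be
    no more than a day apart.
    """
    listing = requests.get(
        f"{BASE}/subscriptions/content",
        headers=HEADERS,
        params={"contentType": "Audit.General", "startTime": start, "endTime": end},
        timeout=30,
    )
    listing.raise_for_status()
    events = []
    for blob in listing.json():
        records = requests.get(blob["contentUri"], headers=HEADERS, timeout=30).json()
        # Copilot interactions are assumed to carry this operation name;
        # confirm against your tenant's actual records before relying on it.
        events.extend(r for r in records if r.get("Operation") == "CopilotInteraction")
    return events

if __name__ == "__main__":
    for event in copilot_audit_events("2026-01-05T00:00:00", "2026-01-06T00:00:00"):
        print(event.get("UserId"), event.get("CreationTime"))
```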
Copilot interaction data must not be used to train Microsoft models. By default, Microsoft does not use your Microsoft 365 Copilot prompts or responses to train their foundational models. This is governed by the Microsoft Products and Services Data Protection Addendum. Verify the DPA in your Microsoft agreement before deployment. Do not rely on verbal assurances from sales conversations.
The Copilot use cases working in Carolinas healthcare right now
The use cases seeing the most adoption across Carolinas healthcare practices in 2026 are not always the ones that get the press coverage.
Clinical documentation drafts are the highest-volume deployment. Not AI writing the note from scratch, but AI drafting a structured summary from dictation, from an existing encounter template, or from a prior visit note for the clinician to review and sign. The time saved on documentation has a measurable impact on burnout metrics in primary care and specialty practices, where administrative time regularly competes with patient time.
Prior authorization assembly is the second major area. The information needed for a PA request is almost always already present in the chart, the formulary, and the payor requirement document. Assembling it into the correct format for each payor is a clerical task that takes 20 to 45 minutes per request. With Copilot pulling from existing documents inside the tenant, the task becomes 10 to 15 minutes of review rather than assembly. At a practice handling 50 prior auth requests per week, the savings compound materially.
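The arithmetic is worth making explicit. Using the midpoints of the ranges above:

```python
# Back-of-envelope savings from the numbers above: 50 PA requests per week,
# manual assembly at 20-45 minutes each versus 10-15 minutes of review.
REQUESTS_PER_WEEK = 50
manual_avg = (20 + 45) / 2      # 32.5 minutes per request
review_avg = (10 + 15) / 2      # 12.5 minutes per request

saved_minutes = REQUESTS_PER_WEEK * (manual_avg - review_avg)
print(f"Weekly savings: {saved_minutes / 60:.1f} staff hours")       # ~16.7 hours
print(f"Annual savings: {saved_minutes * 52 / 60:.0f} staff hours")  # ~867 hours
```

The midpoint estimate works out to roughly two full staff days per week at a 50-request practice.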
Administrative communication drafts cover everything from patient letters to referral letters to insurance appeal letters. These documents have consistent structure and defined required elements, and they require human review before anything is sent. They are well-suited to first-draft AI generation within the Microsoft 365 environment.
Where the compliance risks actually live
Practices that encounter HIPAA issues with AI are not usually doing something deliberately wrong. Three patterns create most of the real risk:
Shadow AI on employee devices. A clinical assistant discovers that a consumer AI app can draft a patient letter quickly, starts using it informally, and includes enough identifying information in the prompt that the activity constitutes an unauthorized disclosure under HIPAA. This is not a Microsoft 365 configuration issue. It is an employee training and policy issue. If you are deploying Copilot, your Acceptable Use Policy needs to explicitly address AI tool usage and prohibit PHI in tools that have no BAA.
Copilot configured with broader permissions than intended. Microsoft 365 Copilot accesses the data the signed-in user has permission to access. If your SharePoint permissions are broad and a staff member has read access to more records than their role requires, Copilot will surface that data in responses. This is a permissions architecture problem, not a Copilot problem. The HIPAA minimum necessary principle is not just a policy requirement. It is also the prerequisite for safe Copilot deployment.
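One practical way to surface this before enabling Copilot is to diff a permissions export against a role-based baseline. Below is a minimal sketch, assuming a hypothetical CSV export with user, role, and site columns and an example allowlist; map both to whatever your SharePoint admin reporting actually produces.

```python
import csv

# Sketch of a minimum-necessary review over a permissions export. The CSV
# layout (columns "user", "role", "site") is a hypothetical export format,
# and the role-to-site allowlist below is an example, not a recommendation.
ALLOWED_SITES = {
    "front_desk": {"Scheduling", "PatientLetters"},
    "clinical_ma": {"Scheduling", "PatientLetters", "ClinicalNotes"},
    "provider": {"Scheduling", "PatientLetters", "ClinicalNotes", "PriorAuth"},
}

def excess_access(report_path: str) -> list[tuple[str, str]]:
    """Return (user, site) pairs where access exceeds the role's allowlist."""
    findings = []
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            allowed = ALLOWED_SITES.get(row["role"], set())
            if row["site"] not in allowed:
                findings.append((row["user"], row["site"]))
    return findings

if __name__ == "__main__":
    for user, site in excess_access("sharepoint_permissions.csv"):
        print(f"Review: {user} can read {site} beyond role baseline")
```

Anything the script flags is data Copilot could surface for that user on day one.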
Outdated BAAs that predate Copilot. Practices that deployed Teams or Microsoft 365 several years ago sometimes have agreements that were signed before Copilot existed as a product. The Copilot features were added to BAA coverage after the product launched. If your BAA was signed before 2023, review it or request an updated agreement before enabling Copilot for clinical staff.
A deployment checklist for Carolinas healthcare practices
Before enabling Copilot for clinical or administrative staff who handle PHI, work through this sequence:
1. Confirm your Microsoft 365 BAA explicitly covers Copilot features and was signed or updated after Copilot's 2023 launch
2. Disable web grounding in Copilot admin settings for all healthcare users
3. Extend Purview DLP policies to cover Copilot prompts and responses
4. Confirm audit logging is enabled with a retention period that matches your HIPAA policy
5. Review SharePoint and Teams permissions for minimum necessary access across clinical roles
6. Update your Acceptable Use Policy to address AI tool usage and prohibit PHI in non-BAA tools
7. Complete a security risk analysis update to document the new AI workload
Steps 1 through 4 are tenant configuration. Steps 5 through 7 involve process and governance decisions that take more time. Budget two to four weeks for the full sequence if your permissions and policies require cleanup.
Why this matters for Carolinas healthcare specifically
Healthcare is one of the most AI-active sectors across North and South Carolina right now, partly because the workforce shortage in nursing and primary care has created real operational pressure to reduce the administrative load on clinical staff. Practices in Greenville, Fayetteville, and across the rural Eastern NC counties face this pressure more acutely than urban systems, because they have fewer administrative staff to absorb the documentation burden when clinical volume increases.
AI tools that can compress documentation and prior auth time without creating compliance exposure are not a nice-to-have in this environment. For practices already operating on thin margins, they are becoming a material operational necessity.
The HIPAA barrier is real but not prohibitive. The configuration work is specific and manageable. The risk is not in deploying AI carefully with proper setup. The risk is in avoiding AI because compliance feels unclear and then watching staff turn to consumer tools that actually are a compliance problem.
Microsoft 365 Copilot, configured correctly and backed by a current BAA, is a compliant choice for Carolinas healthcare practices willing to do the setup work properly.
Devsoft Solutions works with healthcare organizations across North and South Carolina on Microsoft 365, Copilot, and HIPAA-compliant AI deployments. If you are evaluating Copilot for your practice or health system, get in touch.