Tags: AI compliance, regulatory, healthcare, HIPAA, PHI

Independent Practice Associations and the Shared AI Risk Problem

By Basel Ismail · May 4, 2026


IPAs have a structural problem with AI that almost nobody is talking about. When 40 or 200 independent practices share administrative infrastructure, billing systems, and increasingly clinical decision support tools, the AI adoption choices of any single practice can create HIPAA exposure across the entire association. And right now, most IPAs have zero governance framework to address this.

How IPAs Actually Share Risk

The typical IPA model pools resources. Practices share EHR platforms, claims processing systems, care coordination tools, and credentialing databases. Under HIPAA, these shared arrangements create a web of business associate agreements and, in many cases, organized health care arrangements (OHCAs) as defined at 45 CFR 160.103. The IPA itself often functions as a business associate to its member practices, or sometimes as a covered entity in its own right, depending on how it handles claims.

This is fine when everyone is using the same vetted software stack. It becomes a problem when Dr. Smith's practice starts feeding patient data into a generative AI tool for clinical note summarization, and that tool's API sends PHI to a cloud endpoint with no BAA in place. Under the HIPAA Privacy Rule, the minimum necessary standard at 45 CFR 164.502(b) requires that disclosures be limited to the minimum necessary for the intended purpose. An AI tool ingesting full clinical notes to generate a summary is almost certainly receiving more than the minimum necessary, and if that tool is connected to shared infrastructure, the blast radius extends well beyond Dr. Smith's practice.

The Governance Gap at the IPA Level

Most IPAs have compliance committees. They handle credentialing disputes, payer contract negotiations, and quality reporting. What they typically lack is any mechanism to evaluate, approve, or restrict AI tools adopted by member practices.

Think about how software adoption actually works in an IPA. A practice administrator hears about an AI scribe tool at a conference. They sign up for a trial. The tool integrates with the shared EHR via API. Now that tool has access to data flowing through shared infrastructure, potentially including data from patients of other practices in the network. The IPA's compliance committee may not even know this happened until something goes wrong.

The HHS Office for Civil Rights has been clear that ignorance is not a defense. OCR's guidance on tracking technologies, issued in December 2022 and updated in March 2024, explicitly addressed scenarios where covered entities deploy technologies that transmit PHI to third parties without proper authorization or BAAs. In 2024, OCR imposed a $4.75 million settlement on Montefiore Medical Center over insider access failures, and the agency has signaled that AI-related enforcement is a priority. When OCR investigates a breach at an IPA member practice, the shared infrastructure means the IPA itself, and potentially other member practices, end up in scope.

The BAA Chain Problem

Here is where it gets technically interesting. Under 45 CFR 164.502(e), a covered entity must have a BAA with any business associate that creates, receives, maintains, or transmits PHI on its behalf. If Practice A uses an AI tool that processes PHI from the shared EHR, and that AI vendor has no BAA with the IPA, you now have an unauthorized disclosure. But the chain is murkier than that. If the IPA's EHR vendor allows third-party integrations, the EHR vendor's own BAA with the IPA may contain provisions about downstream subcontractors. The HITECH Act's 2013 Omnibus Rule extended direct liability to business associates and their subcontractors, meaning the AI vendor could be directly liable under HIPAA even without a BAA. But the absence of the BAA is itself a violation for the covered entity.
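To make the chain concrete, here is a minimal sketch of the underlying logic: every entity that handles PHI downstream of the covered entity needs an executed agreement with the party that engaged it, and any missing link is an exposure. All names here (`missing_agreements`, the vendor names) are hypothetical illustrations, not real compliance tooling.

```python
# Illustrative BAA chain check. Each adjacent pair in the chain
# (covered entity -> business associate -> subcontractor) should be
# covered by an executed agreement; any gap is a potential
# unauthorized disclosure under 45 CFR 164.502(e).

def missing_agreements(chain, agreements):
    """chain: ordered list of entities, covered entity first.
    agreements: set of (upstream, downstream) pairs with executed BAAs.
    Returns the pairs that lack an agreement."""
    gaps = []
    for upstream, downstream in zip(chain, chain[1:]):
        if (upstream, downstream) not in agreements:
            gaps.append((upstream, downstream))
    return gaps

chain = ["Practice A", "Shared EHR Vendor", "AI Scribe Vendor"]
agreements = {("Practice A", "Shared EHR Vendor")}  # no BAA covers the AI tool

print(missing_agreements(chain, agreements))
# -> [('Shared EHR Vendor', 'AI Scribe Vendor')]
```

The point of the sketch is that the gap sits in the middle of the chain: Practice A's paperwork with the EHR vendor can be flawless while the AI integration one hop downstream remains uncovered.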

In a 2024 enforcement action, OCR settled with a health system for $950,000 partly because a business associate's subcontractor had access to PHI without proper agreements in place. The structural parallel to an IPA member practice plugging in an unapproved AI tool is obvious.

What IPA Governance Should Look Like

IPAs need to treat AI tool adoption the way they treat formulary decisions in a medical group: with a structured evaluation and approval process that applies network-wide.

  • Pre-approval requirements. Any AI tool that will process PHI or connect to shared infrastructure should require IPA-level review before deployment. This includes tools marketed as "HIPAA compliant," a phrase that means nothing without verification of actual technical and administrative safeguards.
  • Centralized BAA management. The IPA should maintain a registry of approved AI vendors with executed BAAs. Member practices should be contractually prohibited from integrating unapproved tools with shared systems.
  • Technical segmentation. Where possible, AI tools used by individual practices should be sandboxed from shared infrastructure. If Practice A wants to use an AI scribe, the data flow should be isolated so that only Practice A's patient data is in scope.
  • Incident response coordination. The IPA needs a breach response protocol that accounts for AI-related incidents. Under the Breach Notification Rule at 45 CFR 164.410, business associates must notify covered entities without unreasonable delay, and no later than 60 days after discovering a breach. If the AI vendor is the source, the IPA needs to coordinate notification across all potentially affected member practices.
  • Ongoing monitoring. AI tools change. Models get updated. Data processing practices shift. A BAA signed in January may not reflect the vendor's architecture in July. IPAs should require periodic re-certification of approved tools.
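The registry and re-certification ideas above can be sketched as a simple data model. This is an illustrative sketch only, not FirmAdapt's implementation; every name in it (`ApprovedVendor`, `is_deployable`, the 180-day window) is an assumption for the example.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical IPA-level registry entry for an approved AI vendor.
@dataclass
class ApprovedVendor:
    name: str
    baa_executed: bool
    baa_expires: date        # BAAs lapse; expiry must be tracked
    last_recertified: date   # periodic re-certification date
    sandboxed: bool          # segmented from shared infrastructure?

def is_deployable(vendor: ApprovedVendor, today: date,
                  recert_days: int = 180) -> bool:
    """A tool is deployable only with a current BAA and a recent re-certification."""
    if not vendor.baa_executed or today > vendor.baa_expires:
        return False
    if (today - vendor.last_recertified).days > recert_days:
        return False
    return True

scribe = ApprovedVendor("AcmeScribe", baa_executed=True,
                        baa_expires=date(2026, 12, 31),
                        last_recertified=date(2026, 3, 1),
                        sandboxed=True)

print(is_deployable(scribe, date(2026, 5, 4)))   # within recert window: True
print(is_deployable(scribe, date(2026, 11, 1)))  # re-certification lapsed: False
```

Note that in the second check the BAA is still valid; the tool fails only the re-certification window. That is the "ongoing monitoring" point: an agreement signed in January does not make a tool deployable in perpetuity.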

The Contractual Leverage Issue

One practical challenge is that many IPA participation agreements were drafted before AI was a consideration. They typically address data sharing for treatment, payment, and health care operations, but they do not address what happens when a member practice introduces a new technology that creates risk for the network. Updating these agreements is not glamorous work, but it is necessary. IPAs that do not add AI governance provisions to their participation agreements are essentially hoping that every one of their member practices will independently make good decisions about AI procurement. That is a bet most compliance officers would not want to take.

California's Knox-Keene Act, which governs many IPAs operating as risk-bearing organizations, adds another layer. The California Department of Managed Health Care (DMHC) has been increasingly attentive to how delegated entities manage data, and an AI-related breach at a Knox-Keene licensed IPA could trigger both federal HIPAA enforcement and state regulatory action.

The Practical Reality

Most IPAs are not ignoring AI because they think it is unimportant. They are ignoring AI governance because they are under-resourced and the problem feels abstract until something breaks. The compliance committee meets quarterly, has a packed agenda, and AI governance is competing with prior authorization reform, value-based contract renegotiation, and a dozen other priorities. But the risk is compounding. Every month that passes without an AI governance framework is another month where member practices may be adopting tools that create network-wide exposure.

How FirmAdapt Addresses This

FirmAdapt's architecture was built for exactly this kind of multi-entity compliance challenge. The platform allows an IPA to establish centralized AI governance policies, maintain a registry of approved tools with verified BAA status, and monitor data flows across shared infrastructure. Member practices can adopt AI tools within the approved framework without creating uncontrolled risk for the network.

For IPAs specifically, FirmAdapt provides the ability to enforce technical segmentation between member practice data environments while maintaining unified compliance reporting. When OCR comes asking questions, the IPA can demonstrate that it had a governance framework in place, that member practices were operating within defined boundaries, and that data flows were monitored. That documentation is the difference between a defensible position and a seven-figure settlement.

Ready to uncover operational inefficiencies and learn how to fix them with AI?
Try FirmAdapt free with 10 analysis credits. No credit card required.
Get Started Free