Hospital Mergers, Due Diligence, and the AI Audit Question Nobody Asks
I've been reading through a stack of healthcare M&A due diligence checklists lately, the kind that law firms and consultants circulate when a health system is acquiring a smaller physician practice or specialty group. They cover the expected ground: payer contracts, malpractice tail coverage, real estate leases, credentialing, Stark Law compliance, outstanding litigation. What they almost never cover is whether the target organization is using AI tools that touch protected health information.
This is a genuine gap, and it's going to start costing acquirers real money.
The Landscape Right Now
Healthcare M&A activity remains robust. Kaufman Hall reported 65 announced hospital and health system transactions in 2023, and physician practice acquisitions continue at a pace that's hard to track precisely because many are small enough to avoid antitrust scrutiny. When a 400-bed system acquires a 12-physician orthopedic group, the deal team is focused on revenue cycle integration, EHR migration, and whether the practice's referral patterns will survive the transition.
Meanwhile, that orthopedic group might be running AI-powered clinical decision support for imaging reads, using a chatbot for patient intake, feeding data into a third-party analytics platform for surgical outcome predictions, or letting front-desk staff paste patient questions into a general-purpose LLM. Nobody on the deal team asks about any of this, because the diligence questionnaire was last updated in 2019.
Why This Is a HIPAA Problem
Under HIPAA, when you acquire a covered entity, you inherit its compliance posture. If the target practice has been transmitting PHI to an AI vendor without a Business Associate Agreement in place, the acquiring system now owns that violation. The HHS Office for Civil Rights has been clear on this point. The HIPAA Privacy Rule permits a covered entity to disclose PHI to a business associate only after obtaining satisfactory assurances, in the form of a BAA, under 45 CFR 164.502(e), and 45 CFR 164.504(e) spells out what that agreement must contain. No BAA, no permissible disclosure. Period.
OCR's enforcement history shows they don't care much about the size of the entity that created the problem. In 2023, OCR settled with Yakima Valley Memorial Hospital for $240,000 over unauthorized access to PHI by security guards. The dollar amount was modest, but the investigation consumed years of institutional attention. Now scale that to a scenario where an acquired practice has been routing patient data through an AI tool with no BAA, no risk assessment, and no access controls, and you can see how the acquiring system could be looking at a much larger remediation burden.
There's also the question of the HIPAA Security Rule's risk analysis requirement at 45 CFR 164.308(a)(1). If the acquired entity never conducted a risk analysis that included its AI tools, the acquiring system inherits that deficiency too. OCR has made risk analysis failures the single most common finding in enforcement actions; it showed up in 22 of the 25 enforcement actions resolved in 2022 and 2023.
The AI-Specific Questions That Should Be on Every Healthcare Diligence Checklist
Here's what I'd want to see added to the standard questionnaire, broken into categories:
Inventory and Classification
- List all AI, machine learning, or automated decision-support tools currently in use, including any general-purpose tools like ChatGPT, Google Gemini, or Claude that staff may use in workflows involving patient information.
- For each tool, identify whether it processes, stores, or transmits PHI or data derived from PHI.
- Identify whether each tool is embedded in the EHR, integrated via API, or used as a standalone application.
Contractual and Compliance Status
- For each AI tool that touches PHI, provide the executed BAA.
- Provide documentation of the most recent HIPAA risk analysis that includes AI tool usage.
- Identify whether any AI vendor has access to PHI in a de-identified form, and if so, provide the de-identification methodology and confirm compliance with 45 CFR 164.514(b)(1) (expert determination) or 164.514(b)(2) (safe harbor).
- Confirm whether any AI vendor's terms of service permit the vendor to use input data for model training. This is a critical question. If a vendor is training on your patients' data, you may have an impermissible secondary-use problem: a business associate may use PHI only as its BAA permits, and the minimum necessary standard at 45 CFR 164.502(b) cuts against routine training use.
Governance and Oversight
- Describe any internal policies governing staff use of AI tools with patient data.
- Provide training records related to AI tool usage and PHI handling.
- Identify who within the organization has authority to approve new AI tool deployments, and whether that approval process includes privacy and security review.
Clinical and Liability Considerations
- For AI tools used in clinical decision-making, identify whether the tool has FDA clearance or falls under an enforcement discretion pathway.
- Describe any adverse events, near-misses, or patient complaints related to AI tool outputs.
- Identify whether malpractice coverage explicitly addresses AI-assisted clinical decisions.
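The questionnaire above is, in effect, a schema: each AI tool becomes one record with a handful of yes/no compliance attributes. A minimal Python sketch of that record, with field names of my own choosing (nothing here is a standard or a FirmAdapt API), shows how an acquirer might capture the inventory in a form that can be diffed, filtered, and attached to the deal file:

```python
from dataclasses import dataclass
from enum import Enum

class Integration(Enum):
    """How the tool is deployed, per the inventory questions above."""
    EHR_EMBEDDED = "embedded in EHR"
    API = "integrated via API"
    STANDALONE = "standalone application"

@dataclass
class AIToolRecord:
    """One row of the diligence inventory. Field names are illustrative."""
    name: str
    vendor: str
    touches_phi: bool                      # processes, stores, or transmits PHI
    integration: Integration
    baa_executed: bool = False             # executed BAA on file?
    in_risk_analysis: bool = False         # covered by latest HIPAA risk analysis?
    vendor_trains_on_inputs: bool = False  # per vendor terms of service

# Example: the front-desk chatbot scenario from earlier in this post.
chatbot = AIToolRecord(
    name="General-purpose LLM (front desk)",
    vendor="(unknown)",
    touches_phi=True,
    integration=Integration.STANDALONE,
)
```

Even this crude structure forces the right conversation: a record with `touches_phi=True` and `baa_executed=False` is a material gap by definition, not a judgment call.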
The Successor Liability Angle
This isn't purely a HIPAA question. State attorneys general have independent authority to enforce HIPAA violations under the HITECH Act, Section 13410(e), and they've used it. The Indiana AG's 2023 settlement with CarePointe for $45,000 over a data breach involved a relatively small entity, but the investigation process itself was disruptive. If an acquired practice's AI usage leads to a breach that surfaces post-closing, the acquiring system is the one holding the bag.
There's also the FTC's increasing interest in health data practices. The FTC's enforcement action against BetterHelp in 2023, resulting in a $7.8 million settlement, involved sharing health information with third parties for advertising. While BetterHelp wasn't a HIPAA-covered entity, the principle translates: regulators are paying attention to where health data flows, and AI tools are a new and largely unexamined vector for those flows.
What Acquirers Should Do Right Now
If you're on the buy side of a healthcare acquisition, add AI tool usage to your diligence workstream today. Don't wait for your standard questionnaire to be updated by committee. The practical steps are straightforward: request the AI tool inventory early in diligence, flag any tool that lacks a BAA as a material compliance gap, and build remediation costs into your deal model. If the target can't produce an inventory at all, treat that as a finding in itself, because it means there's no governance framework and you'll need to build one post-close.
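The triage logic in the paragraph above is mechanical enough to sketch directly. A hypothetical Python illustration (the dict keys and finding wording are my own, not a prescribed format) that flags PHI-touching tools without a BAA and treats an empty inventory as a finding in itself:

```python
def triage(tools):
    """Flag material compliance gaps in a target's AI tool inventory.

    `tools` is a list of dicts describing each AI tool. Returns a list
    of finding strings; a missing inventory is itself the first finding.
    """
    if not tools:
        return ["No AI tool inventory produced: no governance framework exists"]
    findings = []
    for t in tools:
        if t["touches_phi"] and not t.get("baa_executed"):
            findings.append(f"{t['name']}: PHI exposure with no BAA (material gap)")
        if t["touches_phi"] and not t.get("in_risk_analysis"):
            findings.append(f"{t['name']}: absent from latest HIPAA risk analysis")
    return findings

# Example: a target with one imaging decision-support tool and no paperwork.
inventory = [
    {"name": "Imaging CDS", "touches_phi": True,
     "baa_executed": False, "in_risk_analysis": False},
]
gaps = triage(inventory)   # two findings: no BAA, no risk analysis coverage
```

Each finding then maps to a remediation line item in the deal model, which is the practical point: the gap analysis stops being an abstraction and becomes a number.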
For targets, the advice is even simpler: get your house in order before you go to market. A clean AI compliance posture won't make or break a deal, but a messy one can delay closing, reduce your purchase price, or create indemnification obligations that eat into your proceeds.
How FirmAdapt Addresses This
FirmAdapt's platform is built to give organizations a defensible, auditable record of how AI tools interact with regulated data. For healthcare organizations on either side of a transaction, this means you can produce a clear inventory of AI usage, demonstrate that BAAs and access controls are in place, and show that risk analyses have been conducted and documented. The compliance-first architecture means these controls exist by default rather than being retrofitted after someone raises the question in diligence.
For acquirers specifically, FirmAdapt provides a framework for evaluating a target's AI compliance posture against HIPAA requirements and integrating the target's AI workflows into the acquiring system's governance structure post-close. If the diligence question is "show me how PHI flows through your AI tools," FirmAdapt makes that question answerable.