Tags: AI compliance, regulatory, healthcare, HIPAA, PHI

Why Healthcare AI Pilots Keep Failing the Procurement Stage

By Basel Ismail · May 3, 2026

A pattern keeps repeating across health systems. An innovation team finds a promising AI tool, runs a pilot, gets clinicians excited, maybe even presents results to the C-suite. Then the vendor questionnaire lands on the desk of legal, compliance, or information security. And the whole thing stalls, sometimes permanently.

This is not a story about bad technology. Most of these tools work fine in a technical sense. The problem is that many AI vendors build for the demo, not for the procurement gauntlet. And in healthcare, the procurement gauntlet is where reality lives.

The Typical Arc of a Failed Pilot

It usually starts with a department head or clinical champion who sees a vendor presentation at HIMSS or HLTH. The tool looks great. Maybe it automates prior authorizations, summarizes clinical notes, or flags high-risk patients. A pilot gets approved, often with a limited dataset or a sandbox environment that sidesteps the harder compliance questions.

The pilot goes well enough to justify a broader rollout. At that point, the vendor gets handed a security questionnaire, a BAA redline, and a data governance review. And things fall apart in predictable ways.

Failure Mode 1: The BAA Negotiation Collapses

Under HIPAA, any vendor that creates, receives, maintains, or transmits protected health information on behalf of a covered entity is a business associate. 45 CFR 164.502(e) and 164.504(e) require a Business Associate Agreement before PHI changes hands. This is not optional, and it is not a formality.

Where AI vendors stumble is in the BAA's specifics. Health systems want to know exactly how PHI is processed, where it is stored, who has access, and what happens at contract termination. Many AI companies, especially startups, use third-party model providers (OpenAI, Anthropic, Google) as subprocessors. Each subprocessor relationship needs to be disclosed and contractually covered under 45 CFR 164.502(e)(1)(ii). When a vendor cannot clearly articulate its subprocessor chain, or when the underlying model provider refuses to sign a BAA, the deal dies.
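
One practical way to make that chain legible is to keep the disclosure as structured data a reviewer can check at a glance, rather than prose buried in a contract exhibit. A minimal sketch, with purely hypothetical vendor names and fields:

    # Illustrative sketch only: a machine-readable subprocessor disclosure.
    # Entries and field names are hypothetical examples, not any vendor's actual list.
    subprocessors = [
        {
            "name": "ExampleCloud Inc.",
            "function": "Hosting and storage of encrypted PHI",
            "phi_access": True,
            "baa_executed": True,
            "data_region": "us-east",
        },
        {
            "name": "ExampleModel Co.",
            "function": "LLM inference on de-identified text only",
            "phi_access": False,
            "baa_executed": False,  # tolerable only if PHI truly never reaches this party
            "data_region": "us-west",
        },
    ]

    # A vendor should be able to show that every PHI-touching subprocessor has a signed BAA.
    missing = [s["name"] for s in subprocessors if s["phi_access"] and not s["baa_executed"]]
    assert not missing, f"Subprocessors handling PHI without a BAA: {missing}"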

This became more visible after the HHS Office for Civil Rights issued its December 2022 bulletin on tracking technologies, which reminded covered entities that even metadata and analytics data can constitute PHI when linked to a patient's interaction with a health system. The bulletin triggered a wave of vendor reassessments, and many AI tools that had been operating in gray areas suddenly faced direct scrutiny.

Failure Mode 2: Data Use and Model Training Ambiguity

Health system attorneys have gotten very good at asking one question that kills deals: "Will any of our data be used to train or improve your models?"

If the answer is yes, or even "it depends," you have a problem. HIPAA's minimum necessary standard (45 CFR 164.502(b)) requires that PHI use be limited to what is necessary for the specific purpose. Using patient data to improve a commercial model that benefits other customers is a hard sell under that standard, and arguably impermissible without explicit authorization.
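
The same standard also shapes how a careful integration passes data to a model in the first place: only the fields the stated purpose requires should ever leave the record. A minimal sketch, assuming hypothetical purposes and field names rather than any real EHR schema:

    # Illustrative sketch only: a "minimum necessary" filter applied before any model call.
    # The purposes and field names are hypothetical, not tied to a real EHR schema.
    ALLOWED_FIELDS_BY_PURPOSE = {
        "note_summarization": {"encounter_notes", "diagnoses", "medications"},
        "prior_authorization": {"diagnoses", "procedure_codes", "payer_id"},
    }

    def minimum_necessary(record: dict, purpose: str) -> dict:
        """Return only the fields the stated purpose requires; everything else stays behind."""
        allowed = ALLOWED_FIELDS_BY_PURPOSE[purpose]
        return {field: value for field, value in record.items() if field in allowed}

    # Name, MRN, address, and other identifiers never leave the record for this purpose.
    payload = minimum_necessary(
        {"name": "Jane Doe", "mrn": "12345", "diagnoses": ["E11.9"], "encounter_notes": "..."},
        purpose="note_summarization",
    )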

The FTC has also weighed in here. In its 2023 enforcement action against BetterHelp ($7.8 million settlement), the Commission made clear that using health-related data for purposes beyond what consumers were told about violates Section 5 of the FTC Act. While BetterHelp was not a traditional covered entity, the signal to the market was unmistakable: regulators are watching how AI companies handle health data for secondary purposes.

Vendors that cannot contractually guarantee data isolation, with clear language that customer data will not be used for model training, refinement, or benchmarking, routinely fail procurement review.

Failure Mode 3: The Security Posture Does Not Hold Up

HIPAA's Security Rule (45 CFR Part 164, Subparts A and C) requires administrative, physical, and technical safeguards for ePHI. Health systems typically operationalize this through frameworks like HITRUST CSF or SOC 2 Type II. When a vendor cannot produce a current HITRUST certification or SOC 2 Type II report, the security team has no efficient way to validate compliance. Manual assessment is expensive and slow, so many procurement offices simply move on.

The numbers here are worth noting. The average cost of a healthcare data breach reached $10.93 million in 2023, according to IBM's Cost of a Data Breach Report. That figure has led risk committees to set increasingly high bars for vendor security. A promising AI tool without HITRUST r2 certification or an equivalent is fighting uphill from the start.

Failure Mode 4: No Clear Audit Trail

Healthcare organizations face audit risk from OCR, CMS, state attorneys general, and accreditation bodies. When an AI system makes or influences clinical or operational decisions, the organization needs to demonstrate what data went in, what logic was applied, and what output was produced. This is both a HIPAA requirement (the Security Rule's audit controls standard at 45 CFR 164.312(b)) and a practical necessity for defending clinical decisions.

Many AI vendors treat logging as an afterthought. They can tell you what the model output was, but not what specific data inputs drove that output, or which version of the model was running at the time. For a health system's compliance team, that gap is disqualifying.
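
Closing that gap does not require anything exotic. A minimal sketch of the kind of record that answers those questions, using hypothetical field names rather than any standard schema:

    # Illustrative sketch only: one append-only audit record per model call.
    # The AuditRecord fields are a hypothetical schema, not a standard or any vendor's format.
    import hashlib
    import json
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone

    @dataclass
    class AuditRecord:
        timestamp: str       # when the inference ran (UTC)
        user_id: str         # who initiated the request
        input_hash: str      # SHA-256 of the exact input payload, so PHI is not copied into the log
        model_version: str   # which model build produced the output
        output_hash: str     # SHA-256 of the returned output

    def log_inference(user_id: str, input_payload: str, model_version: str, output: str,
                      log_path: str = "audit_log.jsonl") -> None:
        record = AuditRecord(
            timestamp=datetime.now(timezone.utc).isoformat(),
            user_id=user_id,
            input_hash=hashlib.sha256(input_payload.encode()).hexdigest(),
            model_version=model_version,
            output_hash=hashlib.sha256(output.encode()).hexdigest(),
        )
        # One JSON object per line, append-only, exportable to the customer's compliance tooling.
        with open(log_path, "a") as f:
            f.write(json.dumps(asdict(record)) + "\n")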

What a Procurement-Friendly AI Vendor Looks Like

The vendors that actually make it through healthcare procurement tend to share a few characteristics:

  • Clean subprocessor disclosure. Every entity that touches PHI is identified, with corresponding BAA coverage. No surprises during legal review.
  • Contractual data isolation. Explicit, non-negotiable language that customer data is not used for model training, improvement, or any secondary purpose. This is table stakes.
  • Current security certifications. HITRUST r2, SOC 2 Type II, or both. Ideally with a bridge letter if the certification cycle does not align with the procurement timeline.
  • Granular audit logging. Full traceability of inputs, outputs, model versions, and user actions. Exportable in formats that integrate with the health system's existing compliance infrastructure.
  • Deployment flexibility. The option to run within the customer's cloud environment or on-premises, so PHI never leaves the organization's control boundary. This is increasingly a hard requirement for large health systems and academic medical centers.
  • Incident response specificity. Not just a generic breach notification clause, but a detailed incident response plan with defined timelines that meet or exceed the HIPAA Breach Notification Rule's 60-day window (45 CFR 164.404) and any applicable state laws that impose shorter deadlines.

None of this is exotic. It is just thorough. The gap between AI vendors that clear procurement and those that do not is almost never about the quality of the underlying model. It is about whether the vendor built its operations, contracts, and infrastructure with the assumption that a sophisticated buyer would actually read the fine print.

How FirmAdapt Addresses This

FirmAdapt was built with healthcare procurement requirements as a design constraint, not a retrofit. The platform supports deployment within customer-controlled environments, maintains strict data isolation with contractual guarantees against secondary use, and provides granular audit logging that maps to HIPAA Security Rule requirements and HITRUST CSF controls. Subprocessor relationships are fully documented and covered under BAAs before any PHI is processed.

For compliance officers and general counsel evaluating AI platforms, FirmAdapt's architecture is designed to answer the hard questions before they get asked. The goal is to make the vendor questionnaire a confirmation exercise, not a discovery process.
