FirmAdapt
Tags: AI compliance, regulatory, healthcare, HIPAA, PHI

The Difference Between HIPAA-Compliant and HIPAA-Adjacent AI Tools

By Basel Ismail · May 2, 2026


I've been reviewing AI vendor security documentation for healthcare clients for the better part of two years now, and there's a pattern that keeps showing up. A vendor's marketing page says "HIPAA-compliant" in bold text, sometimes even in the hero banner. You dig into their terms of service, their BAA (if one exists), and their actual technical architecture, and what you find is something much less definitive. The product might be built on infrastructure that could support HIPAA compliance, but the product itself has never touched PHI in a way that's been independently validated, and the vendor has no intention of signing a Business Associate Agreement that covers their AI features.

This gap between marketing language and operational reality is worth understanding in detail, because the liability doesn't land on the vendor. It lands on you.

What HIPAA Actually Requires of AI Vendors

Let's be precise. Under the HIPAA Privacy Rule (45 CFR Part 160 and Part 164, Subparts A and E) and the Security Rule (Subpart C), any entity that creates, receives, maintains, or transmits PHI on behalf of a covered entity is a business associate. If an AI tool processes, stores, or has access to PHI in any form, the vendor operating that tool is a business associate and must sign a BAA. Full stop.

The BAA isn't a formality. It's a binding legal document that obligates the vendor to implement administrative, physical, and technical safeguards; report breaches within 60 days; and submit to the same enforcement framework that applies to covered entities. After the HITECH Act of 2009 and the 2013 Omnibus Rule, business associates became directly liable for HIPAA violations. OCR has enforced this aggressively. In 2024, the agency settled with a business associate, Inmediata Health Group, for $250,000 after a server misconfiguration exposed over 1.5 million individuals' ePHI. Business associates are not in a gray area. They are fully in scope.

The "HIPAA-Adjacent" Playbook

So what does a HIPAA-adjacent AI tool actually look like? Here are the most common patterns I see.

  • Infrastructure-level compliance claims. The vendor hosts on AWS GovCloud or Azure with a signed Microsoft BAA, and then tells you their product is HIPAA-compliant because the underlying cloud is. This is like saying your house is fireproof because the foundation is concrete. The application layer, the data flows, the model training pipeline, the logging and access controls within the software itself: none of that is covered by the cloud provider's BAA. AWS's BAA covers AWS services. It does not cover what a third-party developer builds on top of them.
  • BAA exclusions for AI features. Some vendors will sign a BAA for their core product but explicitly carve out AI or machine learning features. Read the BAA carefully. I've seen language that excludes "beta features," "AI-assisted functionality," or "third-party model integrations" from the scope of the agreement. If the AI feature is what you're buying, that exclusion is a problem.
  • De-identification hand-waving. A vendor tells you their AI tool doesn't handle PHI because the data is "de-identified" before it reaches the model. Under HIPAA, de-identification has a specific legal standard (45 CFR 164.514). You either use the Safe Harbor method, which requires removal of 18 specific identifier types, or the Expert Determination method, which requires a qualified statistical expert to certify that re-identification risk is very small. A vendor saying "we strip out names and dates" is not de-identification under the statute. If even one of those 18 identifiers persists, or if the data can be re-identified through combination with other available data, it's still PHI.
  • No audit trail for model interactions. HIPAA's Security Rule requires audit controls (45 CFR 164.312(b)). If an AI tool processes a query containing PHI, there needs to be a log of who submitted it, when, what data was involved, and what the system did with it. Many AI tools, particularly those built on top of large language models, don't maintain this kind of granular audit trail. The query goes in, the response comes out, and nothing in between is logged in a way that satisfies 164.312(b).
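
The audit-control requirement in that last bullet can be made concrete. Below is a minimal sketch, in Python, of what a per-query audit record might capture to answer the four questions 45 CFR 164.312(b) implies: who submitted the query, when, what data was involved, and what the system did with it. The class and field names here are illustrative assumptions, not a regulatory schema.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import hashlib
import json

@dataclass
class AIQueryAuditRecord:
    """Illustrative audit record for one AI query that may contain PHI.

    Field names are this sketch's own; they map to the questions
    45 CFR 164.312(b) implies, not to any official schema.
    """
    user_id: str          # who submitted the query
    timestamp: str        # when (UTC, ISO 8601)
    query_hash: str       # what data was involved (hash, not raw PHI)
    phi_categories: list  # identifier types detected in the input
    model_endpoint: str   # where inference happened
    action: str           # what the system did (e.g. "inference", "rejected")

def make_audit_record(user_id: str, query_text: str, phi_categories: list,
                      model_endpoint: str, action: str) -> dict:
    # Hash the query text so the audit log itself does not become
    # a second, unguarded store of PHI.
    digest = hashlib.sha256(query_text.encode("utf-8")).hexdigest()
    record = AIQueryAuditRecord(
        user_id=user_id,
        timestamp=datetime.now(timezone.utc).isoformat(),
        query_hash=digest,
        phi_categories=phi_categories,
        model_endpoint=model_endpoint,
        action=action,
    )
    return asdict(record)

record = make_audit_record(
    user_id="clinician-042",
    query_text="Summarize the discharge note for patient ...",
    phi_categories=["name", "medical_record_number"],
    model_endpoint="in-boundary-inference",
    action="inference",
)
print(json.dumps(record, indent=2))
```

The point of the hash is worth noting: if the log stored the raw query, the audit trail would itself become PHI and need the same safeguards as the system it monitors.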

How to Read Between the Lines

When you're evaluating an AI vendor for use with PHI, there are specific questions that separate genuine compliance from marketing.

Ask for the BAA first, not last

Request the BAA before the demo. If the sales team hesitates, redirects you to a "security overview" page, or says the BAA is "in development," you have your answer. A vendor that handles PHI routinely will have a BAA ready to send within hours.

Read the BAA scope carefully

Check whether the BAA covers all services you intend to use, including AI and ML features. Look for exclusion language. If the BAA references a "Covered Services" exhibit, get that exhibit and confirm the AI functionality is listed.

Ask where model inference happens

Does the query leave your environment? Is it sent to a third-party API (OpenAI, Anthropic, Google)? If so, does the vendor have a BAA with that third party? OpenAI's API terms, as of early 2025, do allow enterprise customers to negotiate BAAs for certain configurations, but the default API terms do not include one. The chain of custody matters. Every entity that touches PHI needs to be covered.

Request a SOC 2 Type II report

SOC 2 isn't HIPAA, but a Type II report covering the Security and Availability trust service criteria tells you whether the vendor has been independently audited on their controls over a sustained period (typically 6 to 12 months). If a vendor can't produce one, their compliance claims are self-attested, which means they're worth roughly what you paid for them.

Ask about data retention and model training

Does the vendor use your data to train or fine-tune their models? Under HIPAA's minimum necessary standard (45 CFR 164.502(b)), a business associate should only use PHI for the purposes specified in the BAA. Using PHI to improve a general-purpose model almost certainly exceeds that scope. This was a central issue in the 2024 Cerebral, Inc. enforcement action, in which the FTC charged the telehealth company with sharing sensitive health data with third-party platforms, including AI and analytics tools, without proper consent. The proposed order totaled $7.1 million.
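
The questions in the sections above can be collected into a lightweight due-diligence checklist. The sketch below is illustrative: the keys and evidence descriptions are this article's framing, not a regulatory artifact, but each entry pairs a question with the document that would count toward "satisfactory assurances" under 45 CFR 164.308(b)(1).

```python
# Each entry pairs an evaluation question from the sections above with
# the evidence to request from the vendor. Keys are illustrative.
VENDOR_AI_CHECKLIST = {
    "baa_covers_ai_features": "executed BAA plus Covered Services exhibit "
                              "listing the AI functionality",
    "inference_inside_baa_chain": "BAAs covering every third party that "
                                  "touches PHI during inference",
    "soc2_type_ii_on_file": "SOC 2 Type II report covering the Security "
                            "criteria over a 6-12 month window",
    "no_training_on_phi": "contract language barring use of PHI for model "
                          "training or fine-tuning",
    "audit_trail_per_query": "sample logs showing user, timestamp, data, "
                             "and action for each AI query",
}

def open_items(answers: dict) -> dict:
    """Return checklist items not yet answered 'yes', with the evidence
    still to be requested from the vendor."""
    return {key: VENDOR_AI_CHECKLIST[key]
            for key in VENDOR_AI_CHECKLIST
            if not answers.get(key, False)}

# Example: BAA scope and SOC 2 confirmed, three items still open.
print(open_items({"baa_covers_ai_features": True,
                  "soc2_type_ii_on_file": True}))
```

Anything still open when the contract is signed is, in effect, a risk the covered entity has accepted on the vendor's behalf.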

The Risk Is Asymmetric

Here's what makes this particularly frustrating. If a vendor misrepresents their HIPAA compliance and a breach occurs, OCR will investigate the covered entity as well as the business associate. The covered entity's obligation under 45 CFR 164.308(b)(1) is to obtain satisfactory assurances from its business associates. "We relied on their marketing" is not a satisfactory assurance. A signed BAA is. A vendor risk assessment is. Due diligence documentation is.

The penalties reflect this. OCR's enforcement actions in 2023 and 2024 have increasingly targeted covered entities for failures in business associate oversight. The average settlement in OCR's 2023 enforcement actions exceeded $1 million. And those are settlements; litigation costs and reputational damage come on top.

Where FirmAdapt Fits

FirmAdapt was built from the ground up to operate within regulated environments, not adapted after the fact. The platform's architecture ensures that PHI never leaves a covered entity's compliance boundary, model inference occurs within environments covered by executed BAAs, and every interaction is logged with the granularity required by 45 CFR 164.312(b). FirmAdapt signs BAAs as a standard part of onboarding, not as an afterthought.

For healthcare organizations evaluating AI tools, FirmAdapt provides the documentation, audit trails, and contractual commitments that make vendor risk assessments straightforward. The goal is to make compliance verifiable rather than something you have to take a vendor's word for.

Ready to uncover operational inefficiencies and learn how to fix them with AI?
Try FirmAdapt free with 10 analysis credits. No credit card required.
Get Started Free