The HIPAA BAA Gap: Why ChatGPT Plus, Team, and Self-Serve Enterprise Will Get You Fined
OpenAI's own documentation says it clearly, but you have to know where to look. The company will sign a Business Associate Agreement only for sales-managed Enterprise and Edu accounts. Not ChatGPT Plus. Not ChatGPT Team. Not even the self-serve version of ChatGPT Enterprise that you can spin up with a credit card. If you are a covered entity or a business associate under HIPAA and you are feeding protected health information into any of those products, you do not have a valid BAA in place. And under 45 CFR 164.502(e), that alone is a violation.
I keep seeing health systems, group practices, and healthtech startups assume they are covered because they are paying for an "Enterprise" tier. The naming is genuinely confusing, and I think OpenAI could do more to clarify it. But the compliance burden is on you, not on them.
Which OpenAI Products Qualify for a BAA (and Which Do Not)
Let me lay this out plainly:
- ChatGPT Free: No BAA. No data processing agreement relevant to HIPAA. OpenAI may use your inputs for model training unless you opt out. Absolutely not appropriate for PHI.
- ChatGPT Plus ($20/month): No BAA available. OpenAI's terms of service govern. You can disable training on your data, but that is not the same as HIPAA compliance. There is no access control, no audit logging suitable for 45 CFR 164.312, and no BAA.
- ChatGPT Team ($25-30/user/month): No BAA. OpenAI states that business data is not used for training, and you get a workspace with admin controls. But OpenAI does not offer a BAA for Team accounts. Period.
- ChatGPT Enterprise (self-serve): This is the one that trips people up. OpenAI introduced a self-serve Enterprise option in 2024 that lets you sign up without going through a sales process. It includes SSO, admin console features, and data encryption at rest. It looks like the real thing. But OpenAI's BAA is only available through the sales-managed Enterprise contract. If you did not negotiate directly with an OpenAI sales representative and execute a BAA as part of that process, you do not have one.
- ChatGPT Enterprise (sales-managed): BAA available. This requires direct engagement with OpenAI's sales team, a negotiated contract, and explicit execution of the BAA. OpenAI confirms this in their trust documentation.
- ChatGPT Edu: BAA available through institutional agreements, again sales-managed.
- OpenAI API (Platform): OpenAI updated its Data Processing Addendum in early 2024 and now supports BAAs for API customers, but again, this requires a specific agreement. Using the API with a standard pay-as-you-go account does not automatically include a BAA. You need to request and execute one.
What Covered Entities Get Wrong
The most common mistake is conflating "enterprise-grade security features" with "HIPAA compliance." They are related but not equivalent. A product can offer AES-256 encryption, SOC 2 Type II certification, and role-based access controls while still failing to meet HIPAA requirements, because HIPAA compliance is not purely a technical standard. It is a contractual and administrative framework. Without a signed BAA, there is no contractual obligation for OpenAI to report breaches to you within the timeframes required by 45 CFR 164.410, no obligation to limit uses and disclosures of PHI, and no obligation to make records available to HHS during an investigation.
The second mistake is assuming that de-identification solves the problem. Some compliance teams tell clinicians to strip out names and dates before pasting notes into ChatGPT. In theory, if data is de-identified per the Safe Harbor method under 45 CFR 164.514(b), it is no longer PHI and HIPAA does not apply. In practice, clinical notes are extraordinarily difficult to fully de-identify. A combination of diagnosis, treatment plan, age range, and geographic region can re-identify a patient, especially in rural areas or for rare conditions. If your de-identification process is not airtight, you are transmitting PHI to a vendor without a BAA.
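The re-identification risk above can be made concrete with a toy example. The records below are invented for illustration: names and exact dates have been stripped, yet the remaining quasi-identifiers still single out one patient. The `k_anonymity` helper is a hypothetical sketch, not a compliance tool.

```python
# Toy illustration (fabricated data): even after names and dates are removed,
# a handful of quasi-identifiers can single out a patient.
from collections import Counter

# "De-identified" records: name and exact dates stripped, but diagnosis,
# age band, and region remain.
records = [
    {"diagnosis": "type 2 diabetes", "age_band": "40-49", "region": "urban county A"},
    {"diagnosis": "type 2 diabetes", "age_band": "40-49", "region": "urban county A"},
    {"diagnosis": "Huntington disease", "age_band": "30-39", "region": "rural county B"},
]

def k_anonymity(rows, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns.
    k == 1 means at least one record is unique, i.e. re-identifiable."""
    counts = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(counts.values())

k = k_anonymity(records, ["diagnosis", "age_band", "region"])
print(k)  # 1: the rare-disease patient in the rural county is unique
```

A k of 1 is exactly the failure mode described above: the record is technically scrubbed of direct identifiers but still maps to one person, which means it was never de-identified under Safe Harbor in the first place.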
Third, people underestimate enforcement. HHS OCR has been increasingly aggressive. The December 2023 settlement with Lafourche Medical Group for $480,000 stemmed from a phishing breach, but OCR's investigation also flagged the absence of a proper risk analysis and missing BAAs with vendors. OCR's December 2022 bulletin on online tracking technologies, updated in March 2024, reinforced the same principle: any technology receiving PHI needs a BAA. The penalty schedule under the HITECH Act, as amended, runs up to $2,067,813 per violation category per year (the inflation-adjusted figure as of 2024). And state attorneys general can bring separate actions under their own health privacy laws.
The API Is Not Automatically Safe Either
A lot of healthtech companies build internal tools on the OpenAI API and assume the API's default terms are sufficient. They are not. The standard API terms of service include a Data Processing Addendum, but the DPA alone does not constitute a BAA. OpenAI's trust page specifies that customers who need a BAA for API usage must contact them to execute one. If your engineering team spun up an API integration and your legal team never followed up with OpenAI on the BAA, you have a gap. This is especially common at startups where the engineering and compliance functions do not talk to each other frequently enough.
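One way to close that engineering/compliance gap is to make the BAA status an explicit, legal-team-owned flag in the code path, so PHI physically cannot leave the building toward an uncovered endpoint. The sketch below is hypothetical (the class and function names are illustrative, not any vendor's API):

```python
# Hypothetical guard (illustrative names, not an OpenAI API): refuse to send
# anything flagged as PHI unless the deployment is explicitly marked as
# operating under an executed BAA.
from dataclasses import dataclass

@dataclass(frozen=True)
class AIProviderConfig:
    base_url: str
    baa_executed: bool  # flip to True only after legal confirms a signed BAA

class MissingBAAError(RuntimeError):
    pass

def send_to_ai(payload: str, contains_phi: bool, config: AIProviderConfig) -> str:
    """Gate outbound AI calls: PHI may only flow to BAA-covered endpoints."""
    if contains_phi and not config.baa_executed:
        raise MissingBAAError(
            f"Refusing to send PHI to {config.base_url}: no executed BAA on file."
        )
    # ... the actual HTTP call to the provider would go here ...
    return "sent"
```

The point of the design is that the default is refusal: an engineer who wires up a new endpoint without involving compliance gets a loud runtime error instead of a silent HIPAA violation.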
What About Microsoft Azure OpenAI?
Worth noting: Microsoft offers Azure OpenAI Service, which runs OpenAI models within Azure's infrastructure, and Microsoft includes Azure OpenAI Service within the scope of its HIPAA BAA for Azure customers. If you already have a Microsoft BAA in place, this is a more straightforward path to using GPT-4 class models with PHI. The models are the same; the compliance wrapper is different. Many health systems already have Microsoft enterprise agreements that include the BAA, which makes Azure OpenAI the path of least resistance for HIPAA-regulated use cases.
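Mechanically, switching an integration from api.openai.com to Azure OpenAI mostly means changing the endpoint and auth header. The sketch below builds (but does not send) a request following Azure OpenAI's REST conventions; the resource name, deployment name, and api-version value are placeholders you would replace with your own:

```python
# Sketch of pointing an existing integration at Azure OpenAI instead of
# api.openai.com. Resource name, deployment, and api-version are placeholders;
# the URL shape and `api-key` header follow Azure OpenAI's REST conventions.
import json
import urllib.request

def build_azure_request(resource: str, deployment: str, api_version: str,
                        api_key: str, messages: list) -> urllib.request.Request:
    url = (f"https://{resource}.openai.azure.com/openai/deployments/"
           f"{deployment}/chat/completions?api-version={api_version}")
    body = json.dumps({"messages": messages}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = build_azure_request("my-resource", "gpt-4o", "2024-02-01", "<key>",
                          [{"role": "user", "content": "hello"}])
print(req.full_url)
```

Swapping the compliance wrapper without touching prompts or model behavior is exactly why this route appeals to teams that already have a Microsoft BAA.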
What a Proper BAA Posture Looks Like
A valid BAA should, at minimum, address the required elements in 45 CFR 164.504(e)(2): permitted uses and disclosures, safeguards the business associate will implement, breach notification obligations, requirements to return or destroy PHI at termination, and the obligation to ensure any subcontractors agree to the same restrictions. You should also confirm that the vendor's BAA covers the specific product or service you are using. A BAA that covers "Azure services" does not necessarily cover a different product from the same parent company. Read the scope clauses carefully.
How FirmAdapt Handles This
FirmAdapt was built with the assumption that every customer in healthcare needs a BAA before any data touches the platform. We execute BAAs as a standard part of onboarding for all healthcare customers, not as an add-on or a sales-tier unlock. Our architecture routes AI processing through infrastructure where we maintain direct contractual and technical control over data handling, so there is no ambiguity about which vendor is responsible for what. PHI is never sent to a third-party AI provider without a BAA chain that covers every link.
We also maintain audit logs that map to the access controls and activity tracking requirements in the HIPAA Security Rule, specifically 45 CFR 164.312(b) and 164.312(d). If OCR comes knocking, you can produce documentation showing exactly what data was processed, by whom, when, and under what authorization. The compliance layer is not bolted on after the fact; it is the foundation the product is built on.
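As a minimal sketch of the kind of record 45 CFR 164.312(b) contemplates (this is an illustration of the pattern, not FirmAdapt's actual implementation): each entry captures who, what, when, and under what authorization, and is hash-chained to the previous entry so after-the-fact tampering is detectable.

```python
# Hash-chained audit log sketch: each entry commits to its predecessor's hash,
# so altering any historical entry breaks verification of the whole chain.
import hashlib
import json

def append_entry(log: list, actor: str, action: str, resource: str,
                 authorization: str, timestamp: str) -> None:
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "actor": actor, "action": action, "resource": resource,
        "authorization": authorization, "timestamp": timestamp,
        "prev_hash": prev_hash,
    }
    # Hash the entry body (everything except the hash itself).
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)

def verify_chain(log: list) -> bool:
    """Recompute every hash and check each entry points at its predecessor."""
    prev = "0" * 64
    for e in log:
        body = {k: v for k, v in e.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if e["prev_hash"] != prev or e["hash"] != expected:
            return False
        prev = e["hash"]
    return True
```

A structure like this is what lets you hand OCR a verifiable trail rather than a mutable database table: any retroactive edit to an entry invalidates every hash that follows it.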