Tags: AI compliance, regulatory, healthcare, HIPAA, PHI

Medical Device Manufacturers, FDA AI Guidance, and the HIPAA Overlay

By Basel Ismail · May 3, 2026


FDA has been steadily building its regulatory framework for AI and ML-enabled medical devices since the 2021 action plan, and the predetermined change control plan (PCCP) concept is now central to how manufacturers think about iterative algorithm updates. But there is a layer of complexity that does not get enough attention in most discussions: what happens when these adaptive devices are also processing protected health information under HIPAA. The intersection creates compliance obligations that are genuinely tricky, and getting them wrong exposes manufacturers to enforcement from two federal agencies simultaneously.

The FDA AI/ML Framework in Brief

FDA's approach to AI/ML-enabled Software as a Medical Device (SaMD) has evolved through several key documents. The January 2021 action plan laid out five pillars, and the March 2023 draft guidance on predetermined change control plans gave manufacturers the first concrete framework for obtaining premarket authorization for algorithms designed to change over time. The final guidance on PCCPs landed in December 2024, and it clarified what FDA expects in describing both the what and the how of planned modifications.

A PCCP has two core components: a description of specific planned modifications (the SaMD Pre-Specifications, or SPS) and a defined methodology for implementing those changes in a controlled manner (the Algorithm Change Protocol, or ACP). If your 510(k), De Novo, or PMA submission includes an adequate PCCP, you can implement the described modifications without submitting a new premarket submission each time. This is a significant efficiency gain for manufacturers building devices that learn from real-world data.
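The SPS/ACP split can be made concrete as a small data model. This is purely an illustrative sketch, not an FDA submission format; every class and field name below is an assumption introduced for this example.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PlannedModification:
    """One SPS entry: a specific change the manufacturer pre-specifies."""
    description: str        # e.g. "retrain lesion classifier on new site data"
    performance_claim: str  # the labeling claim the change must preserve

@dataclass
class AlgorithmChangeProtocol:
    """ACP: the controlled methodology for implementing SPS changes."""
    data_sources: list
    retraining_method: str
    validation_plan: str
    acceptance_criteria: str

@dataclass
class PCCP:
    sps: list = field(default_factory=list)
    acp: Optional[AlgorithmChangeProtocol] = None

    def covers(self, change_description: str) -> bool:
        """A change outside the SPS requires a new premarket submission."""
        return any(change_description.lower() in m.description.lower()
                   for m in self.sps)
```

The useful property of writing it down this way is the `covers` check: any modification not matched by a pre-specified SPS entry falls outside the PCCP and triggers a fresh submission.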

As of October 2024, FDA had authorized over 950 AI/ML-enabled devices, up from 521 in 2023. The pace is accelerating. Radiology dominates (around 75% of cleared devices), but cardiology, pathology, and ophthalmology are growing fast. Many of these devices, particularly those that improve their performance based on patient data collected during clinical use, are exactly the ones where the HIPAA overlay becomes relevant.

Where HIPAA Enters the Picture

Here is the core issue. When an AI-enabled medical device processes, stores, or transmits PHI, the manufacturer's regulatory posture under HIPAA depends on its role in the data flow. If the manufacturer is acting as a business associate of a covered entity (a hospital, a health plan, a clearinghouse), then the full weight of the HIPAA Privacy Rule, Security Rule, and Breach Notification Rule applies. And most manufacturers of SaMD that ingests patient data to improve its algorithms are business associates, whether or not they have fully internalized that fact.

The 2013 Omnibus Rule made this explicit by expanding the definition of business associate to include subcontractors and entities that create, receive, maintain, or transmit PHI on behalf of a covered entity. A manufacturer whose device sits in a hospital's radiology workflow and sends imaging data back for model retraining is squarely within this definition. The business associate agreement (BAA) is not optional; it is a statutory requirement under 45 CFR 164.502(e).

The PCCP and PHI Retraining Tension

This is where it gets interesting. The whole point of a PCCP is to allow your algorithm to evolve. The ACP describes how you will retrain or update the model, including what data you will use and how you will validate changes. But if the retraining data includes PHI, every step in that pipeline has HIPAA implications.

  • Minimum necessary standard. Under 45 CFR 164.502(b), a business associate must limit its use of PHI to the minimum necessary to accomplish the intended purpose. If your ACP calls for retraining on a broad dataset of patient imaging, you need to demonstrate that the scope of data accessed is genuinely necessary for the algorithmic improvement described in your SPS.
  • De-identification requirements. If you want to avoid HIPAA constraints on your retraining data, you can de-identify it under the Safe Harbor method (45 CFR 164.514(b)) or the Expert Determination method. But de-identification that strips too much clinical context can degrade model performance, creating a direct tension with your FDA validation requirements.
  • Security Rule controls for data in transit. Retraining pipelines that move PHI from clinical sites to manufacturer infrastructure must comply with the administrative, physical, and technical safeguards in 45 CFR 164.308, 164.310, and 164.312. Encryption in transit and at rest is an addressable specification rather than a required one, but addressable means you must implement it or document why a reasonable alternative suffices, and try explaining to OCR after a breach why you chose not to encrypt PHI flowing through a cloud-based ML training pipeline.
  • Breach notification for model inversion or data leakage. ML models can be vulnerable to model inversion attacks and membership inference attacks that effectively reconstruct or confirm the presence of training data. If your retrained model inadvertently leaks PHI, you may have a reportable breach under 45 CFR 164.404. OCR has not yet brought an enforcement action specifically on this theory, but the legal framework supports it.
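The de-identification point above can be sketched in code. This is a minimal Safe Harbor-style field-stripping example; the `SAFE_HARBOR_FIELDS` set is an illustrative subset of the 18 identifier categories in 45 CFR 164.514(b)(2), and a real pipeline would need all 18 plus free-text and image scrubbing.

```python
# Illustrative subset of Safe Harbor identifier categories; NOT the full
# list of 18 required by 45 CFR 164.514(b)(2).
SAFE_HARBOR_FIELDS = {
    "name", "street_address", "phone", "email", "ssn", "mrn",
    "device_serial", "ip_address", "biometric_id", "full_face_photo",
}

def deidentify(record: dict) -> dict:
    """Strip direct identifiers and generalize quasi-identifiers."""
    out = {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}
    # Dates must be reduced to year only under Safe Harbor.
    if "birth_date" in out:
        out["birth_year"] = out.pop("birth_date")[:4]
    # Ages over 89 must be aggregated into a single category.
    if out.get("age", 0) > 89:
        out["age"] = "90+"
    # Zip codes truncated to the first three digits (removed entirely
    # for sparsely populated zip prefixes).
    if "zip" in out:
        out["zip"] = out["zip"][:3]
    return out
```

Note how directly this illustrates the tension described above: fields like device serial numbers and full dates are exactly the clinical context some imaging models rely on, so what Safe Harbor removes, your FDA validation may miss.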

Dual Documentation Burden

FDA expects your PCCP to describe your change control methodology with enough specificity that the agency can evaluate it during premarket review. HIPAA expects you to maintain policies and procedures, conduct risk assessments (45 CFR 164.308(a)(1)(ii)(A)), and document your safeguards. These two documentation regimes are not naturally aligned, and most manufacturers maintain them in separate silos, which creates gaps.

For example, your PCCP might describe a retraining protocol that uses "real-world performance data collected from deployed devices." FDA reviewers will evaluate whether your validation methodology is sound. But that same description, read through a HIPAA lens, raises questions about consent, data use limitations in your BAA, minimum necessary compliance, and whether your risk assessment accounts for the specific threats introduced by the retraining pipeline. If your quality/regulatory team and your privacy/security team are not coordinating on this, you have a problem you may not discover until an audit or a breach.
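One way to surface this kind of silo gap is a simple cross-walk check between the data flows your PCCP describes and the HIPAA controls your privacy program documents. The sketch below is hypothetical; all step names and control mappings are invented for illustration.

```python
# Hypothetical PCCP data-flow steps (the kind of thing an ACP describes).
PCCP_DATA_FLOWS = {
    "collect_rwd_from_device": "imaging + metadata leaves the clinical site",
    "stage_in_training_bucket": "PHI at rest in manufacturer cloud storage",
    "retrain_model": "PHI used as training input",
}

# Hypothetical HIPAA control mapping maintained by the privacy team.
HIPAA_CONTROL_MAP = {
    "collect_rwd_from_device": ["164.312(e) transmission security",
                                "BAA permitted-use clause"],
    "stage_in_training_bucket": ["164.312(a) access control",
                                 "164.312(c) integrity controls"],
    # "retrain_model" is deliberately unmapped here: this is exactly the
    # gap a siloed quality/privacy split tends to leave behind.
}

def unmapped_flows(flows: dict, controls: dict) -> list:
    """Return PCCP data-flow steps with no documented HIPAA control."""
    return [step for step in flows if not controls.get(step)]
```

Even a crude check like this turns "are the teams coordinating?" into a reviewable artifact: any step the ACP names that the privacy program cannot map to a safeguard is a finding waiting to happen.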

Enforcement Reality

OCR's enforcement budget has fluctuated, but the agency collected $4.2 million in HIPAA penalties in 2023 and has shown increasing interest in technology-related violations. The Pixel tracking enforcement wave in 2022 and 2023, which targeted hospitals using Meta Pixel and similar tools that transmitted PHI to third parties, demonstrated OCR's willingness to pursue novel data flow scenarios. An AI retraining pipeline that moves PHI to a manufacturer's cloud environment is not conceptually different from a tracking pixel that sends patient data to an ad platform; the mechanism differs, but the regulatory analysis is the same.

On the FDA side, the agency has been relatively collaborative with manufacturers on PCCP submissions so far, but the December 2024 final guidance tightened expectations around ACP specificity. Vague descriptions of retraining methodology are less likely to be accepted. This actually helps with HIPAA compliance, because a more specific ACP forces you to think through exactly what data you need and how it flows, which maps directly onto the minimum necessary analysis.

Practical Recommendations

  • Integrate your PCCP and HIPAA risk assessment. Your Algorithm Change Protocol should be developed in consultation with your privacy officer, not just your regulatory affairs team. The retraining data pipeline is both a quality system concern and a HIPAA concern.
  • Pressure-test your de-identification strategy. If you plan to de-identify retraining data to avoid HIPAA constraints, validate that the de-identified dataset still supports the model performance claims in your FDA submission. Document this analysis.
  • Map your BAA obligations to your SPS. Your business associate agreements with covered entities should explicitly address algorithmic retraining as a permitted use. If your BAA only covers "treatment, payment, and healthcare operations" in generic terms, it may not clearly authorize the data use your PCCP contemplates.
  • Build breach detection into your ML pipeline. Standard HIPAA breach detection focuses on unauthorized access and exfiltration. For ML-enabled devices, you also need monitoring for model inversion vulnerabilities and unintended memorization of training data.
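The last recommendation can be made concrete with a basic memorization signal. A model that has memorized training records typically shows much lower loss on them than on comparable holdout records; the gap is one crude proxy for membership-inference exposure. This is a simplified sketch, and the threshold value is an assumption to be calibrated, not a standard.

```python
import statistics

def membership_gap(train_losses, holdout_losses):
    """Mean per-record loss gap between holdout and training records.

    A large positive gap suggests the model fits its training records
    far better than unseen ones, a warning sign for memorization and
    membership-inference exposure of PHI in the training set.
    """
    return statistics.mean(holdout_losses) - statistics.mean(train_losses)

def flag_memorization(train_losses, holdout_losses, threshold=0.5):
    # Threshold is illustrative; calibrate it against a reference model
    # trained on fully de-identified data, and pair this check with
    # explicit canary-record exposure tests before each PCCP-covered
    # model release.
    return membership_gap(train_losses, holdout_losses) > threshold
```

In practice this would run as a release gate in the retraining pipeline, alongside the validation criteria the ACP already requires, so the privacy check and the FDA performance check fire on the same artifact.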

How FirmAdapt Addresses This

FirmAdapt's platform is built to handle exactly this kind of multi-framework overlap. For medical device manufacturers operating under both FDA and HIPAA, FirmAdapt maps compliance controls across both regimes simultaneously, so your PCCP documentation and your HIPAA risk assessment draw from a unified data flow model rather than duplicative, siloed processes. The platform flags conflicts between your planned algorithmic changes and your BAA obligations before they become audit findings.

Because FirmAdapt's architecture is compliance-first, PHI handling controls are embedded at the infrastructure level rather than bolted on after the fact. This means retraining pipelines, data access logs, and minimum necessary analyses are tracked within the same system that manages your FDA quality documentation. For manufacturers scaling their AI/ML device portfolios, this integrated approach reduces the coordination burden between regulatory, privacy, and engineering teams considerably.

Ready to uncover operational inefficiencies and learn how to fix them with AI?
Try FirmAdapt free with 10 analysis credits. No credit card required.
Get Started Free