Tags: AI compliance, regulatory, defense, ITAR, CMMC, NIST 800-171

NIST SP 800-171 Rev 3 and the AI Tooling Question Nobody Asked the Authors

By Basel Ismail | May 10, 2026

NIST published SP 800-171 Revision 3 in May 2024, finalizing the update after a lengthy comment period that drew over 3,000 responses. The revision reorganized the security requirements, aligning them more tightly with SP 800-53 Rev 5, and consolidated Rev 2's 110 requirements in 14 families into 97 requirements across 17 families, adding Planning, System and Services Acquisition, and Supply Chain Risk Management. If you work in the defense industrial base and handle Controlled Unclassified Information, you already know this document is the backbone of your System Security Plan. What you may not have fully reckoned with is how poorly the framework addresses the AI tools your teams are almost certainly already using.

What Rev 3 Actually Changed

The big structural shift is the move away from the "basic" and "derived" requirement taxonomy from Rev 2. Requirements now map directly to SP 800-53 controls, which makes the traceability cleaner but also raises the bar. Several requirements that were previously implied are now explicit. The Access Control family, for example, got more specific about session management and account types. The Audit and Accountability family tightened requirements around audit record content and correlation. System and Communications Protection picked up additional requirements around boundary protection and cryptographic mechanisms.

NIST also introduced Organization-Defined Parameters, or ODPs, which give organizations flexibility but also create new documentation obligations. You now need to specify values for things like session lock timeouts, audit retention periods, and access attempt thresholds in your SSP. This is a meaningful change because CMMC Level 2 assessments under 32 CFR Part 170 (the final CMMC rule published in October 2024) are currently anchored to Rev 2 by a DoD class deviation, but once the program transitions to Rev 3, assessors will be evaluating against these ODPs. Vague answers will not pass.

What Rev 3 did not do, and this is the gap worth talking about, is address how AI tools interact with CUI in any systematic way. The phrase "artificial intelligence" does not appear in the document. Neither does "machine learning," "large language model," or "generative AI." The framework treats data processing as a function of systems and users, with controls designed around traditional IT architectures. That made sense in 2020, when Rev 2 was published. It is increasingly incomplete today.

The CUI Processing Gap

Consider a concrete scenario. A contracts manager at a mid-tier defense subcontractor uses an AI summarization tool to process a 200-page technical data package marked CUI. The tool is cloud-hosted. It ingests the document, processes it through a model, and returns a summary. Where did the data go during inference? Was it logged? Was it used for model training? Was it transmitted to a third-party API endpoint outside the authorization boundary defined in the SSP?

Under Rev 3, several requirement families are implicated here. SC (System and Communications Protection) requires protection of CUI at rest and in transit, including cryptographic protections. MP (Media Protection) governs how CUI is stored and sanitized. AU (Audit and Accountability) requires logging of events that could affect CUI confidentiality. AC (Access Control) requires that only authorized users and processes access CUI. SI (System and Information Integrity) requires monitoring for unauthorized changes and exfiltration.

The problem is that most SSPs were written to describe on-premises servers, VPNs, endpoint protection, and user access policies. They describe a world of known systems with defined boundaries. An AI tool, especially a SaaS-based one, introduces a processing layer that may sit entirely outside the described boundary. If your SSP does not explicitly account for how AI tools handle CUI, you have an undocumented data flow. And an undocumented data flow is, by definition, a finding under a CMMC assessment.

This Is Not Hypothetical

The Department of Defense has been clear about the seriousness of CUI protection failures. The False Claims Act, specifically 31 U.S.C. 3729-3733, has been used to pursue contractors who misrepresent their cybersecurity compliance. In 2022, Aerojet Rocketdyne settled a False Claims Act case for $9 million over allegations that it misrepresented its compliance with NIST 800-171 requirements. The DOJ's Civil Cyber-Fraud Initiative, launched in October 2021, was created specifically to go after this kind of gap. If your SSP says you protect CUI within a defined boundary, and your employees are feeding CUI into AI tools outside that boundary, you have a representation problem that carries real legal exposure.

The Cybersecurity Maturity Model Certification program compounds this. Under the final rule, Level 2 certification requires a third-party assessment by a C3PAO. Assessors will be looking at your SSP, your POA&Ms, and your actual implementation. They will ask how data flows through your environment. If someone on your team is using Copilot, ChatGPT, Claude, or any other AI assistant to process CUI, and that tool is not documented in your SSP with appropriate controls mapped, you will have a gap. The assessor is not going to ignore it because AI is new and everyone is figuring it out.

What You Should Be Doing Now

First, inventory your AI tool usage. This sounds basic, but most organizations have not done it rigorously. Shadow AI is the new shadow IT, and it is arguably more dangerous because the data exposure is less visible. You need to know which tools are being used, by whom, on what data, and through what infrastructure.
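A starting point for that inventory is flagging traffic to known AI service endpoints in your web proxy or DNS logs. The sketch below is a minimal illustration under two assumptions: a simplified `user host` log format, and a short illustrative domain list that is nowhere near a complete catalog of AI services.

```python
# Hypothetical sketch: flag AI-service traffic in proxy logs to seed a
# shadow-AI inventory. Domain list and log format are illustrative assumptions.
import re
from collections import defaultdict

AI_SERVICE_DOMAINS = {
    "chat.openai.com": "ChatGPT",
    "api.openai.com": "OpenAI API",
    "claude.ai": "Claude",
    "copilot.microsoft.com": "Microsoft Copilot",
    "gemini.google.com": "Gemini",
}

LOG_PATTERN = re.compile(r"^(?P<user>\S+)\s+(?P<host>\S+)$")  # "user host"

def inventory_ai_usage(log_lines):
    """Return {tool: sorted list of users} for requests to known AI domains."""
    usage = defaultdict(set)
    for line in log_lines:
        m = LOG_PATTERN.match(line.strip())
        if not m:
            continue
        tool = AI_SERVICE_DOMAINS.get(m.group("host"))
        if tool:
            usage[tool].add(m.group("user"))
    return {tool: sorted(users) for tool, users in usage.items()}

sample = [
    "jsmith chat.openai.com",
    "jsmith intranet.example.mil",
    "alee claude.ai",
    "alee chat.openai.com",
]
print(inventory_ai_usage(sample))
```

In practice you would point this at real proxy exports and a maintained domain feed; the point is that even a crude pass surfaces who is using what, which is the precondition for everything else in this list.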

  • Map AI tools to your authorization boundary. If a tool processes CUI, it needs to be inside your boundary or explicitly addressed as an external service with appropriate controls. This means updating your SSP, your network diagrams, and your data flow documentation.
  • Evaluate AI vendor compliance. Does the vendor meet FedRAMP Moderate baseline or equivalent? Do they offer a government or isolated tenant? What are their data retention and training data policies? If the vendor uses your input data for model improvement, that is a CUI spillage event.
  • Apply Rev 3 requirements explicitly to AI processing. Map each AI tool against the relevant control families. SC-8 (Transmission Confidentiality and Integrity), SC-28 (Protection of Information at Rest), AC-3 (Access Enforcement), AU-3 (Content of Audit Records), and SI-4 (System Monitoring) are good starting points.
  • Update your POA&M if you have gaps. Honest documentation of known gaps with remediation timelines is far better than an SSP that claims compliance you cannot demonstrate. Assessors and DOJ investigators both respond better to organizations that identified and are actively addressing issues.
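The tool-to-boundary-to-control mapping described above can live in a simple machine-readable register, which makes SSP gaps mechanically checkable rather than a matter of memory. A sketch, using hypothetical tool entries and a deliberately small required-control set drawn from the list above (SC-8, SC-28, AC-3, AU-3, SI-4):

```python
# Hypothetical sketch: an AI-tool register checked against a required control
# set. Tool names and mappings are illustrative, not a real SSP.
REQUIRED_CONTROLS = {"SC-8", "SC-28", "AC-3", "AU-3", "SI-4"}

ai_tools = [
    {"name": "InternalAIPlatform", "in_boundary": True,
     "mapped_controls": {"SC-8", "SC-28", "AC-3", "AU-3", "SI-4"}},
    {"name": "GenericSummarizerSaaS", "in_boundary": False,
     "mapped_controls": {"SC-8"}},
]

def ssp_gaps(tools):
    """List tools that are outside the boundary or missing control mappings."""
    gaps = []
    for t in tools:
        missing = REQUIRED_CONTROLS - t["mapped_controls"]
        if not t["in_boundary"] or missing:
            gaps.append((t["name"], sorted(missing), t["in_boundary"]))
    return gaps

for name, missing, in_boundary in ssp_gaps(ai_tools):
    print(f"{name}: in_boundary={in_boundary}, unmapped={missing}")
```

Every entry this flags is either an SSP update or a POA&M item; either way, it is documented before an assessor finds it.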

The Rev 3 ODPs actually give you a useful mechanism here. When you define your organization-specific parameters, you can include AI-specific values. For example, your session management ODP can address AI tool session behavior. Your audit logging ODP can specify what events AI tools must log. This is not a workaround; it is using the framework as designed, just applied to a technology the authors did not specifically contemplate.
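One way to keep those ODPs assessable is to record them in a machine-readable form with the AI-specific interpretation noted alongside each value. In the sketch below, the control identifiers are real, but every value and note is an illustrative example of what an organization might define for itself, not a NIST-mandated number.

```python
# Hypothetical sketch: ODP values recorded with AI-specific notes. Control IDs
# are real (800-53 style); the values and notes are illustrative assumptions.
odps = {
    "AC-11": {  # session / device lock
        "parameter": "period of inactivity before session lock",
        "value": "15 minutes",
        "ai_note": "applies equally to AI tool sessions; idle chats lock too",
    },
    "AC-7": {  # unsuccessful logon attempts
        "parameter": "consecutive invalid logon attempts",
        "value": "3 attempts in 15 minutes",
        "ai_note": "enforced by the SSO layer fronting any in-boundary AI tool",
    },
    "AU-11": {  # audit record retention
        "parameter": "audit record retention period",
        "value": "1 year",
        "ai_note": "AI prompt/response logs retained on the same schedule",
    },
}

for control, spec in sorted(odps.items()):
    print(f"{control}: {spec['value']} ({spec['ai_note']})")
```

Keeping the AI notes next to the values means the SSP answer to "what about your AI tools?" is already written down where the assessor will look for it.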

How FirmAdapt Addresses This

FirmAdapt was built to process sensitive data within compliance boundaries, not bolted onto a consumer AI platform after the fact. The architecture keeps CUI processing within defined, documentable authorization boundaries, which means when you update your SSP to account for AI tooling, you can actually describe what is happening, where the data goes, and what controls are in place. The platform generates the audit records and access logs that Rev 3 requires, mapped to the specific control families your assessor will evaluate.

For defense contractors preparing for CMMC Level 2 assessments, FirmAdapt provides a way to adopt AI tooling without creating the undocumented data flows that turn into findings. The compliance documentation is built into the product, not produced as an afterthought. If you are going to use AI on CUI, and your competitors certainly will, the question is whether you can do it in a way that survives a C3PAO assessment and a DOJ inquiry. FirmAdapt is designed to make that answer straightforward.

Ready to uncover operational inefficiencies and learn how to fix them with AI?
Try FirmAdapt free with 10 analysis credits. No credit card required.
Get Started Free