DFARS 252.204-7012 and the AI Tool Inventory Question
DFARS 252.204-7012, "Safeguarding Covered Defense Information and Cyber Incident Reporting," has been a fixture in defense contracts since its final rule in October 2016. Most contractors know the basics: protect covered defense information (CDI) on your systems, implement the 110 security controls in NIST SP 800-171, and report cyber incidents to the DoD within 72 hours. What fewer contractors have reckoned with is how AI tools interact with these requirements, and whether your current compliance program even knows which AI tools are touching CDI.
A Quick Refresher on What 7012 Actually Requires
The clause flows down to any contractor or subcontractor whose information systems process, store, or transmit CDI. CDI includes controlled technical information, export-controlled data, and other categories marked or identified in the contract. The operative requirement is "adequate security," defined by reference to NIST SP 800-171 Rev 2 (with Rev 3 finalized in May 2024, though the transition timeline under CMMC 2.0 is still playing out).
Three obligations matter here:
- Adequate security for CDI. You need all 110 controls from NIST SP 800-171 implemented, or documented in a Plan of Action and Milestones (POA&M) with a System Security Plan (SSP) that accounts for any gaps.
- Cyber incident reporting. Within 72 hours of discovery, report to the DoD via the DIBNet portal. Preserve images of affected systems and relevant monitoring data for 90 days.
- Flow-down. The clause must be included in subcontracts where CDI will be involved. It applies even to commercial-item subcontracts (acquisitions solely for COTS items are the narrow exception), which is unusual and intentional.
The DoD-DIB cybersecurity rule at 32 CFR Part 236 and the CMMC rulemaking at 32 CFR Part 170 (proposed December 2023, finalized October 2024) layer reporting procedures and assessment and certification requirements on top. But 7012 itself remains the contractual hook. If you violate it, you have a potential False Claims Act problem, not just a compliance gap. The DOJ's Civil Cyber-Fraud Initiative, launched in October 2021, has already produced settlements. In July 2022, Aerojet Rocketdyne settled for $9 million over allegations of misrepresenting its NIST SP 800-171 compliance. The $1.25 million settlement with Penn State in October 2024 drove the point home further.
Where AI Tools Create a Problem You Might Not Have Scoped
Here is the scenario that should concern you. An engineer on a defense program pastes technical data into a commercial AI coding assistant to debug a script. A contracts manager feeds sections of a controlled technical document into a summarization tool. A program manager uses an AI-powered project management platform that ingests file attachments for auto-categorization. In each case, CDI has potentially left your controlled environment and entered a system you do not own, have not assessed, and cannot report on.
NIST SP 800-171 control 3.1.1 requires you to limit system access to authorized users, processes acting on behalf of authorized users, and devices. Control 3.1.2 limits access to the types of transactions and functions that authorized users are permitted to execute. Control 3.13.1 requires monitoring, control, and protection of communications at external boundaries and key internal boundaries. A third-party AI tool that ingests CDI is, functionally, an external system boundary. If it is not in your SSP, you have an undocumented external connection processing controlled information.
The media access controls in the 3.8 family are relevant too. When CDI gets pasted into a prompt window, it is being transferred to external media in a meaningful sense. If the AI provider retains prompts for training (many do, unless you have negotiated otherwise), your data is now persisted on systems outside your security boundary with no contractual protections, no FedRAMP authorization, and no incident reporting obligation flowing back to you.
The Inventory You Should Already Have
NIST SP 800-171 control 3.4.1 requires organizations to establish and maintain baseline configurations and inventories of organizational systems. If AI tools are being used in connection with CDI, they need to be in that inventory. Period. This is not a forward-looking recommendation; it is a current requirement.
A useful AI tool inventory for 7012 compliance should include, at minimum:
- Tool name and provider. Including version, deployment model (cloud, on-premises, hybrid), and whether it is FedRAMP authorized or equivalent.
- Data flows. What types of information can the tool access? Can users input free text? Does it pull from connected repositories?
- Retention and training policies. Does the provider retain prompt data? Use it for model training? What are the contractual terms?
- Authentication and access controls. How does the tool authenticate users? Does it integrate with your identity provider? Can you enforce role-based access?
- Incident response provisions. If the provider experiences a breach, what is their notification timeline? Does it align with your 72-hour obligation under 7012?
- Flow-down status. Are your subcontractors using AI tools? Have you addressed this in your flow-down requirements?
If you do not have this inventory, you have a gap in your SSP. And if your SSP does not reflect reality, your CMMC assessment (whenever it arrives) is going to surface that gap in an uncomfortable way.
The Incident Reporting Angle
Consider a scenario where CDI is exposed through a third-party AI tool that suffers a breach. Under 7012(c), you must report cyber incidents that affect CDI or the contractor's information system. If CDI was in the AI provider's environment and that environment is compromised, you have a reportable incident. But you can only report what you know about. If the AI tool is not in your inventory, you may not learn about the breach in time, or at all. You will have failed both the safeguarding and reporting requirements simultaneously.
The 72-hour clock is unforgiving. It starts at discovery, not at confirmation. The DoD has been clear that "discovery" means the moment you have information suggesting an incident may have occurred. Waiting for the AI vendor to complete their investigation before you report is not a defensible position.
Practical Steps for Right Now
First, survey your workforce. Find out what AI tools people are actually using, not what you have approved. Shadow AI adoption is rampant across every sector; defense contractors are not immune. Second, update your SSP and network diagrams to reflect any AI tools that interact with CDI. Third, review your acceptable use policies and make AI-specific provisions explicit. Fourth, check your subcontractor flow-downs. If your subs are using AI tools on CDI and you have not addressed it, that is your risk, not theirs.
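Self-reported surveys undercount; outbound proxy or DNS logs do not. One way to ground the survey in evidence is a simple scan for traffic to known AI services. A minimal sketch, assuming line-oriented proxy logs that contain the destination hostname; the domain watchlist is illustrative and deliberately incomplete:

```python
from collections import Counter

# Illustrative watchlist -- maintain your own from approved/denied tool reviews.
AI_SERVICE_DOMAINS = {
    "api.openai.com",
    "chat.openai.com",
    "claude.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
}


def shadow_ai_hits(log_lines):
    """Count requests to watched AI domains across proxy log lines.

    Matching is substring-based on the hostname, so tune it to your
    actual log format before trusting the numbers.
    """
    hits = Counter()
    for line in log_lines:
        for domain in AI_SERVICE_DOMAINS:
            if domain in line:
                hits[domain] += 1
    return hits


# Hypothetical log excerpt:
logs = [
    "2025-03-07T09:14:02Z user=jdoe GET https://api.openai.com/v1/chat/completions",
    "2025-03-07T09:15:11Z user=asmith GET https://claude.ai/chat",
    "2025-03-07T09:16:45Z user=jdoe GET https://intranet.example.com/wiki",
]
print(shadow_ai_hits(logs))
```

The point of the scan is not enforcement; it is to make the gap between the approved-tool list and actual usage visible before an assessor does.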
Finally, consider whether the AI tools you want to use can actually operate within your security boundary. On-premises or private-tenant deployments that keep data inside your assessed environment are a fundamentally different risk profile than commercial SaaS tools where your prompts traverse infrastructure you cannot audit.
How FirmAdapt Addresses This
FirmAdapt was built with exactly this kind of regulatory constraint in mind. The platform operates within a compliance-first architecture, meaning data handling, retention, and access controls are designed around frameworks like DFARS 7012 and NIST SP 800-171 from the ground up, rather than bolted on after the fact. For defense contractors, this means AI functionality that does not create undocumented external system boundaries or uncontrolled data flows.
FirmAdapt also provides the kind of auditable logging and access controls that support SSP documentation and incident response obligations. If your compliance program needs AI capabilities without introducing the inventory and boundary problems described above, FirmAdapt is designed to fit within your existing security architecture rather than outside it.