SOX 404 Internal Controls When Your Financial Statement Process Touches AI
If you have an AI tool anywhere in your financial close process, your ICFR documentation requirements just got more complicated. Not impossibly so, but meaningfully. And the PCAOB has started paying attention in ways that should make your controller's office sit up and think carefully about how they describe what happens between the trial balance and the filed 10-K.
Let me walk through where the friction points actually are, because most of the guidance out there is either too abstract or written by people who have never mapped a close process.
Where AI Actually Shows Up in the Close
The financial close is not a single event. It is a sequence of judgments, reconciliations, and reviews that typically spans 10 to 25 business days. AI tools are increasingly embedded at specific points: automated journal entry preparation, lease classification under ASC 842, revenue recognition estimates under ASC 606, allowance for credit loss modeling under CECL (ASC 326), and intercompany reconciliation matching.
Each of these represents a point where an AI system is either making or materially influencing an assertion that ends up in the financial statements. Under SOX Section 404(b), management must assess the effectiveness of ICFR, and for accelerated filers, the external auditor must attest to that assessment. The COSO 2013 framework, which virtually every public company uses for its ICFR evaluation, requires that controls be designed, implemented, and operating effectively. When you insert a model that learns or adapts, you are introducing a control component that behaves differently from a static Excel formula or an ERP configuration.
The Documentation Problem
PCAOB AS 2201 (the integrated audit standard) requires auditors to evaluate the design and operating effectiveness of controls over significant accounts and relevant assertions. When an AI tool touches a significant account, the auditor needs to understand what the tool does, how it was validated, what inputs it consumes, and how management monitors its outputs.
Here is where companies stumble. Traditional ICFR documentation describes a control in terms of who performs it, what they review, how often, and what evidence they retain. A typical control description might read: "The Assistant Controller reviews the monthly lease liability rollforward and compares it to the subledger output, investigating variances greater than $50,000."
When an AI tool generates that subledger output, or classifies the leases feeding into it, the control description needs to extend backward into the tool itself. You need to document:
- The model's purpose and scope within the close process
- Training data sources and how they were validated
- Input data integrity controls (completeness, accuracy, authorization)
- Model governance, including version control and change management
- Output validation procedures, including thresholds for human review
- Monitoring for model drift or degradation over reporting periods
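The last item on that list, drift monitoring, is the one teams most often leave undefined. One common approach (a hedged sketch, not a prescribed method) is a Population Stability Index check comparing the distribution of model outputs in the current period against a validated baseline period; the bin count and thresholds below are conventional defaults, not authoritative values:

```python
# Illustrative sketch: period-over-period drift check on model outputs
# using a Population Stability Index (PSI). Thresholds and bin counts
# are common rules of thumb, not prescriptive values.
import math

def psi(baseline, current, bins=10):
    """Population Stability Index between baseline and current model outputs."""
    lo, hi = min(baseline), max(baseline)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[-1] = float("inf")  # catch current values above the baseline max

    def bucket_shares(values):
        counts = [0] * bins
        for v in values:
            for i in range(bins):
                if edges[i] <= v < edges[i + 1]:
                    counts[i] += 1
                    break
            else:
                counts[0] += 1  # below the baseline minimum
        # floor each share at a small value to avoid log(0)
        return [max(c / len(values), 1e-4) for c in counts]

    b, c = bucket_shares(baseline), bucket_shares(current)
    return sum((ci - bi) * math.log(ci / bi) for bi, ci in zip(b, c))

# Rule of thumb: PSI < 0.10 stable, 0.10-0.25 monitor, > 0.25 investigate.
```

The point for ICFR documentation is not the statistic itself but that the threshold, the review cadence, and the escalation path when the threshold is breached are all written down and evidenced.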
This is not optional enrichment. The SEC's 2023 enforcement action against Cassava Sciences (SEC Administrative Proceeding File No. 3-21613) reinforced that internal controls must address the reliability of data and processes feeding financial reporting, regardless of whether those processes are manual or automated. While that case involved research data rather than AI specifically, the principle maps directly: if the process that generates your numbers is unreliable, your ICFR assessment is deficient.
IT General Controls Get Heavier
Your IT general controls (ITGCs) already cover change management, access controls, and computer operations for financially relevant systems. Adding an AI model to a financially relevant process means that model's environment falls within ITGC scope.
Practically, this means your SOX compliance team needs to coordinate with whoever manages the AI tool on questions like: Who can retrain the model? Is retraining a change that triggers your change management control? How is access to the training data restricted? Is there an audit trail for model version changes?
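On the audit-trail question specifically, one design that auditors find easy to test (a minimal sketch, not FirmAdapt's or any vendor's actual implementation) is a hash-chained change log: each model version record incorporates the hash of the prior record, so altering any historical entry invalidates every subsequent hash:

```python
# Illustrative sketch: a tamper-evident, hash-chained log of model
# version changes. Field names in the entries are assumptions.
import hashlib
import json

def append_entry(log, entry):
    """Append a change record linked to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    record = {"entry": entry, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    log.append(record)
    return log

def verify_chain(log):
    """Recompute each hash; return False if any entry was modified."""
    prev = "0" * 64
    for rec in log:
        expected = hashlib.sha256(
            json.dumps({"entry": rec["entry"], "prev_hash": prev},
                       sort_keys=True).encode()
        ).hexdigest()
        if rec["hash"] != expected or rec["prev_hash"] != prev:
            return False
        prev = rec["hash"]
    return True
```

A retraining event would append an entry capturing the model version, approver, and change ticket reference, giving the auditor a single artifact to walk through when testing the change management control.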
The PCAOB's 2023 Staff Report on inspection observations noted recurring deficiencies in how auditors evaluated IT controls over automated processes, specifically calling out insufficient understanding of the "completeness and accuracy of system-generated data and reports." AI-generated outputs are squarely in that category. If your auditor cannot trace from the AI output back through the model logic to validated input data, expect a control deficiency finding.
The Materiality Threshold Question
Not every use of AI in the close process will be in scope for SOX 404. The scoping exercise still follows the top-down, risk-based approach outlined in AS 2201. If an AI tool is used to auto-categorize immaterial expense line items, it probably falls below the threshold. But if it is generating estimates for your allowance for credit losses on a $2 billion loan portfolio, it is clearly a relevant control point.
The gray area is where most companies live. An AI tool that performs intercompany matching might seem low risk until you realize it is responsible for eliminations that, if wrong, would misstate consolidated revenue. The scoping analysis needs to consider not just the dollar amount the tool touches but the nature of the assertion and the potential for misstatement.
What Auditors Are Actually Asking
Based on conversations with audit partners at three of the Big Four over the past year, here is what they are increasingly requesting when AI touches the close:
- A model risk management framework or equivalent documentation, even if you are not a bank subject to SR 11-7
- Evidence of periodic back-testing or validation of model outputs against actual results
- Clear delineation of which judgments the model makes versus which judgments a human makes
- Documented escalation procedures for when the model produces unexpected outputs
- Completeness and accuracy testing of the data flowing into the model
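The back-testing request in the list above can be satisfied with something quite simple, provided it runs on a documented cadence. A hedged sketch, assuming management has set a relative-error tolerance (the 10% default here is illustrative, not a standard):

```python
# Illustrative sketch: periodic back-testing of prior-period model
# estimates against subsequently observed actuals. The tolerance is a
# management-set parameter; 10% here is purely illustrative.

def backtest(estimates, actuals, tolerance=0.10):
    """Return (period, error) pairs whose relative error exceeds tolerance."""
    exceptions = []
    for period, est in estimates.items():
        actual = actuals.get(period)
        if actual is None:
            continue  # actual not yet observed for this period
        error = abs(est - actual) / abs(actual)
        if error > tolerance:
            exceptions.append((period, round(error, 4)))
    return exceptions
```

What matters for the auditor is less the arithmetic than the evidence: who ran it, when, what exceptions surfaced, and how each exception was dispositioned.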
The SR 11-7 reference is worth pausing on. The Federal Reserve's 2011 guidance on model risk management was written for banks, but it has become the de facto framework that auditors reference when evaluating any company's AI governance in a financial reporting context. If you are a public company using AI in your close and you have not at least mapped your practices against SR 11-7's core elements of model development and use, independent validation, and governance, you are behind where your auditor expects you to be.
Practical Steps for the Next Filing Cycle
If you are preparing for your next SOX 404 assessment and AI tools have entered your close process since the last cycle, here is a reasonable approach:
- Inventory AI touchpoints. Map every point where an AI or ML tool interacts with data that feeds financial statements. Include tools used by third-party service providers covered by SOC 1 reports.
- Extend control descriptions. For each AI touchpoint in a significant account, update your process-level control documentation to address model inputs, logic, outputs, and monitoring.
- Align ITGCs. Confirm that your change management, access control, and operations controls explicitly cover AI model environments.
- Engage your auditor early. Do not wait for fieldwork. Walk your auditor through the AI components during planning so they can scope their testing appropriately.
- Document the human overlay. For every AI-assisted judgment, document what the human reviewer evaluates, what thresholds trigger further investigation, and what evidence they retain.
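For the first step, the inventory, a structured record beats a free-text spreadsheet because it forces the scoping question at capture time. A minimal sketch (field names are assumptions, not a prescribed schema):

```python
# Illustrative sketch: an AI-touchpoint inventory as structured records,
# filtered to the touchpoints that hit significant accounts and therefore
# need extended control documentation. Schema is an assumption.
from dataclasses import dataclass

@dataclass
class AITouchpoint:
    tool: str
    process_area: str          # e.g. "ASC 842 lease classification"
    account: str
    significant_account: bool  # per the top-down, risk-based scoping
    third_party: bool          # performed by a provider under a SOC 1 report?
    human_review: str          # who reviews the output, and against what

def in_scope(inventory):
    """Touchpoints feeding significant accounts, needing full documentation."""
    return [t for t in inventory if t.significant_account]
```

The `third_party` flag matters because touchpoints at service providers still need coverage, whether through the provider's SOC 1 report or your own complementary controls.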
How FirmAdapt Addresses This
FirmAdapt's architecture was built with auditability as a core requirement, not an afterthought. Every AI-assisted output in the platform includes a full provenance chain: input data sources, model version, decision logic, confidence scores, and the human review actions taken on the output. This maps directly to the documentation requirements that SOX 404 assessments and PCAOB inspections demand when AI participates in financial reporting processes.
For companies that need to demonstrate ICFR effectiveness over AI-assisted processes, FirmAdapt provides the evidence trail that both management and external auditors require. The platform's change management logging, role-based access controls, and version-controlled model governance are designed to satisfy ITGC requirements without requiring your SOX team to build a parallel documentation layer on top of the AI tool.