During a recent financial audit, a team of external auditors encountered an unexpected situation.
The audited company—a distributor of consumer goods—had integrated a generative AI system to automate internal financial reporting, accelerating the monthly close and providing management with decision-useful information.
During the substantive procedures phase, the team found expense-classification inconsistencies. Several maintenance items had been recorded as capital improvements to fixed assets, affecting both the income statement and balance sheet.
The AI model had learned its categorization rules from incomplete and poorly labeled historical data, producing outputs that did not comply with applicable accounting standards. This raised a key question: how should audit procedures adapt to validate not only the data, but also the AI models that process it?
The Rise of Generative AI in Business
Common uses include automated preparation of financial and management reports, cash-flow and customer-behavior forecasting, document generation for corporate communications and regulatory filings, and contract analytics. According to the PwC Global AI Study 2025, 41% of large companies integrate generative AI into financial processes and 27% use it to produce information that feeds financial statements or regulatory reports.
New Risks for Financial Auditing
- Quality and provenance of training data
- Traceability and explainability of model logic
- Uncontrolled model updates (retraining or parameter changes)
- Misuse and data-governance gaps
- Regulatory implications
Adapting Audit Procedures
- Review the model's technical documentation (training data, algorithms, decision criteria, date of last update).
- Parallel re-performance: manually reclassify a sample of items under standard accounting rules and compare the results to the AI outputs (a sketch of this comparison follows this list).
- Evaluate input/output controls and human review checkpoints before financial-statement inclusion.
- Change-management review: analyze retraining and parameter changes for impact on results.
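By way of illustration, the sketch below shows one way the parallel re-performance comparison could be organized in Python with pandas. The sample data, the column names (entry_id, description, amount, ai_category), the keyword-based capitalization rule, and the 5,000 threshold are hypothetical assumptions for the example, not details from the engagement described above.

```python
import pandas as pd

# Hypothetical sample of AI-classified expense entries (all column names are assumptions).
sample = pd.DataFrame({
    "entry_id":    [101, 102, 103, 104],
    "description": ["Roof repair - warehouse",
                    "HVAC replacement - new unit",
                    "Routine forklift maintenance",
                    "Warehouse extension - construction"],
    "amount":      [4500.00, 28000.00, 1200.00, 150000.00],
    "ai_category": ["capital_improvement", "capital_improvement",
                    "maintenance_expense", "capital_improvement"],
})

CAPITALIZATION_THRESHOLD = 5000.00  # assumed policy threshold, for illustration only

def manual_reclassification(row: pd.Series) -> str:
    """Simplified stand-in for the auditor's manual judgement: capitalize only items
    that extend or improve an asset and exceed the capitalization threshold."""
    capital_keywords = ("replacement", "extension", "construction", "new unit")
    is_capital_nature = any(k in row["description"].lower() for k in capital_keywords)
    if is_capital_nature and row["amount"] >= CAPITALIZATION_THRESHOLD:
        return "capital_improvement"
    return "maintenance_expense"

# Re-perform the classification independently and flag disagreements with the AI output.
sample["manual_category"] = sample.apply(manual_reclassification, axis=1)
exceptions = sample[sample["ai_category"] != sample["manual_category"]]

print(f"Items re-performed: {len(sample)}, exceptions found: {len(exceptions)}")
print(exceptions[["entry_id", "description", "amount", "ai_category", "manual_category"]])
```

In a real engagement the manual_category column would come from the auditor's own judgement on the sampled items rather than a keyword rule; the point of the sketch is the item-level comparison and the exception listing that feeds follow-up procedures.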
Applied to the case described above, these procedures surfaced not only the classification error but also weaknesses in internal control over the AI-enabled reporting process.
Good Practices for Auditors
- Map AI usage across the financial cycle.
- Evaluate data governance (quality, integrity, security).
- Require traceability and explainability, involving data-science experts when needed.
- Test outputs for consistency and accuracy against manual or legacy systems (see the sketch after this list).
- Verify human validation controls before outputs are used for regulatory or financial reporting.
- Document model changes and authorizations.
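As one way to operationalize the consistency and accuracy testing mentioned above, the sketch below compares AI-generated account classifications with those of a legacy or manual process for the same entries and summarizes disagreements. This is again a hypothetical example in Python with pandas; the entry_id and account columns, the sample values, and the 2% tolerance are assumptions.

```python
import pandas as pd

# Hypothetical extracts covering the same entries and period; column names and values
# are assumptions for the example.
ai_output = pd.DataFrame({
    "entry_id": [1, 2, 3, 4, 5, 6],
    "account":  ["repairs", "repairs", "fixed_assets", "fixed_assets", "repairs", "fixed_assets"],
})
legacy_output = pd.DataFrame({
    "entry_id": [1, 2, 3, 4, 5, 6],
    "account":  ["repairs", "fixed_assets", "fixed_assets", "repairs", "repairs", "fixed_assets"],
})

EXCEPTION_RATE_THRESHOLD = 0.02  # assumed tolerance; set per engagement materiality

# Join the two populations on the entry identifier and flag disagreements.
merged = ai_output.merge(legacy_output, on="entry_id", suffixes=("_ai", "_legacy"))
merged["disagrees"] = merged["account_ai"] != merged["account_legacy"]

overall_rate = merged["disagrees"].mean()
by_account = merged.groupby("account_ai")["disagrees"].agg(["sum", "count", "mean"])

print(f"Overall disagreement rate: {overall_rate:.1%}")
print(by_account)
if overall_rate > EXCEPTION_RATE_THRESHOLD:
    print("Disagreement exceeds tolerance: extend testing and investigate root causes.")
```

Where no legacy system exists, the same comparison can be run against prior-period classifications that were produced manually.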
A Changing Role for the Financial Auditor
Auditors must understand the limitations of AI models, evaluate controls in hybrid (people + AI) environments, and translate technology risks into financial-statement impacts, becoming validators of trust in both the figures and the technology that produces them.
Conclusion: AI—Yes, With Control and Evidence
The goal is to ensure AI models, the data they process, and their outputs are reliable, auditable, and aligned with applicable regulations. The future of financial auditing includes reviewing not only balance sheets, but also the code, algorithms, and data behind them.