Apr 10, 2026

Navigating the EU AI Act: Compliance requirements for automated reporting

With the transition periods of the EU AI Act concluding in 2026, treating artificial intelligence as a casual IT experiment is a regulatory liability. Discover the three pillars of AI governance that audit and accounting firms must implement to remain compliant.

1. Executive summary

As the transition periods of the EU AI Act conclude in 2026, treating artificial intelligence as a casual IT experiment has become a major regulatory liability. For audit and accounting firms, AI governance is a boardroom mandate.

Systems that process financial data or support professional judgments are subject to stringent transparency, data handling and oversight requirements. Firms can no longer rely on opaque tools. They must prove that their automated reporting workflows are secure, interpretable and ultimately controlled by a human professional.

The table below maps the key pillars of the EU AI Act to practical, operational solutions for financial firms.


Data governance (Article 10)
Regulatory requirement: Systems must operate on verified, unbiased information without compromising proprietary client data sets.
Operational solution: Secure infrastructure that grounds AI exclusively in internal firm data and official legislation, actively blocking reliance on the public internet.

Transparency (Article 13)
Regulatory requirement: Users must be able to interpret the system's output and understand its underlying operation.
Operational solution: Pipeline observability that permanently logs every AI action, model version and reasoning trace directly within the engagement file.

Human oversight (Article 14)
Regulatory requirement: Systems must be designed to allow natural persons to effectively oversee them and intervene when necessary.
Operational solution: Human-in-the-loop workflows where the professional explicitly approves AI proposals before final execution.

2. Introduction: AI leaves the Wild West

Early AI adoption in the finance sector was largely driven by individual accountants using public chatbots to speed up daily tasks. This era of 'shadow IT' allowed professionals to experiment, but it created an environment completely devoid of formal quality control.

Today, the legal landscape has fundamentally shifted. Using consumer-grade AI to process confidential client data violates professional secrecy rules and fails the rigorous compliance tests introduced by the EU AI Act. Regulators now scrutinise not just the financial output, but the digital machinery used to generate it. For leadership teams within accounting firms, the strategic focus must shift from what AI can do, to how AI is controlled.

3. Pillar 1: Data governance and grounding

Article 10 of the EU AI Act mandates high-quality data and robust data governance practices. For financial reporting, this means AI systems must operate on verified, unbiased information.

The operational reality is that firms must move away from models that rely on broad, unverified internet training data. If an AI agent drafts an accounting policy for a complex lease agreement, it cannot base its reasoning on a blog post it read two years ago. It must base its reasoning on the exact text of the current regulatory framework.

To achieve this, firms are deploying secure infrastructures that 'ground' the AI in factual, internal firm data and official legislation. Standardised integration frameworks, such as the Model Context Protocol (MCP), are increasingly used to securely bridge AI models with proprietary databases. This ensures the AI retrieves its knowledge from a locked, trusted vault rather than guessing based on public data patterns, satisfying the regulatory demand for data integrity.
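The grounding pattern described above can be sketched in a few lines. This is a deliberately minimal illustration, not a real MCP integration: the corpus contents, document IDs and the keyword-matching retriever are all hypothetical stand-ins for a firm's actual trusted store. The key behaviour is that the prompt is built only from retrieved internal sources, and the system refuses rather than falling back on the model's public training data.

```python
from dataclasses import dataclass

@dataclass
class SourceDocument:
    doc_id: str   # e.g. an internal policy reference or legislation paragraph
    text: str

# Hypothetical trusted corpus: internal firm data and official legislation only.
INTERNAL_CORPUS = [
    SourceDocument("IFRS16-par22", "At the commencement date, a lessee shall "
                   "recognise a right-of-use asset and a lease liability."),
    SourceDocument("firm-policy-LEASE-01", "Leases of twelve months or less are "
                   "expensed on a straight-line basis per firm policy."),
]

def retrieve(query: str, corpus=INTERNAL_CORPUS):
    """Naive keyword retrieval over the trusted corpus only."""
    terms = query.lower().split()
    return [d for d in corpus if any(t in d.text.lower() for t in terms)]

def build_grounded_prompt(question: str) -> str:
    """Construct a prompt citing only retrieved internal sources;
    refuse instead of letting the model answer from public data."""
    sources = retrieve(question)
    if not sources:
        return "REFUSE: no grounded source found; escalate to a professional."
    context = "\n".join(f"[{d.doc_id}] {d.text}" for d in sources)
    return ("Answer strictly from the sources below and cite their IDs.\n"
            f"Sources:\n{context}\n\nQuestion: {question}")

print(build_grounded_prompt("How should a lessee recognise a lease?"))
```

In a production pipeline the keyword retriever would be replaced by the firm's document store exposed over a secure protocol, but the refuse-when-ungrounded rule stays the same.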

4. Pillar 2: Human oversight (The loop)

Article 14 of the EU AI Act explicitly mandates human oversight. AI systems must be designed in a way that allows natural persons to effectively oversee them, comprehend their limitations and intervene. In software architecture, this is achieved by putting a human either 'in the loop' or 'on the loop'.

A human-on-the-loop system allows the AI to perform automated tasks continuously while the human monitors a dashboard, retaining the power to override the output if something goes wrong.

A human-in-the-loop system is far stricter. The AI proposes an action, such as mapping a complex trial balance to a reporting taxonomy or flagging a compliance risk, but the workflow stops until a human professional explicitly approves it.

For the preparation and auditing of financial statements, the human-in-the-loop approach is strictly required. Fully autonomous autopilots cannot be used for signing off official accounts. The technology acts as a highly capable preparer, but the registered professional remains the ultimate, accountable decision maker.
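The in-the-loop gate described above can be sketched as a simple state machine. The class, account and reviewer names below are illustrative only; the point is architectural: an AI proposal produces no side effects until a named professional signs off.

```python
from dataclasses import dataclass
from enum import Enum

class Status(Enum):
    PROPOSED = "proposed"
    APPROVED = "approved"
    REJECTED = "rejected"

@dataclass
class Proposal:
    description: str                # e.g. "Map account 4010 to 'Net turnover'"
    status: Status = Status.PROPOSED

class EngagementWorkflow:
    """Human-in-the-loop gate: AI output is queued as a proposal and
    nothing is executed until a professional explicitly approves it."""

    def __init__(self):
        self.proposals: list[Proposal] = []
        self.executed: list[str] = []

    def ai_propose(self, description: str) -> Proposal:
        p = Proposal(description)
        self.proposals.append(p)
        return p                    # workflow halts here; no side effects yet

    def approve(self, p: Proposal, reviewer: str) -> None:
        p.status = Status.APPROVED
        self.executed.append(f"{p.description} (approved by {reviewer})")

    def reject(self, p: Proposal, reviewer: str) -> None:
        p.status = Status.REJECTED  # proposal is recorded but never executed

wf = EngagementWorkflow()
p = wf.ai_propose("Map trial balance account 4010 to taxonomy item 'Net turnover'")
assert wf.executed == []            # nothing happens without approval
wf.approve(p, reviewer="J. Jansen RA")
print(wf.executed)
```

A human-on-the-loop variant would instead execute immediately and expose an override; the in-the-loop version shown here is the one appropriate for signing off accounts.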

5. Pillar 3: Transparency and the digital audit trail

Article 13 requires transparency and the provision of information, meaning users and regulators must be able to interpret the system's output.

In the event of an inspection by a quality reviewer, such as the AFM or the SRA, claiming that "the AI made a mistake" is not a valid professional defence. Firms must be able to reconstruct the exact context of the engagement at the time the AI was used.

This requires deep pipeline observability. Firms must implement systems that permanently log the digital audit trail in the engagement file. This includes recording every specific tool the AI called, the exact prompt that was executed, the version of the AI model used and the chain of thought the model followed to reach its conclusion. Without this transparent trace, the AI remains a black box, rendering the firm non-compliant.

6. Conclusion

The EU AI Act is often viewed as a regulatory hurdle, but it is better understood as a blueprint for safe, professional AI adoption. It forces the industry to abandon experimental habits and adopt enterprise-grade governance.

Firms that invest proactively in secure infrastructure, human-in-the-loop workflows and traceable pipelines will be able to confidently scale their AI usage. Those that continue to rely on ad-hoc tools and unmonitored chat interfaces will face unacceptable compliance risks. In 2026, the competitive advantage belongs to the firms that can prove their AI is as disciplined as their accountants.

Founded by a Dutch Chartered Accountant

See Studio or MCP servers for your firm.

Book a 30-minute demo. We'll show you how your trial balance becomes a compliant report or how MCP servers enable domain expertise.

What product are you interested in?
