---
title: "AI Loan Origination Automation: EBA, AI Act and DORA Guide | Impetora"
description: "How banks automate loan origination with AI under EBA/GL/2020/06, EU AI Act Annex III 5(b), DORA and GDPR Article 22, with a vendor-neutral compliant design."
url: https://impetora.com/answers/ai-loan-origination-automation-europe
locale: en
datePublished: 2026-04-28
dateModified: 2026-04-28
author: Impetora
---

# AI loan origination automation: how banks deploy it under EU rules

> Loan origination is the highest-stakes AI deployment most retail and SME banks attempt. The output is a credit decision, the regulatory perimeter is the densest in the AI Act, and the supervisory expectations stack across the EBA Loan Origination guidelines, DORA, GDPR Article 22 and SR 11-7-equivalent model risk standards. Done well, automation cuts decision time from days to minutes. Done carelessly, it triggers Annex III 5(b) without the controls to support it [1].

*Updated 2026-04-28. By Impetora.*

## What does AI loan origination actually cover?

Loan origination automation spans the workflow from application intake through underwriting, KYC and AML checks, affordability assessment, decisioning, pricing, documentation and disbursement. AI components show up at every step: optical character recognition for document intake, classification for fraud screening, scoring for creditworthiness, language models for narrative summarisation and decisioning rules engines orchestrating the whole flow. The scoring step is where Annex III 5(b) bites. The other steps are subject to GDPR, DORA ICT-resilience rules and the EBA Loan Origination guidelines, but they are not high-risk under the AI Act on their own.
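The step-by-step regime split above can be sketched as a mapping from workflow step to applicable regime. The step names and the `Regime` labels are this sketch's own shorthand, not terms from the regulations:

```python
# Illustrative mapping of origination workflow steps to regulatory regimes.
# Step names and Regime labels are invented shorthand for this sketch.
from enum import Flag, auto

class Regime(Flag):
    GDPR = auto()
    DORA = auto()
    EBA_GL_2020_06 = auto()
    AI_ACT_HIGH_RISK = auto()  # AI Act Annex III 5(b)

# Every step carries the baseline; only scoring adds the high-risk tier.
BASELINE = Regime.GDPR | Regime.DORA | Regime.EBA_GL_2020_06

STEP_REGIMES = {
    "document_intake_ocr":      BASELINE,
    "fraud_screening":          BASELINE,
    "kyc_aml_checks":           BASELINE,
    "creditworthiness_scoring": BASELINE | Regime.AI_ACT_HIGH_RISK,
    "pricing":                  BASELINE,
    "decisioning_rules_engine": BASELINE,
}

def is_high_risk(step: str) -> bool:
    """True only for steps that trigger Annex III 5(b) on their own."""
    return bool(STEP_REGIMES[step] & Regime.AI_ACT_HIGH_RISK)
```

The point the mapping makes explicit: classifying the scoring step correctly is what pulls the Chapter III obligations into the programme; the rest of the pipeline inherits only the baseline.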

## What does EBA/GL/2020/06 require for automated origination?

The EBA Guidelines on Loan Origination and Monitoring entered into force on 30 June 2021 and apply to all credit institutions in the EU. Section 4.3 covers the use of automated models in creditworthiness assessment. Banks must understand the methodology, data inputs and assumptions, document the model in a way that allows independent challenge, monitor performance over time and ensure that automated outcomes can be reviewed and overridden where appropriate [2]. The guidelines also require explicit policy documentation on when human review is mandatory, what data points feed the decision, and what bias and fairness controls apply. They are the European supervisory baseline for any origination AI deployment.

## How does the EU AI Act stack on top?

Annex III 5(b) of Regulation (EU) 2024/1689 designates AI systems used to evaluate creditworthiness or establish credit scores of natural persons as high-risk. The narrow exception for fraud detection does not apply to origination scoring. Once high-risk classification triggers, Chapter III obligations apply in full: risk-management system (Article 9), data governance (Article 10), technical documentation (Article 11), logging (Article 12), transparency (Article 13), human oversight (Article 14), accuracy and robustness (Article 15) and post-market monitoring (Article 72). For most banks, the gap is not the model itself but the documentation, logging and monitoring stack. Article 12 in particular requires automatic recording of events over the system's lifetime; under Articles 19 and 26, those logs must then be retained for a period appropriate to the system's purpose and never less than six months.
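One way to make the gap visible is a checklist keyed to the Chapter III articles listed above. The article labels follow the text; the pass/fail status values here are invented for illustration:

```python
# Illustrative readiness checklist for the Chapter III obligations named
# in the text. Status values are invented; a real assessment maps each
# article to concrete control artifacts.
CHAPTER_III = {
    "Art 9 risk management":          True,
    "Art 10 data governance":         True,
    "Art 11 technical documentation": False,
    "Art 12 logging":                 False,
    "Art 13 transparency":            True,
    "Art 14 human oversight":         True,
    "Art 15 accuracy and robustness": True,
    "Art 72 post-market monitoring":  False,
}

def open_gaps(checklist: dict[str, bool]) -> list[str]:
    """Articles without a control in place, in checklist order."""
    return [article for article, done in checklist.items() if not done]
```

In this hypothetical status, the gaps land exactly where the text says they usually do: documentation, logging and post-market monitoring rather than the model itself.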

## Where does GDPR Article 22 fit?

An automated lending decision that produces a legal effect (granting or denying credit) is the textbook Article 22 case. The bank must rely on contract necessity (the typical basis for credit applications), implement suitable safeguards including the right to human intervention, expression of view and contest, and provide meaningful information about the logic, significance and envisaged consequences in the privacy notice and at the point of decision. The 2024 EDPB guidelines reaffirm that "meaningful information" does not require disclosing trade secrets or full model internals, but does require enough that the customer can meaningfully exercise their rights. A boilerplate "we use automated processing" sentence does not satisfy the standard.

## How does DORA affect origination AI vendor selection?

Loan origination is unambiguously a critical or important function for a bank. AI vendors providing origination components are therefore in scope of DORA Articles 28 to 30 third-party ICT risk obligations: pre-contractual due diligence, register of information, mandatory contract clauses including audit rights, sub-processor controls, exit and transition planning, and incident-reporting cooperation [3]. Click-through SaaS terms do not satisfy DORA. Vendor selection must include a documented exit plan that lets the bank migrate the workload to an alternative provider, and the contract must give the bank and its supervisor on-site audit rights.
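A register-of-information entry for an origination AI vendor can be sketched as a record that checks the contract features the text above attributes to DORA Articles 28 to 30. The field names are this sketch's own, not DORA's formal reporting taxonomy:

```python
# Hypothetical register-of-information entry for an origination AI vendor.
# Field names are illustrative, not DORA's official reporting schema.
from dataclasses import dataclass

@dataclass
class VendorRegisterEntry:
    vendor: str
    function: str                     # e.g. "loan origination scoring"
    critical_or_important: bool
    audit_rights_onsite: bool         # bank and supervisor on-site access
    subprocessors_mapped: bool        # sub-contracting chain documented
    exit_plan_documented: bool        # migration path to an alternative
    incident_cooperation_clause: bool

    def contract_gaps(self) -> list[str]:
        """Mandatory features still missing for a critical function."""
        if not self.critical_or_important:
            return []
        required = {
            "audit_rights_onsite": self.audit_rights_onsite,
            "subprocessors_mapped": self.subprocessors_mapped,
            "exit_plan_documented": self.exit_plan_documented,
            "incident_cooperation_clause": self.incident_cooperation_clause,
        }
        return [name for name, ok in required.items() if not ok]
```

A click-through SaaS contract would typically fail every one of these checks at once, which is the practical meaning of "click-through terms do not satisfy DORA".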

## What does a defensible automated-origination design look like?

Five layers. Data layer with documented lineage, exclusion of protected attributes and validated proxies. Model layer with version control, technical documentation aligned with Article 11 and independent validation. Decisioning layer with explicit policy on automated-approval bands, mandatory human review thresholds and escalation paths. Logging layer that meets Article 12 retention requirements. Monitoring layer with drift detection, fairness testing across protected classes and performance thresholds tied to governance triggers. The most common audit finding is a model that scores well in development but lacks a written policy on which scores trigger automated approval, which require human review and who has override authority. The score is not the decision; the policy that maps score to action is what supervisors examine.
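The decisioning-layer policy that maps score to action can be sketched as a banded routing function. The thresholds and the escalation role are invented for illustration; a real policy band is set and approved by the bank's governance, not hard-coded by engineers:

```python
# Minimal sketch of a score-to-action policy band. Thresholds and band
# names are hypothetical; a real policy is a governance-approved document.
AUTO_APPROVE_MIN = 720   # hypothetical cut-off for the auto-approval band
REFER_SENIOR_MAX = 520   # hypothetical cut-off for senior escalation

def route_application(score: int) -> str:
    """Map a credit score to an action under a documented policy band."""
    if score >= AUTO_APPROVE_MIN:
        return "auto_approve"    # inside the documented automated band
    if score <= REFER_SENIOR_MAX:
        return "senior_review"   # escalation path for low scores
    return "human_review"        # mandatory review outside the band
```

The function is deliberately trivial: the supervisory question is not whether the routing code works but whether a written, approved policy exists that these bands implement, and who holds override authority over its output.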

## Frequently asked questions

### Can a bank fully auto-approve loans without human review?

Yes, for narrow segments and within a documented policy band, with Article 22 safeguards (right to human intervention, contest and expression of view) available on request. Outside that band, human review must be mandatory. The policy itself must be approved by governance and reviewed periodically.

### Do AML and KYC fall under Annex III 5(b)?

No. AML transaction monitoring and KYC identity verification are outside the creditworthiness perimeter. They are governed by AMLD/AMLR, the FATF standards, EBA guidance and (separately) GDPR. They fall within Annex III 5(b) only if they include components that evaluate creditworthiness.

### What logging is required under AI Act Article 12?

Automatic recording of events over the system's lifetime (Article 12), with the resulting logs retained for a period appropriate to the system's purpose and never less than six months for high-risk systems (Articles 19 and 26). Logs must enable traceability of the system's operation, identification of situations that may result in risk, and post-market monitoring. For loan origination this typically means: input features, model version, score output, decision, override events and human reviewer identity.

### How long does an EU AI Act conformity assessment take for origination AI?

For most credit-scoring AI there is no notified-body queue to wait for: Annex III high-risk systems generally use the internal control conformity assessment under Annex VI, conducted by the provider, so timing depends on the provider's documentation readiness rather than an external body. The bank, as deployer, has separate obligations under Articles 26 and 27 including a fundamental rights impact assessment and registration in the EU database.

### Can a bank use a third-party origination AI under DORA?

Yes, with proper due diligence, mandatory contract clauses (Article 30), inclusion in the register of information, documented exit and transition plan, and a sub-contracting chain map. The bank remains fully responsible for compliance regardless of outsourcing.

## Sources cited

1. Regulation (EU) 2024/1689 (Artificial Intelligence Act). European Union, Official Journal, 2024-07-12. https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32024R1689
2. Guidelines on loan origination and monitoring (EBA/GL/2020/06). European Banking Authority, 2020-05-29. https://www.eba.europa.eu/regulation-and-policy/credit-risk/guidelines-on-loan-origination-and-monitoring
3. Regulation (EU) 2022/2554 (DORA). European Union, Official Journal, 2022-12-14. https://eur-lex.europa.eu/eli/reg/2022/2554/oj
4. Regulation (EU) 2016/679 (GDPR). European Union, Official Journal, 2016-04-27. https://eur-lex.europa.eu/eli/reg/2016/679/oj
5. Guidelines 1/2024 on automated decisions under Article 22 GDPR. European Data Protection Board, 2024. https://www.edpb.europa.eu/our-work-tools/our-documents/guidelines_en
