
DORA GAP ASSESSMENT: HOW TO SCORE YOUR READINESS

Alexander Sverdlov

The most expensive mistake I've seen compliance teams make with DORA isn't getting a technical requirement wrong. It's spending six months working intensely on the wrong things while critical gaps sit unaddressed in areas they haven't even evaluated yet.

A payment institution I spoke with last year had built an excellent ICT incident response procedure — detailed, well-tested, aligned to DORA's classification criteria. What they hadn't done was map a single sub-outsourcing arrangement for any of their 47 ICT providers. Their Register of Information was structurally incomplete in a way that would fail NCA validation, and they had no visibility into that gap because their gap assessment had focused on the policy and procedure layer rather than the data and reporting layer.

A good DORA gap assessment doesn't just ask "have you done the thing?" It scores maturity across every domain that DORA touches, weights the gaps by regulatory priority and remediation effort, and produces an output that your management team can actually act on. This article gives you that framework — a scoring methodology you can run yourself, domain by domain, and use to produce a gap assessment report that tells you exactly where you stand and what to fix first.

📋 What this article gives you: A complete domain-by-domain DORA gap assessment framework with scoring criteria for each area, a maturity model you can apply consistently across your organisation, guidance on how to weight and prioritise the gaps you find, a ready-to-use scoring template structure, and a clear view of what "good" looks like in each domain so you can set realistic targets.

📌 Jump to section

  1. Why most DORA gap assessments fall short
  2. The maturity scoring model
  3. The seven DORA domains — scored
  4. Weighting and prioritisation
  5. Turning your scores into an action plan
  6. Common gap profiles by entity type
  7. When to reassess

⚡ TL;DR: Score your DORA readiness across seven domains: ICT risk management, incident management, digital resilience testing, ICT third-party risk, Register of Information, information sharing, and governance. Rate each domain 1–4 on a defined maturity scale. Weight scores by regulatory enforcement priority. The result tells you where your highest-risk gaps are and in what order to address them.

Why Most DORA Gap Assessments Fall Short

The gap assessment approaches most compliance teams use — whether developed internally or inherited from a consultant — tend to share three structural weaknesses that limit their usefulness.

They conflate existence with adequacy. Asking "do you have an ICT risk management framework?" and recording "yes" tells you almost nothing about whether that framework meets DORA's specific requirements. A framework written in 2019 to satisfy EBA outsourcing guidelines may have no coverage of the new DORA requirements around ICT concentration risk, third-party monitoring, or proportionality. Existence-based assessments produce false comfort.

They focus on the policy layer and ignore the operational layer. DORA is unusual among EU financial regulations in that it has specific, verifiable operational outputs — particularly the Register of Information and the xBRL-CSV submission. A gap assessment that only evaluates whether policies and procedures exist, without evaluating whether the organisation can actually produce those outputs, misses the most immediate regulatory risk.

They don't produce a prioritised action plan. A list of gaps without a clear hierarchy of urgency and effort is difficult for management to act on. Different gaps carry different levels of regulatory risk, and remediation resources are always finite. An effective gap assessment must tell you not just what's missing but in what order to fix it.

⚠️ The audit-readiness illusion: Some teams run their DORA gap assessment by mapping against a control framework — ISO 27001, NIST CSF, or a DORA-specific control library from a GRC platform — and measuring control implementation. This produces an audit-readiness view, which is valuable, but it does not assess DORA operational readiness. You can have 90% of your controls implemented and still be unable to produce a valid Register of Information submission. Both dimensions matter and must be assessed separately.

The Maturity Scoring Model

Use a four-level maturity scale for each domain. Four levels — rather than five or ten — gives you enough granularity to distinguish meaningfully different states without creating false precision that the data doesn't support. Each level has a precise definition so that scores are consistent across domains and across assessors.

Level 1 (Initial): No structured approach exists

The requirement is either unknown to the organisation, known but not started, or addressed only through ad-hoc efforts with no documentation, ownership, or repeatability. A score of 1 means the gap is significant and remediating it requires building from scratch.

Level 2 (Developing): Partial approach exists but has material gaps

Something is in place — a policy exists, a process has been started, some data has been collected — but it does not meet DORA's requirements in full. Coverage is incomplete, documentation is inadequate, or the approach hasn't been tested. A score of 2 means work is underway but significant remediation is still needed.

Level 3 (Defined): Meets DORA requirements but limited evidence of effectiveness

The requirement is addressed in a documented, structured way that aligns with DORA's text. Policies, procedures, and data exist and are maintained. However, the approach hasn't been tested under realistic conditions, evidence of effectiveness is limited, or the process depends heavily on specific individuals rather than embedded practices. A score of 3 is the regulatory baseline.

Level 4 (Optimised): Fully implemented, tested, and continuously improved

The requirement is met comprehensively, evidence of effectiveness exists, the approach has been tested or audited, and continuous improvement processes are in place. The organisation could demonstrate compliance to an NCA supervisor with confidence. A score of 4 is the target for high-priority domains.

One important calibration note: a score of 3 means you meet the regulatory baseline — not that you are done. Supervisory scrutiny, TLPT exercises, and NCA inspections will probe beyond baseline adequacy. For domains that carry the highest regulatory enforcement risk, targeting a 4 is worth the investment.
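
If you track your scores in a spreadsheet or GRC tool, the same structure is easy to encode. Here is a minimal Python sketch of a scoring template. The domain identifiers, maturity labels, and the baseline of 3 come from this article; the class and field names are illustrative rather than a prescribed format.

```python
from dataclasses import dataclass

# The four-level maturity scale defined in this article.
MATURITY_LABELS = {1: "Initial", 2: "Developing", 3: "Defined", 4: "Optimised"}

@dataclass
class DomainScore:
    domain_id: str  # e.g. "D5"
    name: str       # e.g. "Register of Information"
    score: int      # 1-4 on the maturity scale above

    def __post_init__(self):
        if self.score not in MATURITY_LABELS:
            raise ValueError(f"{self.domain_id}: score must be 1-4, got {self.score}")

    @property
    def gap_to_baseline(self) -> int:
        # Distance below the regulatory baseline (score 3); 0 if at or above it.
        return max(0, 3 - self.score)

# Placeholder scores for illustration only.
assessment = [
    DomainScore("D1", "ICT Risk Management Framework", 3),
    DomainScore("D5", "Register of Information", 1),
]
for d in assessment:
    print(f"{d.domain_id} {d.name}: {MATURITY_LABELS[d.score]} (gap to baseline: {d.gap_to_baseline})")
```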

The Seven DORA Domains — Scored

DORA's requirements organise naturally into seven domains. For each domain, the following sections give you the scoring criteria for levels 1 through 4, the key evidence you would expect to see at each level, and the specific sub-requirements that are most commonly underdeveloped in first assessments.

D1: ICT Risk Management Framework (Articles 5–15)

DORA requires a comprehensive, documented ICT risk management framework that covers identification, protection, detection, response, and recovery. It must be reviewed at least annually, approved by the management body, and proportionate to the entity's scale and complexity.

Score 1: No ICT risk management framework aligned to DORA exists; existing frameworks (ISO 27001, NIST) have not been assessed against DORA requirements. Evidence expected: none.
Score 2: Framework exists but has material gaps against DORA Articles 5–15 (e.g., missing ICT asset classification, no ICT business continuity policy, or management body approval not documented). Evidence expected: gap analysis against DORA articles; draft framework documents.
Score 3: Documented, management-approved framework covering all DORA Articles 5–15 requirements; annual review cycle established; ICT asset register maintained. Evidence expected: approved framework document; ICT asset register; board/management approval minutes; annual review records.
Score 4: Framework has been independently reviewed or audited; continuous improvement cycle is documented; ICT risk appetite is formally defined and integrated into decision-making. Evidence expected: internal audit report; risk appetite statement; improvement log; integrated ICT risk reporting to management body.

🔍 Commonly underdeveloped: ICT asset classification (Articles 8–9), the ICT business continuity policy distinct from general BCP (Article 11), and the management body's documented role in approving and overseeing the framework (Article 5(2)).

D2: ICT Incident Management (Articles 17–23)

DORA requires a documented ICT-related incident management process with specific classification criteria aligned to the RTS, and mandatory major incident reporting to the NCA within strict timelines: an initial notification within 4 hours of classifying the incident as major (and no later than 24 hours from becoming aware of it), an intermediate report within 72 hours of the initial notification, and a final report within one month.

Score 1: No DORA-specific incident classification process; existing ITSM incident management has not been assessed against DORA's major incident criteria. Evidence expected: none.
Score 2: DORA classification criteria mapped to existing process but not integrated; reporting timelines known but no tested workflow for meeting the 4-hour initial notification requirement. Evidence expected: classification mapping document; NCA contact details identified.
Score 3: Documented DORA-aligned incident classification procedure integrated into ITSM; NCA reporting templates prepared; 4/24-hour initial notification workflow exists; responsible roles assigned and communicated. Evidence expected: incident classification procedure; reporting templates; RACI for NCA notification; evidence of staff awareness.
Score 4: Process tested through tabletop exercises or live incidents; post-incident reviews conducted; lessons learned fed back into procedure updates; metrics tracked on classification accuracy and reporting timeliness. Evidence expected: exercise records; post-incident review reports; metrics dashboard; procedure update log.

🔍 Commonly underdeveloped: The DORA RTS classification criteria differ from standard ITSM severity models. The 4-hour initial notification requirement is operationally demanding and requires 24/7 escalation capability that many teams haven't built. Voluntary reporting for significant cyber threats (Article 19) is frequently overlooked entirely.
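
Meeting the notification deadlines is easier when the deadline arithmetic is automated rather than worked out by hand mid-incident. A minimal Python sketch, assuming the timeline summarised above (4 hours from classification capped at 24 hours from awareness for the initial notification, 72 hours for the intermediate report, one month for the final report) and assuming each report is filed at its deadline; verify the figures against the current incident-reporting RTS before relying on them.

```python
from datetime import datetime, timedelta

def reporting_deadlines(aware_at: datetime, classified_at: datetime) -> dict:
    # Initial notification: 4h from classification, and no later than
    # 24h from becoming aware of the incident.
    initial = min(classified_at + timedelta(hours=4),
                  aware_at + timedelta(hours=24))
    # Later deadlines run from submission of the prior report; as a
    # conservative proxy, this assumes each report is filed at its deadline.
    intermediate = initial + timedelta(hours=72)
    final = intermediate + timedelta(days=30)  # "one month", approximated as 30 days
    return {"initial": initial, "intermediate": intermediate, "final": final}

deadlines = reporting_deadlines(
    aware_at=datetime(2026, 3, 1, 2, 15),
    classified_at=datetime(2026, 3, 1, 9, 0),
)
for stage, due in deadlines.items():
    print(f"{stage:<12} due {due:%Y-%m-%d %H:%M}")
```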

D3: Digital Operational Resilience Testing (Articles 24–27)

DORA requires annual testing of ICT tools and systems. Significant entities are additionally subject to Threat-Led Penetration Testing (TLPT) every three years. Testing must be proportionate to the entity's profile and must cover critical systems and functions.

Score 1: No DORA-aligned testing programme; existing penetration testing not mapped to DORA scope or critical functions. Evidence expected: none.
Score 2: Annual testing exists but scope has not been formally tied to DORA critical functions; vulnerability assessments conducted but findings remediation not tracked against DORA requirements. Evidence expected: existing test reports; gap analysis against DORA testing scope.
Score 3: Annual DORA-scoped testing programme documented and executed; critical functions and systems identified as in-scope; findings tracked to remediation; TLPT applicability assessed. Evidence expected: annual testing plan; scoping documentation linked to critical or important functions (CIFs); test reports; remediation tracking; TLPT applicability assessment.
Score 4: TLPT completed (if applicable) or roadmap in place; testing results integrated into risk management; third-party testers meet DORA criteria; continuous testing elements operational. Evidence expected: TLPT completion report; NCA notification; tester accreditation records; continuous testing metrics.

🔍 Commonly underdeveloped: Formal determination of whether TLPT applies (many entities haven't made this assessment); testing scope explicitly linked to the critical function classification in the RoI; third-party tester selection documented against DORA's criteria for TLPT providers.

D4: ICT Third-Party Risk Management (Articles 28–30)

DORA requires a documented ICT third-party risk management policy, a due diligence process before entering new arrangements, ongoing monitoring of providers, exit strategies for critical arrangements, and compliance with the 12 mandatory Article 30(2) contract provisions.

Score 1: No DORA-specific ICT TPRM policy; existing vendor management processes have not been assessed against DORA Article 28–30 requirements; Article 30(2) contract provisions not reviewed. Evidence expected: none.
Score 2: ICT TPRM policy drafted; Article 30(2) review underway but incomplete; due diligence process exists for new providers but not DORA-aligned; exit strategies not documented for critical arrangements. Evidence expected: draft TPRM policy; partial Article 30(2) contract review results.
Score 3: Approved DORA-aligned TPRM policy; Article 30(2) review completed and tracked for all critical arrangements; due diligence process updated; exit strategies documented for CIF-supporting providers. Evidence expected: approved TPRM policy; Article 30(2) tracker; due diligence templates; exit strategy documents for critical providers.
Score 4: Ongoing provider monitoring operational; concentration risk assessment completed and reviewed; exit strategies tested; new contracts contain all Article 30(2) provisions as standard. Evidence expected: provider monitoring reports; concentration risk assessment; exit strategy test records; standard contract template with Article 30(2) clauses.

🔍 Commonly underdeveloped: Article 30(2) contract clause review for existing (legacy) contracts; ICT concentration risk assessment across all providers; documented exit strategies that have been tested, not just written; sub-outsourcing data collection from providers.
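
The Article 30(2) clause review is fundamentally a data problem: for every critical arrangement, which mandatory provisions are present and which are missing? If the review results live in a tracker, a short script can surface the outstanding items. A minimal sketch, assuming a hypothetical CSV layout with one yes/no column per provision; the file name and column names are placeholders, not an actual tracker format.

```python
import csv

# One yes/no column per mandatory Article 30(2) provision, as tracked in
# your own review; these column names are hypothetical placeholders.
REQUIRED_PROVISIONS = [f"provision_{i:02d}" for i in range(1, 13)]

with open("article_30_tracker.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("critical", "").strip().lower() != "yes":
            continue  # focus first on critical arrangements
        missing = [p for p in REQUIRED_PROVISIONS
                   if row.get(p, "").strip().lower() != "yes"]
        if missing:
            # 'contract_id' is likewise a placeholder column name.
            print(f"{row['contract_id']}: missing {len(missing)} provision(s): {', '.join(missing)}")
```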

D5: Register of Information (Article 28(3) + ITS)

The Register of Information must be maintained continuously and submitted annually to the NCA in xBRL-CSV format. It covers 15 interconnected tables with 200+ fields and must pass 117 EBA validation rules. This domain assesses both the completeness of the data and the organisation's ability to produce a valid submission.

Score 1: No register built; no understanding of xBRL-CSV format requirements; provider data held in ad-hoc spreadsheets or a vendor management system without DORA field mapping. Evidence expected: none.
Score 2: Register partially populated using the ESA template; data collection underway but B04 (sub-outsourcing) and B10 (Article 30 provisions) substantially incomplete; no xBRL-CSV export tested. Evidence expected: partially completed ESA template; data collection status tracker.
Score 3: All 15 tables populated with complete, accurate data; xBRL-CSV package produced and tested against EBA validation rules; first NCA submission completed (with or without initial rejections resolved). Evidence expected: complete register; xBRL-CSV package; NCA submission receipt or feedback file with resolved errors.
Score 4: Year-round maintenance process operational; updates triggered by contract events within defined timeframes; LEI validity monitoring in place; submission passed validation on first attempt. Evidence expected: maintenance process documentation; event-triggered update log; LEI monitoring reports; clean submission receipt.

🔍 Commonly underdeveloped: Sub-outsourcing data (B04) — almost universally incomplete in first assessments; Article 30(2) contract provisions tracker (B10); referential integrity between tables; year-round maintenance process (most entities treat this as an annual project rather than a continuous obligation).
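
Referential integrity between tables is one of the cheapest gaps to catch before submission. Here is a minimal sketch of the kind of cross-table check worth running on your CSV exports; the file and column names are hypothetical placeholders, so substitute the identifiers from your own export of the ESA templates.

```python
import csv

def load_column(path: str, column: str) -> set:
    # Collect the non-empty values of one column from a CSV export.
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column] for row in csv.DictReader(f) if row.get(column)}

# Hypothetical file and column names; substitute your own export's identifiers.
providers = load_column("b05_providers.csv", "provider_id")
contract_refs = load_column("b02_contracts.csv", "provider_id")

orphans = contract_refs - providers
if orphans:
    print(f"{len(orphans)} contract rows reference providers missing from the provider table:")
    for ref in sorted(orphans):
        print(f"  {ref}")
else:
    print("Referential integrity OK: every contract references a known provider.")
```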

D6: Information Sharing (Article 45)

DORA enables and encourages voluntary participation in cyber threat intelligence sharing arrangements among financial entities. While not mandatory, regulatory expectations around participation are emerging and the domain warrants assessment.

Score 1: No awareness of or participation in threat intelligence sharing under Article 45; no assessment of available arrangements. Evidence expected: none.
Score 2: Article 45 assessed; relevant industry or sector sharing arrangements identified; participation decision documented but not yet active. Evidence expected: assessment of relevant sharing arrangements; participation decision record.
Score 3: Active participation in at least one recognised threat intelligence sharing arrangement; process for consuming and acting on shared intelligence in place. Evidence expected: membership documentation; intelligence consumption process; evidence of acting on received intelligence.
Score 4: Active contributor as well as consumer of threat intelligence; sharing process integrated with incident response; intelligence used to update risk assessments and testing scope. Evidence expected: contribution records; integration evidence; examples of intelligence-driven risk updates.

D7: Governance and Management Body Accountability (Article 5)

DORA places explicit accountability on the management body — not just the ICT or compliance function — for digital operational resilience. Article 5 requires the management body to define, approve, oversee, and be accountable for the ICT risk management framework. Members must maintain adequate knowledge and skills.

Score 1: DORA responsibilities not formally assigned to the management body; ICT risk is managed at operational level without board-level oversight or accountability. Evidence expected: none.
Score 2: Management body aware of DORA obligations; ICT risk reported to board level but approval of the ICT risk framework not formally documented; no training programme for management body members. Evidence expected: board reporting evidence; DORA awareness communications.
Score 3: Management body has formally approved the ICT risk management framework; ICT risk is a standing agenda item; individual accountability for DORA compliance assigned at management body level; training completed. Evidence expected: board approval minutes; governance framework with DORA responsibilities; training completion records; board agenda extracts.
Score 4: Management body actively challenges and scrutinises ICT risk reporting; individual performance objectives include DORA-related responsibilities; skills assessment conducted and gaps addressed; management body receives regular DORA programme updates. Evidence expected: challenge evidence in board minutes; performance objective documentation; skills assessment records; DORA programme reporting to board.

🔍 Commonly underdeveloped: Documented evidence of management body approval (not just awareness); training records for board members on digital operational resilience; individual accountability assignments at management body level rather than just delegation to the CIO or CISO.

Weighting and Prioritisation

Not all gaps carry equal regulatory weight. Once you've scored each domain, apply a weighting that reflects NCA enforcement priority and the practical consequence of a gap in each area. The following weightings reflect the relative regulatory stakes based on the domains NCAs have focused on most in supervisory activity since DORA became applicable.

  • D5 — Register of Information (Critical): direct NCA submission with hard deadlines; failures are immediately visible to regulators; first area of supervisory focus.
  • D2 — Incident Management (Critical): 4-hour notification failures are immediately apparent; regulatory consequences of missed reporting are severe and well-documented.
  • D4 — ICT Third-Party Risk (High): systemic risk concern for regulators; Article 30(2) gaps are verifiable through contract review; concentration risk is a macro-prudential priority.
  • D7 — Governance (High): management body accountability is a recurring NCA supervisory focus; individual accountability gaps are reputationally and legally significant.
  • D1 — ICT Risk Management (Medium): foundational requirement that NCA inspections will review; longer remediation timeframe but high baseline adequacy expected.
  • D3 — Testing (Medium): annual testing is a firm obligation; TLPT is significant for in-scope entities but applies only to a subset.
  • D6 — Information Sharing (Lower): voluntary; NCA scrutiny is lower relative to other domains; a good-faith assessment and participation decision is sufficient for most entities.

Turning Your Scores into an Action Plan

Once you've scored each domain and applied the weighting, you have two inputs for your action plan: the gap size (how far below 3 each domain scores) and the regulatory weight (how urgently that gap needs to close). Plotting these against each other gives you a prioritisation grid.

  • Critical weight: score 1 → 🔴 Immediate (stop everything else); score 2 → 🟠 Urgent (next 30 days); score 3+ → 🟡 Monitor (maintain and improve).
  • High weight: score 1 → 🟠 Urgent (next 30 days); score 2 → 🟡 Planned (next 60–90 days); score 3+ → 🟢 Monitor (continuous improvement).
  • Medium/Lower weight: score 1 → 🟡 Planned (next 90 days); score 2 → 🟢 Roadmap (next 6 months); score 3+ → 🟢 Done (sustain).

Any domain scoring 1 with Critical or High regulatory weight should produce an immediate escalation to the management body with a time-bound remediation plan. These are not backlog items — they represent the gaps most likely to produce supervisory consequences before the next annual submission cycle.
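
The grid is mechanical enough to automate. A minimal Python sketch that maps each domain's regulatory weight and maturity score to an urgency band; the weights mirror the table in the previous section, and the scores are placeholders for illustration only.

```python
# Regulatory weights from the table in the previous section.
WEIGHTS = {"D5": "Critical", "D2": "Critical", "D4": "High", "D7": "High",
           "D1": "Medium", "D3": "Medium", "D6": "Lower"}

# The prioritisation grid above, keyed by (weight band, capped score).
GRID = {
    ("Critical",     1): "Immediate: stop everything else",
    ("Critical",     2): "Urgent: next 30 days",
    ("Critical",     3): "Monitor: maintain and improve",
    ("High",         1): "Urgent: next 30 days",
    ("High",         2): "Planned: next 60-90 days",
    ("High",         3): "Monitor: continuous improvement",
    ("Medium/Lower", 1): "Planned: next 90 days",
    ("Medium/Lower", 2): "Roadmap: next 6 months",
    ("Medium/Lower", 3): "Done: sustain",
}

def urgency(weight: str, score: int) -> str:
    # Medium and Lower weights share a row in the grid.
    band = weight if weight in ("Critical", "High") else "Medium/Lower"
    return GRID[(band, min(score, 3))]

# Placeholder scores for illustration only.
scores = {"D1": 3, "D2": 3, "D3": 2, "D4": 2, "D5": 1, "D6": 1, "D7": 2}
for domain in sorted(scores, key=scores.get):
    print(f"{domain} (score {scores[domain]}, {WEIGHTS[domain]} weight): "
          f"{urgency(WEIGHTS[domain], scores[domain])}")
```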

Common Gap Profiles by Entity Type

Gap patterns vary significantly by entity type. Knowing which profile most resembles your organisation helps you calibrate where to focus your assessment effort and where you're most likely to find the critical gaps.

🏦 Credit institutions and large investment firms

These entities typically have the most mature ICT risk management frameworks — often built on EBA outsourcing guidelines, ISO 27001, or NIST — so D1 and D7 scores tend to be higher. The most common gaps are in D5 (Register of Information) at the operational output level (xBRL-CSV submission and sub-outsourcing completeness), and in D4 around the Article 30(2) contract clause review for legacy contracts signed before DORA. Incident classification under DORA's RTS thresholds (D2) is frequently underdeveloped relative to the maturity of the broader ITSM programme.

💳 Payment institutions and e-money institutions

These entities often have strong operational awareness of ICT dependencies — they're in the business of real-time payment processing and understand downtime risk intimately. But their formal governance documentation (D7) frequently lags their operational reality, and their ICT risk management framework (D1) may not have been updated to reflect DORA's specific requirements. The Register of Information (D5) is often the most significant project-scale gap.

🛡️ Insurance undertakings

Insurance entities frequently have strong risk governance frameworks from Solvency II, which helps D1 and D7. The biggest gaps tend to be in domains specific to DORA's ICT focus: digital resilience testing (D3) is often limited to basic vulnerability assessments without the DORA-required scope definition; ICT third-party risk (D4) is managed through existing vendor governance but hasn't been assessed for DORA-specific requirements; and the Register of Information (D5) is frequently in a very early stage for smaller insurance entities.

🚀 Fintechs and newer financial entities

Fintech entities often have high technical sophistication — cloud-native architectures, modern security tooling, DevSecOps practices — but formal governance documentation is frequently absent or thin. D7 (governance) and D1 (documented framework) are the most common low scores. The ICT dependency picture is often complex (high cloud concentration, many API-based integrations) which makes D4 and D5 scope definition challenging. The good news: their sub-outsourcing data is often more readily available because cloud providers are generally more transparent about their infrastructure chains than legacy IT providers.

When to Reassess

A gap assessment is a point-in-time view. DORA is a continuous obligation. Run a full reassessment in the following circumstances:

  • Annually, at minimum — ideally timed 4–5 months before your NCA submission deadline, giving enough time to act on what you find before the register needs to be finalised.
  • Following a significant ICT incident — a major incident is both a test of your D2 maturity and a signal that your D1 framework may have gaps that weren't apparent on paper.
  • After a material change in your ICT provider landscape — a new critical provider, an acquisition, or a significant contract renegotiation changes your D4 and D5 picture enough to warrant reassessment of those domains.
  • When DORA technical standards are updated — the ITS and RTS have already been revised since initial publication. Each update potentially changes what "score 3" looks like in one or more domains.
  • Before NCA supervisory visits or inspections — running a fresh assessment before a scheduled supervisory engagement gives you an accurate current picture and avoids the risk of presenting an outdated view of your readiness.

Know your DORA score before your NCA does.

Venvera's built-in gap assessment module maps your readiness across all seven domains, tracks remediation progress, and connects directly to your Register of Information — so your gap data and your compliance data stay in sync year-round.

Run your gap assessment → Venvera.com

Frequently Asked Questions

How long does a DORA gap assessment typically take?

For a standalone financial entity, a thorough gap assessment covering all seven domains takes between 4 and 8 weeks when done properly — including stakeholder interviews across ICT, compliance, legal, procurement, and the management body. For larger groups or entities with complex ICT landscapes, 10–12 weeks is more realistic. Assessments that run faster than 4 weeks usually mean evidence gathering has been superficial.

Should we use an external consultant for the DORA gap assessment?

External consultants bring objectivity and pattern recognition from working across multiple institutions. They're particularly valuable for D5 (Register of Information) where the technical xBRL requirements are specialised, and for D3 (testing) where TLPT applicability and tester selection involve regulatory nuance. However, an external assessment is only as good as the internal stakeholder access and evidence sharing that supports it. A hybrid approach — internal self-assessment validated by an external review — often produces the most actionable output.

Does a DORA gap assessment need to be shared with the NCA?

There is no mandatory requirement to submit a gap assessment to your NCA. However, NCAs may request evidence of your DORA compliance programme during supervisory visits, and a well-documented gap assessment with a remediation plan demonstrates proactive engagement with your obligations — which is generally viewed positively. Be cautious about gap assessments that document material failures without an accompanying remediation plan, as these can become supervisory findings if disclosed.

Can we use our existing ISO 27001 or NIST assessment as a DORA gap assessment?

Partially. Your ISO 27001 or NIST assessment gives you useful signal on D1 (ICT risk management framework) and D3 (testing), but it does not cover D5 (Register of Information), D4 (DORA-specific third-party requirements including Article 30(2)), D2 (DORA-specific incident classification and timelines), or D7 (management body accountability under Article 5). You would need to supplement any existing framework assessment with DORA-specific evaluation in these domains before you have a complete picture.

What score should we be targeting across all domains?

A score of 3 in every domain represents the regulatory baseline — you meet DORA's requirements as written. For Critical-weight domains (D5 and D2), targeting 4 is strongly advisable because the consequences of failure in these areas are immediate and visible to regulators. For Medium and Lower weight domains, reaching 3 and maintaining it is a reasonable target for most entities. The goal is not to achieve perfect scores everywhere — it is to ensure no high-weight domain is left below 2, and that any domain scoring 1 has an immediate remediation plan with board-level visibility.

Written by the Venvera compliance team. Venvera is a purpose-built DORA compliance platform for European financial entities. Last updated: February 2026.


Alexander Sverdlov

CEO & Founder

Alexander is the CEO and founder of Venvera, leading the development of multi-framework compliance solutions for European regulated entities.
