ERMITS Advisory · Supply Chain · Privacy · Threat

Illustrative demo only. Names, scores, and findings are placeholders—not an assessment of any real organization.

Sample report

Sector presets

Same brief layout for every industry—the preset swaps the illustrative institution and reframes language and priorities. For inputs, artifact levels, and evidence depth, see How it works.

Meridian Federal Credit Union

Cyber Exposure Brief™ · March 14, 2026 · Financial services (NCUA-supervised) · 280 employees · Multi-state

Elevated exposure · 9 priorities identified

How to read this brief

The Exposure Index and five domain scores name what is driving risk so priorities stay traceable—not a single opaque total.

The brief is built so leadership can follow the logic: what was measured, what it implies, and which actions matter first. Each domain below is a lens on exposure—not a separate silo—so discussion stays tied to outcomes instead of tool names.

On this demo page, treat scores and counts as illustrative. In a live engagement, the same layout carries your evidence, owners, and dates.

Operational exposure

Critical operations concentration, failure points, and recovery paths that shape outage and disruption risk.

Vendor pressure

Third-party evidence quality, tiering, concentration, and whether alternates or exit plans exist.

Security gap

Gaps in control execution that raise exploitability—identity, backups, detection, and response readiness.

Data sensitivity

How sensitive the data is and how well classification, retention, and disposal limit blast radius.

Regulatory pressure

Supervisory and audit expectations plus how ready evidence is when pressure spikes.

Methodology & evidence notes

Domain scores aggregate indicators tied to evidence; priorities rank by severity, exploitability, and time-to-impact—not a plain average of domains.
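As a hedged illustration of the ranking idea (not the product's actual formula), priorities can be ordered by a score that combines severity, exploitability, and how soon the risk can materialize. All names, scales, and weights below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Priority:
    name: str
    severity: float        # 1-5: impact if the risk is realized
    exploitability: float  # 1-5: ease of exploitation today
    days_to_impact: int    # shorter window = higher urgency

def risk_velocity(p: Priority) -> float:
    # Hypothetical score: severity and exploitability scale the risk;
    # a short time-to-impact window amplifies it.
    return p.severity * p.exploitability / max(p.days_to_impact, 1)

findings = [
    Priority("MFA gap on admin paths", 5, 4, 30),
    Priority("Untested backup recovery", 5, 3, 60),
    Priority("Stale vendor attestations", 3, 2, 90),
]

ranked = sorted(findings, key=risk_velocity, reverse=True)
for p in ranked:
    print(f"{p.name}: {risk_velocity(p):.2f}")
```

Note how this ordering differs from sorting by severity alone: two equally severe findings separate once exploitability and the time window are factored in.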

Typical inputs

  • Identity & access: MFA, privileged paths, lifecycle, admin hardening.
  • Resilience: backup coverage, immutability, tested recovery, credible RTO/RPO.
  • Third party: tiering, attestation currency, concentration, alternates.
  • Data: classification, retention and disposal, sensitive flows.
  • Governance: policy vs practice, audit trail, exception handling.

How scoring behaves

  • Composite: concentration can dominate (for example one critical vendor path).
  • Evidence: current proof lowers uncertainty; missing proof raises it.
  • Priorities: owner, target window, and “done” evidence—not a generic finding list.
  • Traceability: each issue points back to the domain bars in this brief.
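One way the "concentration can dominate" behavior can be sketched (a hypothetical blend, not the proprietary Exposure Index) is to mix the average of the domain scores with the single worst domain, so one concentrated weakness pulls the composite up:

```python
def composite_exposure(domains: dict[str, float], alpha: float = 0.6) -> float:
    """Hypothetical composite: a weighted blend of the average domain
    exposure and the single highest-exposure domain, so one concentrated
    weakness can dominate the total. alpha is an illustrative weight."""
    mean = sum(domains.values()) / len(domains)
    worst = max(domains.values())
    return alpha * mean + (1 - alpha) * worst

# The five illustrative domain scores from this demo brief (0-100, higher = more exposure).
scores = {
    "Operational exposure": 58,
    "Vendor pressure": 41,
    "Security gap": 36,
    "Data sensitivity": 60,
    "Regulatory pressure": 71,
}

print(round(composite_exposure(scores), 1))
```

With these inputs the blend lands above the plain average, which is the intended effect: a single concentrated dependency raises the composite even when the other domains look moderate.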

Illustrative scores on this demo page only.

Cyber Exposure Index™

Composite exposure sits in the elevated band, driven by identity and third-party concentration.
The lowest domain scores are Security gap (36) and Vendor pressure (41); strong governance execution can still be undermined by concentrated dependencies and weak recovery readiness. Use the ranked priorities below to sequence remediation before your next supervisory touchpoint.

  • Operational exposure: 58
  • Vendor pressure: 41
  • Security gap: 36
  • Data sensitivity: 60
  • Regulatory pressure: 71

Domain exposure shape

Each spoke is one domain score on a 0–100 scale (higher = more exposure). The amber polygon connects those five points so you can see the overall “bulge” of the profile at a glance.

Illustrative demo
Scores on each axis 0–100 (exposure)
  • Operational exposure (Op): 58
  • Vendor pressure (Vn): 41
  • Security gap (Se): 36
  • Data sensitivity (Da): 60
  • Regulatory pressure (Re): 71

Same values as the domain bars above — the radar is a second view of the same five numbers.

Ranked priorities (by risk velocity)

Domain breakdown

Operational exposure

What this means: how quickly disruption becomes member impact due to operational concentration and recovery dependencies.

Critical operations mapped: 72%
Single points of failure surfaced: 48%
Recovery dependencies documented: 54%

Vendor pressure

What this means: how much critical service delivery depends on third parties, and whether evidence and exit readiness match that dependency.

Critical vendor inventory accuracy: 48%
Alternate provider readiness: 44%

Security gap

What this means: the practical distance between current controls and what is required to resist common attack paths.

MFA coverage (member + admin): 62%
Backup recovery tested: 38%

Data sensitivity

What this means: where regulated/PII data lives and flows, and whether retention/disposal practices reduce the blast radius of an incident.

High-risk data flows mapped: 56%
Retention & disposal evidence: 40%

Regulatory pressure

What this means: the level of supervisory expectation and how prepared the institution is to produce audit-ready evidence on demand.

Policy-to-evidence traceability: 52%
Audit-ready evidence package: 46%

30 / 60 / 90 cascade narrative

Execution spine (illustrative)

Days 1–30: Enforce MFA on all member and admin paths; freeze net-new privileged accounts without phishing-resistant factors; open vendor evidence requests for top five concentration vendors.

Days 31–60: Classify member data flows by sensitivity; align retention to GLBA-aligned schedules; tabletop ransomware with legal, compliance, and member comms.

Days 61–90: Formalize vendor tiering and attestation calendar; publish board-ready metrics on operational exposure, vendor pressure, and recovery KPIs tied to the Exposure Index.

Assumptions, limitations, and recommended next steps

Assumptions & limitations

  • Snapshot in time: exposure changes with vendors, identity posture, and operations.
  • Evidence quality matters: missing/expired artifacts increase uncertainty and may elevate results.
  • Not a certification: this is a diagnostic brief, not actuarial assurance or compliance attestation.

Next steps (what “good” looks like)

  • Assign owners for each priority and confirm target windows (72h / 30d / 60d).
  • Collect “done” evidence (exports, tickets, tabletop minutes, vendor attestations).
  • Re-score after remediation to confirm exposure reduction and update board reporting.

Six findings. Three owners needed. Ninety days to audit-ready.

The advisory review connects this brief to playbooks, evidence requests, and workshop cadence—without replacing your NCUA or internal audit program.

Request an advisory review