ERMITS Advisory Supply Chain · Privacy · Threat

Illustrative demo only. This sample explains how the Cyber Exposure Brief is structured and how sector context changes the narrative, priorities, and reporting emphasis. Any names, scores, and findings shown below are illustrative examples—not an assessment of any real institution.

Sample report

Sector context changes the same reporting system

Use these presets to see how score interpretation, management language, and action priorities shift by industry without changing the reporting structure.

Illustrative only. The value here is the reporting logic, not the placeholder names or numbers. The "How it works" section describes required inputs, artifact levels, and how evidence depth scales across engagements.

Sector presets

Choose an industry version

Each preset keeps the same layout while adjusting operating model, regulatory pressure, and executive narrative.

Meridian Federal Credit Union

Cyber Exposure Brief™ · March 14, 2026 · Financial services (NCUA-supervised) · 280 employees · Multi-state

Elevated exposure · 9 priorities identified

How to read this brief

This report summarizes cyber exposure using a composite index and five domains. The intent is to make the output board-ready: clear definitions, traceable drivers, and an action list sequenced by risk velocity. Where this demo shows numeric values, treat them as illustrative placeholders—the important part is what each dimension means and how it informs priorities.

Operational exposure

Concentration of critical operations, single points of failure, and recovery dependencies that increase outage and disruption risk.

Vendor pressure

Third-party dependency risk: evidence quality, tiering discipline, concentration, and alternate-provider readiness.

Security gap

Control coverage and execution gaps that increase exploitability (identity assurance, backup recovery, detection and response readiness).

Data sensitivity

Sensitivity of data handled (PII and regulated data) and the maturity of classification, retention, and disposal practices that limit blast radius.

Regulatory pressure

External pressure factors that raise compliance risk (supervisory expectations, audit cadence, and evidence readiness).

Methodology & evidence notes

The Exposure Index and domain scores are designed to be explainable: every score is supported by specific indicators, each indicator maps to observable evidence, and the “Ranked priorities” list is generated by combining severity, exploitability, and time-to-impact (risk velocity).
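The severity-exploitability-velocity blend described above can be made concrete. The sketch below is a minimal illustration, not the published formula: the multiplicative weighting, field names, and example findings are all assumptions chosen to show how shorter time-to-impact pushes an item up the ranked list.

```python
from dataclasses import dataclass

@dataclass
class Priority:
    name: str
    severity: float        # 0-1: impact if the risk is realized
    exploitability: float  # 0-1: how easily the gap can be exploited
    days_to_impact: int    # how soon impact could plausibly land

def risk_velocity(p: Priority) -> float:
    # Hypothetical blend: severity and exploitability scale the score,
    # and a shorter time-to-impact amplifies it (illustrative weighting).
    return p.severity * p.exploitability * (30.0 / max(p.days_to_impact, 1))

findings = [
    Priority("Backup recovery untested", 0.9, 0.6, 30),
    Priority("MFA gaps on admin paths", 0.8, 0.9, 7),
    Priority("Single critical vendor", 0.7, 0.4, 60),
]
ranked = sorted(findings, key=risk_velocity, reverse=True)
```

With these placeholder inputs, the MFA gap ranks first even though the backup finding has higher severity, because its time-to-impact is much shorter.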

Typical inputs

  • Identity & access: MFA coverage, privileged access, account lifecycle, admin path hardening.
  • Resilience: backup scope, immutability, recovery testing, RTO/RPO realism.
  • Third party: vendor tiering, evidence currency, concentration, alternates and exit plans.
  • Data: classification, retention/disposal, encryption boundaries, high-risk flows.
  • Governance: policies vs execution signals, auditability, exception discipline.

How scoring behaves

  • Composite ≠ average: concentration risks can dominate (e.g., one critical vendor).
  • Evidence-weighted: stronger, current evidence can reduce uncertainty; missing evidence increases it.
  • Actionable outputs: priorities describe an owner, target window, and “done” evidence.
  • Explainability: each priority ties back to the domain drivers shown below.
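Two of these behaviors, "composite ≠ average" and evidence weighting, can be sketched in a few lines. This is an assumed toy formula, not the ERMITS methodology: the 50/50 mean-versus-worst blend and the 10-point evidence penalty are illustrative choices that show how one dominant domain and stale evidence each pull the index upward.

```python
def composite_exposure(domains: dict[str, float], evidence: dict[str, float]) -> float:
    """Illustrative composite; not the published ERMITS formula.

    domains:  score per domain, 0-100 (higher = more exposure)
    evidence: evidence currency per domain, 0-1 (1 = strong, current)
    """
    # Missing or stale evidence widens uncertainty, nudging scores upward.
    adjusted = {d: min(100.0, s + (1.0 - evidence[d]) * 10.0)
                for d, s in domains.items()}
    mean = sum(adjusted.values()) / len(adjusted)
    worst = max(adjusted.values())
    # Composite != average: the worst domain pulls the index toward
    # itself, so one concentrated risk can dominate the headline number.
    return 0.5 * mean + 0.5 * worst
```

Run against the five demo scores below with fully current evidence, this lands above the plain average precisely because the worst domain is weighted in.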

Demo note: values shown here are illustrative and meant to demonstrate the reporting structure.

Cyber Exposure Index™

Composite exposure sits in the elevated band, driven by identity and third-party concentration.
The lowest domain scores are Security gap (36) and Vendor pressure (41), yet strong governance execution can still be undermined by concentrated dependencies and weak recovery readiness. Use the ranked priorities below to sequence remediation before your next supervisory touchpoint.

  • Operational exposure: 58
  • Vendor pressure: 41
  • Security gap: 36
  • Data sensitivity: 60
  • Regulatory pressure: 71

Domain exposure shape

Each spoke is one domain score on a 0–100 scale (higher = more exposure). The amber polygon connects those five points so you can see the overall “bulge” of the profile at a glance.
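The spoke geometry can be made explicit: each score becomes a point at radius score/100 along its axis, and connecting the points in order draws the polygon. The axis order, starting angle, and clockwise direction below are assumed layout choices for illustration.

```python
import math

scores = [("Operational exposure", 58), ("Vendor pressure", 41),
          ("Security gap", 36), ("Data sensitivity", 60),
          ("Regulatory pressure", 71)]

def radar_points(scores, radius=1.0):
    """Map each 0-100 score to an (x, y) point on its spoke.

    Axes start at 12 o'clock and proceed clockwise (assumed layout);
    joining the points in order yields the exposure polygon.
    """
    n = len(scores)
    pts = []
    for i, (_, val) in enumerate(scores):
        theta = math.pi / 2 - 2 * math.pi * i / n
        r = radius * val / 100.0
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

The first point sits straight up at 58% of the axis length; a higher score in any domain pushes its vertex outward, producing the visible "bulge".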

[Illustrative radar chart: each of the five domain scores plotted on a 0–100 exposure axis (Operational exposure 58, Vendor pressure 41, Security gap 36, Data sensitivity 60, Regulatory pressure 71). The radar is a second view of the same five numbers shown in the domain bars above.]

Ranked priorities (by risk velocity)

Domain breakdown

Operational exposure

What this means: how quickly disruption becomes member impact due to operational concentration and recovery dependencies.

  • Critical operations mapped: 72%
  • Single points of failure surfaced: 48%
  • Recovery dependencies documented: 54%

Vendor pressure

What this means: how much critical service delivery depends on third parties, and whether evidence and exit readiness match that dependency.

  • Critical vendor inventory accuracy: 48%
  • Alternate provider readiness: 44%

Security gap

What this means: the practical distance between current controls and what is required to resist common attack paths.

  • MFA coverage (member + admin): 62%
  • Backup recovery tested: 38%

Data sensitivity

What this means: where regulated/PII data lives and flows, and whether retention/disposal practices reduce the blast radius of an incident.

  • High-risk data flows mapped: 56%
  • Retention & disposal evidence: 40%

Regulatory pressure

What this means: the level of supervisory expectation and how prepared the institution is to produce audit-ready evidence on demand.

  • Policy-to-evidence traceability: 52%
  • Audit-ready evidence package: 46%

30 / 60 / 90 cascade narrative

Execution spine (illustrative)

Days 1–30: Enforce MFA on all member and admin paths; freeze net-new privileged accounts without phishing-resistant factors; open vendor evidence requests for top five concentration vendors.

Days 31–60: Classify member data flows by sensitivity; align retention to GLBA-aligned schedules; tabletop ransomware with legal, compliance, and member comms.

Days 61–90: Formalize vendor tiering and attestation calendar; publish board-ready metrics on operational exposure, vendor pressure, and recovery KPIs tied to the Exposure Index.
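A cascade like this is easiest to govern when each action carries an owner, a window, and its "done" evidence, as the methodology notes describe. The structure below is a hypothetical tracking sketch: the field names, owners, and evidence labels are placeholders, not an ERMITS schema.

```python
# Hypothetical tracking structure for the 30/60/90 cascade; owners and
# evidence labels are placeholders, not an ERMITS schema.
cascade = [
    {"window": "Days 1-30",  "action": "Enforce MFA on member and admin paths",
     "owner": "IT Security", "done_evidence": "IdP coverage export"},
    {"window": "Days 31-60", "action": "Classify member data flows by sensitivity",
     "owner": "Compliance",  "done_evidence": "Data-flow register"},
    {"window": "Days 61-90", "action": "Formalize vendor tiering and attestation calendar",
     "owner": "Vendor Mgmt", "done_evidence": "Board-ready metrics pack"},
]

def open_items(cascade, completed: set[str]) -> list[str]:
    """Return actions whose 'done' evidence has not yet been collected."""
    return [c["action"] for c in cascade if c["done_evidence"] not in completed]
```

Feeding in the set of collected artifacts then yields the remaining open actions, which is the same question the re-scoring step asks at the end of the engagement.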

Assumptions, limitations, and recommended next steps

Assumptions & limitations

  • Snapshot in time: exposure changes with vendors, identity posture, and operations.
  • Evidence quality matters: missing/expired artifacts increase uncertainty and may elevate results.
  • Not a certification: this is a diagnostic brief, not actuarial assurance or compliance attestation.

Next steps (what “good” looks like)

  • Assign owners for each priority and confirm target windows (72h / 30d / 60d).
  • Collect “done” evidence (exports, tickets, tabletop minutes, vendor attestations).
  • Re-score after remediation to confirm exposure reduction and update board reporting.

Six findings. Three owners needed. Ninety days to audit-ready.

The advisory review connects this brief to playbooks, evidence requests, and workshop cadence—without replacing your NCUA or internal audit program.

Request an advisory review