Why Maturity Measurement Matters
Zero Trust is not a binary state. You do not wake up one morning with Zero Trust fully implemented. It is a spectrum, and every organization sits at a different point on that spectrum for each pillar of the architecture. Without a structured way to measure where you are, you cannot determine where to invest next, whether your investments are producing results, or how to communicate progress to leadership.
Maturity measurement also prevents the most common organizational failure: declaring victory too early. An organization that deploys MFA and ZTNA may feel it has achieved Zero Trust. Without maturity metrics that encompass microsegmentation, data protection, workload identity, and continuous monitoring, significant gaps remain invisible to leadership and unaddressed in the budget.
The Cybersecurity and Infrastructure Security Agency (CISA) published a Zero Trust Maturity Model in 2023 that provides a structured framework for measuring progress across five pillars. This model, combined with practical metrics derived from operational data, gives engineers and security leaders a concrete way to assess and communicate Zero Trust maturity.
The CISA Zero Trust Maturity Model
CISA’s model defines maturity across five pillars. This chapter evaluates each pillar at three levels: Traditional, Advanced, and Optimal. (CISA’s 2023 revision also defines an intermediate Initial stage between Traditional and Advanced; the three-level view here collapses it into the transition between those two.) Understanding these pillars and levels provides the scaffolding for a practical maturity assessment.
Pillar 1: Identity
The identity pillar measures how robustly your organization verifies the identity of users and entities requesting access.
- Traditional: Passwords are the primary authentication mechanism. MFA is deployed for some users, typically administrators. Identity is managed by a single directory (Active Directory) with limited federation. Access decisions are based on group membership, and entitlements are reviewed annually or not at all.
- Advanced: MFA is enforced for all user accounts. Conditional access policies evaluate device posture and location before granting access. Just-in-time access is implemented for administrative accounts. Identity governance tools perform quarterly access reviews, and unused entitlements are automatically flagged for revocation.
- Optimal: Phishing-resistant MFA (FIDO2, certificate-based) is the standard. Continuous authentication re-evaluates identity throughout sessions. Behavioral analytics detect anomalous authentication patterns in real time. Workload identity (non-human entities) uses certificate-based authentication with short-lived credentials. Entitlements are continuously evaluated and right-sized based on actual usage.
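The short-lived workload credentials at the Optimal level can be sketched in a few lines. This is an illustrative model of the lifetime semantics only, not any particular product’s API; in practice the credential would be a signed certificate issued by a workload identity system, and the SPIFFE-style subject string below is a hypothetical example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class WorkloadCredential:
    subject: str        # workload identity, e.g. a SPIFFE-style ID (illustrative)
    issued_at: datetime
    ttl: timedelta      # a short lifetime forces frequent re-attestation

    @property
    def expires_at(self) -> datetime:
        return self.issued_at + self.ttl

    def is_valid(self, now: Optional[datetime] = None) -> bool:
        now = now or datetime.now(timezone.utc)
        return now < self.expires_at

def issue_credential(subject: str, ttl_minutes: int = 60) -> WorkloadCredential:
    # In a real deployment this would be a CA-signed certificate;
    # here we model only the issuance time and expiry.
    return WorkloadCredential(
        subject, datetime.now(timezone.utc), timedelta(minutes=ttl_minutes)
    )
```

The point of the short TTL is that a stolen credential is useful for at most an hour, and every renewal is a fresh opportunity for the policy engine to re-verify the workload.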
Pillar 2: Devices
The device pillar assesses how your organization verifies the health and compliance of devices requesting access.
- Traditional: Corporate devices are managed through a basic MDM solution. Device compliance is checked at enrollment but not continuously. BYOD devices have unrestricted access to corporate resources. There is no integration between device posture and access decisions.
- Advanced: Device compliance is evaluated at access time. Non-compliant devices (missing patches, disabled disk encryption, absent EDR agent) are denied access to sensitive resources. The device inventory is comprehensive and continuously updated. BYOD access is limited to a subset of resources through containerization or virtual desktop solutions.
- Optimal: Device health is continuously monitored and fed into the policy engine in real time. If a device’s posture degrades mid-session (EDR detects a threat, patch compliance lapses), access is automatically revoked or stepped up. Device certificates are managed through an automated certificate lifecycle, and hardware attestation (TPM-based) verifies device integrity at boot.
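The mid-session posture logic at the Optimal level amounts to re-running a compliance check on every posture event and mapping the result to an access action. A minimal sketch, assuming a hypothetical posture record with `edr_healthy`, `disk_encrypted`, and `patch_age_days` fields fed from the device-management pipeline:

```python
def posture_compliant(device: dict) -> bool:
    # All signals must hold continuously, not just at login.
    # The 14-day patch window is an illustrative policy, not a standard.
    return (
        device["edr_healthy"]
        and device["disk_encrypted"]
        and device["patch_age_days"] <= 14
    )

def reevaluate_session(device: dict) -> str:
    """Re-run on every posture event from the device-management feed."""
    if posture_compliant(device):
        return "allow"
    if device["edr_healthy"]:
        return "step_up"  # posture slipped but no active threat: demand re-auth
    return "revoke"       # EDR unhealthy or threat detected: kill the session
```

The key design choice is that the decision distinguishes degraded posture (step-up authentication) from an active threat signal (immediate revocation), rather than treating all non-compliance identically.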
Pillar 3: Networks
The network pillar evaluates segmentation, traffic encryption, and network-level access controls.
- Traditional: Network segmentation is coarse-grained (DMZ, internal, guest). East-west traffic is largely uninspected. VPNs provide full network access to remote users. DNS queries are unfiltered and unmonitored.
- Advanced: Microsegmentation is implemented for high-value assets. East-west traffic between segments is filtered and logged. ZTNA replaces VPN for new application access patterns. DNS filtering blocks connections to known malicious domains.
- Optimal: Microsegmentation covers all workloads, with policies derived from observed traffic patterns and continuously refined. All traffic (north-south and east-west) is encrypted via mTLS. Network access decisions are identity-aware, not just IP-based. Software-defined perimeters hide infrastructure from unauthorized users, making reconnaissance infeasible.
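Deriving microsegmentation policies from observed traffic, as the Optimal level describes, can be sketched as a learning pass over flow logs followed by default-deny enforcement. The tuple shape below is a simplification for illustration; real flow records carry far more fields.

```python
from collections import defaultdict

def derive_segment_policy(flow_logs):
    """Build a least-privilege allow-list from observed east-west flows.

    flow_logs: iterable of (src_workload, dst_workload, dst_port) tuples
    captured during a learning window.
    """
    allowed = defaultdict(set)
    for src, dst, port in flow_logs:
        allowed[(src, dst)].add(port)
    return allowed

def is_allowed(policy, src, dst, port) -> bool:
    # Default deny: any pair/port not observed during learning is blocked.
    return port in policy.get((src, dst), set())
```

In practice the learned policy is reviewed before enforcement (the learning window may capture traffic that should not be permitted), and then continuously refined as workloads change.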
Pillar 4: Applications and Workloads
- Traditional: Applications authenticate users at login and maintain persistent sessions. Service-to-service communication uses static API keys or shared secrets. Application security testing occurs periodically, if at all.
- Advanced: Applications integrate with the centralized identity provider for authentication and authorization. Service-to-service communication uses short-lived tokens or mTLS. Security testing is integrated into the CI/CD pipeline. Application-level access logging feeds into the SIEM.
- Optimal: Applications implement fine-grained authorization (attribute-based access control or relationship-based access control). Service meshes enforce zero-trust communication policies between all services. Runtime application self-protection (RASP) detects and blocks exploits in real time. Continuous authorization evaluates user behavior within the application and can terminate sessions based on anomalous activity.
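The fine-grained, attribute-based authorization at the Optimal level combines subject, resource, and request-context attributes in a single decision. A minimal sketch with hypothetical attribute names (the 0.7 risk threshold is illustrative, not a standard value):

```python
def authorize(subject: dict, resource: dict, action: str, context: dict) -> bool:
    """Attribute-based check: deny unless every applicable condition holds."""
    # Resource ownership: subjects only reach their own department's data.
    if subject["department"] != resource["owning_department"]:
        return False
    # Device posture gates restricted data regardless of who is asking.
    if resource["classification"] == "restricted" and not context["device_compliant"]:
        return False
    # High-risk sessions (behavioral signal) lose export rights first.
    if action == "export" and context["risk_score"] > 0.7:
        return False
    return True
```

Because the decision takes live context (device posture, behavioral risk) as input, re-invoking it during the session is what turns point-in-time authorization into the continuous authorization the Optimal level calls for.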
Pillar 5: Data
- Traditional: Data classification is informal or nonexistent. Encryption at rest is applied to some databases. Data loss prevention is limited to email scanning. Data access is controlled at the application level with no centralized data governance.
- Advanced: Data is classified by sensitivity level, and access policies vary by classification. Encryption at rest and in transit is standard for all data stores. DLP policies are enforced across email, cloud storage, and endpoints. Data access is logged and auditable.
- Optimal: Data-centric security policies travel with the data regardless of location. Rights management controls persist even when data is shared externally. Data access decisions incorporate user context, device posture, and behavioral signals. Anomalous data access patterns (bulk downloads, access to unusual datasets) trigger automated investigation and response.
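The bulk-download detection mentioned at the Optimal level can be approximated with a per-user baseline comparison. This is a deliberately simple sketch (count-based, 5x multiplier chosen for illustration); production anomaly detection would also weigh dataset sensitivity, time of day, and access novelty.

```python
from collections import Counter

def flag_bulk_access(access_log, baseline_daily_max, multiplier=5):
    """Flag users whose daily object count exceeds N-times their historical max.

    access_log: list of (user, object_id) events for one day.
    baseline_daily_max: dict mapping user -> historical max daily object count.
    Returns the set of users to route to automated investigation.
    """
    counts = Counter(user for user, _ in access_log)
    return {
        user
        for user, n in counts.items()
        if n > multiplier * baseline_daily_max.get(user, 1)
    }
```

Flagged users would then feed the automated investigation-and-response path the pillar describes, rather than generating an analyst ticket for every event.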
Practical Metrics for Engineering Teams
While the CISA model provides strategic direction, engineering teams need operational metrics that can be measured continuously and acted upon. These metrics should be derived from system telemetry, not self-assessments.
- MFA coverage percentage: The percentage of user accounts and service accounts that require MFA for authentication. Target: 100% for user accounts, with service accounts using certificate-based or workload identity authentication.
- Average credential lifetime: The mean duration of active credentials across all credential types (user sessions, API tokens, service account passwords, TLS certificates). Lower is better. A mature Zero Trust environment has average credential lifetimes measured in hours, not months.
- Standing privilege ratio: The percentage of administrative permissions that are permanently assigned versus just-in-time. A ratio approaching zero indicates mature privilege management.
- Microsegmentation coverage: The percentage of workloads protected by microsegmentation policies (as opposed to operating in flat network segments). Measure at the workload level, not the VLAN level.
- East-west encryption percentage: The percentage of internal service-to-service traffic encrypted with TLS or mTLS. In a mature Zero Trust environment, this should approach 100%.
- Mean time to revoke (MTTR): The average time between detection of a compromised credential or non-compliant device and revocation of its access. This metric reveals the effectiveness of your automated response capabilities. Target: under 5 minutes for automated scenarios.
- Policy violation rate: The number of access requests that are denied by Zero Trust policies, categorized by reason (failed MFA, non-compliant device, unauthorized resource access, anomalous behavior). Tracking this over time reveals both security improvements and potential policy misconfigurations.
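Several of these metrics reduce to simple ratios over telemetry records, which is exactly why they belong in a dashboard rather than a self-assessment spreadsheet. A sketch of three of them, assuming hypothetical record shapes exported from your identity provider, PAM tool, and flow logs:

```python
def mfa_coverage(accounts) -> float:
    """Percent of user accounts with MFA enforced.

    accounts: list of dicts with 'type' ('user' or 'service') and
    'mfa_enforced' (bool). Service accounts are excluded here because the
    target for them is workload identity, not MFA.
    """
    users = [a for a in accounts if a["type"] == "user"]
    return 100.0 * sum(a["mfa_enforced"] for a in users) / len(users)

def standing_privilege_ratio(admin_grants) -> float:
    """Percent of admin grants that are permanent rather than just-in-time.

    admin_grants: list of dicts with a 'just_in_time' boolean.
    """
    standing = sum(not g["just_in_time"] for g in admin_grants)
    return 100.0 * standing / len(admin_grants)

def east_west_encryption_pct(flows) -> float:
    """Percent of internal service-to-service flows carried over TLS/mTLS.

    flows: list of dicts with an 'encrypted' boolean (east-west traffic only).
    """
    return 100.0 * sum(f["encrypted"] for f in flows) / len(flows)
```

The field names are illustrative; the important property is that each number is computed from exported system state on a schedule, so the trend line is reproducible and arguments about it are arguments about data.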
Conducting a Maturity Assessment
A practical maturity assessment combines the strategic CISA framework with operational metrics. The process involves four steps:
- Baseline measurement: For each of the five CISA pillars, determine your current maturity level (Traditional, Advanced, or Optimal). Support each assessment with data from operational metrics. If you claim Advanced maturity in the identity pillar, your MFA coverage metric should demonstrate near-100% deployment.
- Gap analysis: For each pillar, identify the specific capabilities required to advance to the next maturity level. These gaps become your engineering backlog. Prioritize based on risk impact: advancing the identity pillar from Traditional to Advanced (deploying universal MFA) likely reduces more risk than advancing the data pillar from Advanced to Optimal.
- Target setting: Define a target maturity level for each pillar based on your organization’s risk appetite, regulatory requirements, and resource constraints. Not every pillar needs to reach Optimal. An organization with minimal regulated data may target Optimal for identity and network but Advanced for data.
- Progress tracking: Measure operational metrics monthly and reassess pillar maturity quarterly. Publish a maturity scorecard that shows progress across all pillars. This scorecard becomes the primary communication tool for executive reporting and budget justification.
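The quarterly scorecard from the four steps above can be as simple as a table of current level, target level, and remaining gap per pillar. A minimal sketch (the level-to-number mapping and output format are illustrative choices, not part of the CISA model):

```python
PILLARS = ["identity", "devices", "networks", "applications", "data"]
LEVELS = {"traditional": 1, "advanced": 2, "optimal": 3}

def scorecard(current: dict, target: dict) -> list:
    """Render one line per pillar: current level, target level, remaining gap.

    current, target: dicts mapping pillar name -> level name.
    """
    lines = []
    for pillar in PILLARS:
        gap = LEVELS[target[pillar]] - LEVELS[current[pillar]]
        status = "at target" if gap <= 0 else f"{gap} level(s) to go"
        lines.append(
            f"{pillar:14s} {current[pillar]:12s} -> {target[pillar]:12s} ({status})"
        )
    return lines
```

Publishing this same table every quarter, backed by the operational metrics as evidence for each claimed level, is what turns the assessment into the executive communication tool the fourth step describes.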
Avoiding Assessment Pitfalls
Maturity assessments are only useful if they are honest. Common pitfalls include:
- Overrating maturity based on deployed tools rather than enforced policies. Deploying MFA does not count if exceptions exist for 30% of accounts.
- Measuring coverage based on intended scope rather than actual scope. Microsegmentation of the production environment means nothing if the staging environment is flat and contains production data clones.
- Treating the assessment as a compliance exercise rather than an engineering tool.
The value of maturity measurement is not in the score itself but in the clarity it provides. It transforms “we need to improve our security” from a vague aspiration into “we need to increase our east-west encryption percentage from 40% to 90% by Q3, which requires deploying mTLS across the payment processing service mesh.” Specificity drives action, and action drives maturity.
