What the Security Assessment Domain Is For
The Security Assessment domain establishes the self-governance layer of the CMMC Level 2 framework. Its four requirements address how the contractor documents its security program, how it assesses the effectiveness of its own controls, how it plans remediation for the deficiencies those assessments reveal, and how it monitors the program on an ongoing basis. The domain is small in control count, but its output includes the most consequential artifacts in the entire compliance program. The System Security Plan is the foundational document. The Plan of Action and Milestones is the living remediation record. The assessment findings and monitoring output together determine whether the program is operating as documented.
The practitioner reading of CA is that it is the domain through which contractors demonstrate they have a program rather than just a set of controls. A contractor can implement every other control correctly and still fail assessment if the SSP does not accurately describe the implementation, if the POA&M does not reflect the remediation work actually underway, if the self-assessment was perfunctory, or if continuous monitoring is not occurring. CA is where the compliance program as a whole becomes visible and assessable.
The GAO has identified SSP deficiencies as one of the two primary drivers of CMMC readiness failures, with only 1 percent of contractors reporting full readiness and a median SPRS score of 60 against a required 110. A substantial portion of those deficiencies trace back to CA.3.12.4 on SSP development and CA.3.12.1 on self-assessment. The CMMC Phase 1 Realities white paper examines this pattern in depth, including why GRC tools do not produce compliant SSPs and why self-assessment scores frequently diverge from what a subsequent C3PAO assessment will find.
The Four Controls
CA.L2-3.12.1 Security Control Assessment
Security controls in organizational systems must be periodically assessed to determine whether the controls are effective in their application. The control requires self-assessment on a defined cadence, with documented findings that demonstrate whether each control is working as intended. The assessment must be substantive rather than ceremonial. A self-assessment that confirms every control is fully implemented without supporting evidence does not produce the findings that 3.12.1 expects. The outputs of this control feed directly into 3.12.2 for remediation planning and into the SPRS score that contractors post for DoD visibility. Self-assessment findings that diverge substantially from what a C3PAO subsequently identifies create the False Claims Act exposure that attaches to SPRS affirmations under 32 CFR § 170.22.
View the CA.3.12.1 reference card →
CA.L2-3.12.2 Plan of Action
Plans of action must be developed and implemented to correct deficiencies and reduce or eliminate vulnerabilities in organizational systems. The Plan of Action and Milestones is the living record of remediation work, naming each open deficiency, the planned corrective action, the assigned responsibility, and the scheduled completion date. The POA&M must reflect the reality of the remediation work, including revised timelines when work is delayed and closure records when work is completed. During C3PAO assessment, the POA&M is evaluated for completeness, accuracy, and evidence of actual progress. A POA&M with open items that have been pending for years is not the same as a POA&M with items that are actively being worked.
View the CA.3.12.2 reference card →
CA.L2-3.12.3 Security Control Monitoring
Security controls must be monitored on an ongoing basis to ensure the continued effectiveness of the controls. The control requires that monitoring happen continuously rather than only during the periodic assessment cycle. Continuous monitoring bridges the gap between formal assessments and provides early detection when controls drift from their documented state. In practice, continuous monitoring combines technical mechanisms (automated configuration compliance checks, vulnerability scanning, log review) with operational discipline (periodic spot checks, review of change activity, response to anomalies). The monitoring output feeds back into 3.12.1 assessments and 3.12.2 POA&M updates, making the assessment program adaptive rather than point-in-time.
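One of the technical mechanisms mentioned above, automated configuration compliance checking, amounts to comparing observed system state against the documented baseline and reporting drift. A minimal sketch, with hypothetical setting names and values that stand in for whatever a real configuration-management or scanning tool would report:

```python
# Documented baseline settings (hypothetical examples, not CMMC-mandated values).
documented_baseline = {
    "password_min_length": 14,
    "session_lock_minutes": 15,
    "fips_mode": True,
}

def detect_drift(observed: dict) -> dict:
    """Return each setting whose observed value differs from the baseline."""
    return {
        key: {"expected": expected, "observed": observed.get(key)}
        for key, expected in documented_baseline.items()
        if observed.get(key) != expected
    }

# In practice `observed` would be pulled from a scanning or config-management tool.
observed = {"password_min_length": 8, "session_lock_minutes": 15, "fips_mode": True}
print(detect_drift(observed))
# prints {'password_min_length': {'expected': 14, 'observed': 8}}
```

Run on a schedule, a check like this surfaces drift between formal assessments, which is exactly the gap 3.12.3 exists to close.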
View the CA.3.12.3 reference card →
CA.L2-3.12.4 System Security Plan
System security plans must be developed, documented, and periodically updated to describe system boundaries, system environments of operation, how security requirements are implemented, and the relationships with or connections to other systems. The SSP is the most consequential single document in the CMMC compliance program. Every control in every other domain is described in the SSP, and the assessment evidence for each control is traceable through the SSP. The SSP must describe the contractor's actual environment, not a generic template environment, and it must be periodically updated as the environment evolves. Findings against the SSP typically involve generic template language that does not reflect the actual implementation, missing descriptions for specific controls, undefined or incorrect system boundaries, or inconsistencies between the SSP and the environment an assessor examines.
View the CA.3.12.4 reference card →
Where Security Assessment Intersects with Other Domains
Security Assessment touches every other domain because the SSP describes them all, the self-assessment evaluates them all, and the continuous monitoring observes them all.
Risk Assessment is the closest partner to CA. RA establishes what risks the organization faces, and CA evaluates whether the controls that address those risks are effective. The two domains operate as a cycle where RA prioritizes and CA verifies. When the two are misaligned, the self-assessment evaluates controls against the wrong risk picture, and the findings do not support the actual security posture.
Configuration Management provides the baselines that the SSP references and that continuous monitoring compares against. A strong CM program produces the baseline documentation that makes the SSP accurate, and a weak CM program produces an SSP that describes configurations the systems do not actually have.
Audit and Accountability supports CA through the evidence it produces. Audit records are a primary input to both 3.12.1 assessments and 3.12.3 continuous monitoring. The quality of the audit program directly affects how effectively CA can evaluate the rest of the framework.
System and Information Integrity provides technical capabilities that support continuous monitoring under 3.12.3. SI monitoring activity feeds directly into CA observation of control effectiveness, and the two domains operate together to detect drift from documented baselines.
Access Control, Identification and Authentication, and the rest of the framework all depend on CA to describe them accurately in the SSP. Every control's evidence for assessment traces through the SSP, and every finding that CA identifies affects a specific control in a specific other domain.
Common Implementation Pitfalls
Several patterns come up repeatedly in Security Assessment readiness work.
SSP based on a generic template with minimal tailoring. The organization produces an SSP from a vendor template, GRC tool output, or a predecessor document, and the resulting SSP describes a generic environment rather than the contractor's actual systems. The 3.12.4 requirement calls for the SSP to describe the specific environment, and template language that could apply to any contractor does not satisfy that standard. The GAO gap analysis identifies this as one of the most common and consequential SSP deficiencies.
SSP that describes the ideal state rather than the actual state. The SSP describes controls as fully implemented when they are partially implemented or in progress. The contractor posts the corresponding SPRS score based on the SSP description, and the posted score does not match what a subsequent assessment will find. Beyond the compliance failure, this creates False Claims Act exposure because the SPRS affirmation is a contractual representation that the posted score accurately reflects the security posture.
Self-assessment as a rubber stamp. The 3.12.1 assessment is conducted as a documentation exercise rather than as a substantive evaluation. Every control is marked as implemented based on the SSP description, without verification against actual system state. When a subsequent C3PAO assessment reveals significant gaps, the divergence between the self-assessment score and the C3PAO finding creates both a compliance problem and an SPRS integrity concern.
POA&M with perpetual open items. The POA&M identifies deficiencies and assigns timelines, but items stay open well past their scheduled completion dates without updated timelines or closure records. Over time, the POA&M becomes a list of unresolved issues rather than a record of active remediation. The 3.12.2 standard expects the POA&M to reflect actual remediation work with current status.
Continuous monitoring reduced to quarterly reviews. The program treats 3.12.3 continuous monitoring as equivalent to the 3.12.1 periodic assessment performed more frequently. The two controls address different concerns. Assessment evaluates control effectiveness at a point in time. Continuous monitoring maintains awareness between assessments. Quarterly reviews are not continuous, and they miss the drift that continuous monitoring is designed to catch.
SSP that does not reflect changes to the environment. The SSP was developed carefully at program initiation, and then the environment changed. New systems were added, cloud services were adopted, boundary changes occurred, and the SSP was not updated to reflect any of it. The 3.12.4 periodic update requirement is not satisfied by an SSP that describes an environment from two years ago, and findings against the current environment will reveal the gap.
Missing or incorrect system boundary documentation. The SSP describes the security controls but does not clearly define the boundary of the system they protect. Without a clear boundary, the scope of assessment is ambiguous, and assessors cannot determine whether a finding applies to an in-scope system. Boundary documentation, including network diagrams and data flow descriptions, is part of the 3.12.4 obligation that is frequently underbuilt.
Where to Start
For an organization new to the CA domain, the first work is the SSP.
The foundational deliverable is an accurate System Security Plan that describes the contractor's actual environment. The SSP should name the specific systems in the CMMC assessment boundary, describe the data flows and external connections, and document how each of the 110 NIST SP 800-171 requirements is implemented in the specific environment. A template is an acceptable starting point only when it is then tailored to the contractor's reality rather than accepted as-is. A sample SSP and POA&M package is available in the resource library for educational reference on structure, depth, and the level of specificity that assessment documentation requires.
The second deliverable is the initial self-assessment: a substantive evaluation of each control against the environment described in the SSP, with documented findings that identify implementation status and any deficiencies. The self-assessment outputs feed the initial POA&M and the initial SPRS score. The assessment must be rigorous enough that a subsequent C3PAO evaluation will not produce substantially different findings.
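The rollup from self-assessment findings to an SPRS score follows the NIST SP 800-171 DoD Assessment Methodology: start at 110 and deduct each unmet requirement's assigned weight of 1, 3, or 5 points. A hedged sketch; the findings below are invented, and real weights (including the partial-credit special cases) come from the methodology's scoring annex:

```python
MAX_SCORE = 110  # perfect score under the DoD Assessment Methodology

# requirement -> (weight, implemented?) -- illustrative data, not real findings
findings = {
    "3.1.1": (5, True),
    "3.4.1": (5, False),    # e.g. baseline configurations not established
    "3.13.16": (1, False),  # e.g. CUI-at-rest protection gap
}

def sprs_score(findings: dict) -> int:
    """Deduct the weight of every unmet requirement from the 110 maximum."""
    deductions = sum(weight for weight, implemented in findings.values() if not implemented)
    return MAX_SCORE - deductions

print(sprs_score(findings))
# prints 104  (110 - 5 - 1)
```

Because the deductions are weighted, a handful of unmet high-weight requirements is enough to produce the depressed median scores the GAO data describes.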
The third deliverable is the POA&M and the continuous monitoring framework. The POA&M captures the deficiencies identified during self-assessment and tracks remediation work. The continuous monitoring framework describes how the organization will observe control effectiveness between assessments, including the technical mechanisms and the review cadences that support ongoing awareness.
With the SSP, self-assessment, POA&M, and continuous monitoring framework in place, the CA domain becomes an operational discipline that accumulates evidence over time. The controls are few, but they produce the artifacts that shape how every other domain is evaluated.