Closed aj-stein-gsa closed 2 weeks ago
@DimitriZhurkin please review the checklist tomorrow and be ready to confirm if this issue, now that PRs are closed, is ready to ship.
Technically, unit tests exist only for SSP. Work on SAP, SAR, and POA&M hasn't started yet.
@aj-stein-gsa and @DimitriZhurkin the constraint itself is the same (https://github.com/GSA/fedramp-automation/pull/875/files#diff-d8b799dfbfbf5f5ac64efe9def7e0cf21d38ded8ebc763b4c266aaa59060f63e) for all the models, but as Dimitri noted, we'd have to update the unit tests and create additional sample test files for SAP, SAR, and POA&M. The unit test file would look something like this for marking-PASS.yaml:
```yaml
# Driver for the valid marking constraint unit test.
test-case:
  name: The valid marking constraint unit test.
  description: Test that the SSP, SAP, SAR, and POA&M metadata contains the "marking" property.
  content:
    - ../content/ssp-all-VALID.xml
    - ../content/sap-all-VALID.xml
    - ../content/sar-all-VALID.xml
    - ../content/poam-all-VALID.xml
  expectations:
    - constraint-id: marking
      result: pass
```
And we'd have to do something similar for marking-FAIL.yaml. Should we do that now, or scope this issue to just the SSP, which is already done, and create separate issues for adding unit test vectors for the SAP, SAR, and POA&M?
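For reference, the marking-FAIL.yaml driver might look like the following sketch. The `../content/*-INVALID.xml` fixture paths are assumptions by analogy with the VALID files above, not files that necessarily exist yet:

```yaml
# Driver for the invalid marking constraint unit test (sketch).
test-case:
  name: The invalid marking constraint unit test.
  description: Test that documents missing the "marking" property fail the constraint.
  content:
    - ../content/ssp-all-INVALID.xml   # assumed fixture name
    - ../content/sap-all-INVALID.xml   # assumed fixture name
    - ../content/sar-all-INVALID.xml   # assumed fixture name
    - ../content/poam-all-INVALID.xml  # assumed fixture name
  expectations:
    - constraint-id: marking
      result: fail
```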
If we consider the scope of work to be everything but the constraint @test logic (so docs and examples), would that make the work effort clearer and faster? I don't have a strong opinion on this one, so I can just pick if others do not. 😄
Per conversation in a backlog meeting at the time of this writing: for low-risk cross-model //metadata constraints, we can accept not explicitly adding per-model tests when the requirement is also covered in generic documentation for cross-model requirements. I am happy to accept the included tests and examples as-is, and the work is ready to release.
Constraint Task
As a digital authorization package maintainer, I want to know that every document in my package has the correct data classification selected, or receive a passback error otherwise, so that I meet FedRAMP requirements, avoid passbacks, and properly identify data security requirements.
Intended Outcome
Goal
Identify package documents that require a CUI marking (so all but catalogs and profiles; those are inherently not controlled and go everywhere)
Syntax
Evaluate the metapath
/(assessment-plan|assessment-results|plan-of-action-and-milestones|system-security-plan)/metadata/prop[@name="marking"]/@value
and return a message with @level="ERROR" for any document that does not have prop[@name="marking"]/@value
equal to cui. Check
./(assessment-plan|assessment-results|plan-of-action-and-milestones|system-security-plan)/metadata/prop[@name="marking"]/@value
for an allowed value of cui. This enumeration will allow undefined values (@allow-other="yes"); see allowed-values guidance below.
VALID:
INVALID:
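To make the behavior described above concrete, here is a sketch of what a Metaschema allowed-values constraint for this requirement could look like. The context placement, constraint id, and enum description are illustrative assumptions, not the shipped constraint definition:

```xml
<!-- Sketch only: placement within the FedRAMP external constraints file is assumed. -->
<context>
  <metapath target="/(assessment-plan|assessment-results|plan-of-action-and-milestones|system-security-plan)/metadata"/>
  <constraints>
    <!-- @allow-other="yes" permits values beyond the enumerated "cui". -->
    <allowed-values id="marking" target="prop[@name='marking']/@value" allow-other="yes" level="ERROR">
      <enum value="cui">The document is marked as Controlled Unclassified Information.</enum>
    </allowed-values>
  </constraints>
</context>
```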
Syntax Type
This is optional core OSCAL syntax.
Allowed Values
There are only NIST-defined allowed values.
Metapath(s) to Content
Purpose of the OSCAL Content
Dependencies
N/A
Acceptance Criteria
oscal-cli metaschema metapath eval -e "expression".
Other information
Part of epic #804.