decentralized-identity / presentation-exchange

Specification that codifies an inter-related pair of data formats for defining proof presentations (Presentation Definition) and subsequent proof submissions (Presentation Submission)
https://identity.foundation/presentation-exchange
Apache License 2.0

Seeking clarification on `path` and `filter` usage to avoid duplicative validation logic #460

Closed mistermoe closed 11 months ago

mistermoe commented 1 year ago

I'm seeking clarification on the intended use of `path` and `filter` within field constraints in the PE v2 specification. While implementing evaluation logic, I noticed that these two properties can lead to similar outcomes but require different implementation approaches. For example:

{
  "path": [
    "$.type[*]"
  ],
  "filter": {
    "type": "string",
    "pattern": "^StreetCredential$"
  }
}

^ requires the evaluation logic to apply the filter to each array element individually, whereas
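To make the per-element approach concrete, here is a minimal sketch, not taken from the spec or any conformant implementation. The JSONPath selection for `$.type[*]` is hand-rolled for this one path, the filter check covers only the `string` + `pattern` subset of JSON Schema, and the "satisfied if at least one selected value validates" reading is an assumption about the intended semantics:

```python
import re

# Hypothetical credential under evaluation (illustrative data only).
credential = {"type": ["VerifiableCredential", "StreetCredential"]}

# Stand-in for a JSONPath engine: "$.type[*]" selects each element
# of the "type" array as a separate candidate value.
candidates = credential["type"]

# The filter from the first field constraint above.
filter_schema = {"type": "string", "pattern": "^StreetCredential$"}

def matches_string_filter(value, schema):
    """Minimal check for a {"type": "string", "pattern": ...} filter."""
    return isinstance(value, str) and re.search(schema["pattern"], value) is not None

# Assumed reading: the field constraint is satisfied if at least one
# selected value validates against the filter -- note the loop lives
# in the evaluation logic, outside the JSON Schema machinery.
satisfied = any(matches_string_filter(v, filter_schema) for v in candidates)
print(satisfied)  # True
```

The point of the sketch is the `any(...)` loop: the evaluator itself iterates over candidates and re-implements matching that JSON Schema could otherwise do.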

{
  "path": [
    "$.type"
  ],
  "filter": {
    "type": "array",
    "contains": {
      "type": "string",
      "const": "StreetCredential"
    }
  }
}

^ relies entirely on JSON Schema to validate the value of `$.type`.
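For contrast, a sketch of the second approach, again hand-rolled and hypothetical: `$.type` selects the whole array as a single value, and one schema evaluation handles everything. The `validates` helper implements only the tiny subset of JSON Schema the example needs (`array` + `contains`, `string` + `const`); a real implementation would hand the value to a full JSON Schema library instead:

```python
credential = {"type": ["VerifiableCredential", "StreetCredential"]}

# "$.type" selects the entire array as one value.
value = credential["type"]

filter_schema = {
    "type": "array",
    "contains": {"type": "string", "const": "StreetCredential"},
}

def validates(value, schema):
    """Minimal subset of JSON Schema: array/contains and string/const."""
    if schema.get("type") == "array":
        if not isinstance(value, list):
            return False
        contains = schema.get("contains")
        if contains is not None:
            # "contains" passes if any element validates against its subschema.
            return any(validates(item, contains) for item in value)
        return True
    if schema.get("type") == "string":
        if not isinstance(value, str):
            return False
        return value == schema["const"] if "const" in schema else True
    return False

# A single validation call against the whole array; no per-element
# logic in the evaluator itself.
print(validates(value, filter_schema))  # True
```

Here the iteration over array elements happens inside the schema engine (via `contains`), not in the presentation-exchange evaluation code.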

Implementing support for the first field constraint seems to complicate evaluation logic, essentially duplicating validation functionality already provided by JSON Schema. Am I overlooking something, or could this be an area where the specification could provide clarity as a means to streamline implementation?

Maybe something along the lines of

“Implementers are REQUIRED to defer the evaluation and validation of the value specified by path entirely to the filter mechanism. This approach ensures that the filter provides comprehensive validation as per JSON Schema, thereby streamlining the implementation process by eliminating the need for any additional, potentially duplicative, validation logic.”

For example, rather than applying individual validation checks to each element within an array specified by a path, the filter would perform a singular validation against the entire array, consistent with JSON Schema validation principles.

Would love to get insights from others and discuss potential adjustments to the spec that could help avoid this kind of duplicative validation logic.

decentralgabe commented 1 year ago

I can see how this would lead to confusion. I would be in support of an amendment that says something like the following (just changing REQUIRED to RECOMMENDED):

“It is RECOMMENDED that implementers defer the evaluation and validation of the value specified by path entirely to the filter mechanism. This approach ensures that the filter provides comprehensive validation as per JSON Schema, thereby streamlining the implementation process by eliminating the need for any additional, potentially duplicative, validation logic.”

For example, rather than applying individual validation checks to each element within an array specified by a path, the filter would perform a singular validation against the entire array, consistent with JSON Schema validation principles.

mistermoe commented 1 year ago

that makes sense @decentralgabe

rado0x54 commented 11 months ago

The core problem will be addressed by moving from JsonPath to JsonPointer in version 3.

Currently, this is an implementation preference that is completely under the control of the verifier and therefore does not require a normative statement in the spec. We will consider it as part of an implementation guide.