richfab opened 8 months ago
Currently, the JSON Schema does not raise an error when fields that are arrays of IDs contain duplicate IDs.
Even though the GBFS spec does not say anything about this, it is common practice that identifiers in an array MUST be unique.
A rule validating the uniqueness of values in arrays of IDs can be added to the JSON schemas using the `uniqueItems` keyword.
List of fields for which this rule applies:
- `vehicle_types.json#vehicle_types.pricing_plan_ids`
- `station_status.json#stations.vehicle_docks_available.vehicle_type_ids`
- `system_alerts.json#alerts.station_ids`
- `system_alerts.json#alerts.region_ids`
- `geofencing_zones.json#geofencing_zones.features.properties.rules.vehicle_type_ids`
- `geofencing_zones.json#geofencing_zones.global_rules.vehicle_type_ids`
- `station_information.json#stations.vehicle_types_capacity.vehicle_type_ids`
- `station_information.json#stations.vehicle_docks_capacity.vehicle_type_ids`
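As a sketch, the rule could look like this for the first field in the list; the fragment below is a simplified illustration modeled on `vehicle_types.json#vehicle_types.pricing_plan_ids`, not the actual published GBFS schema file:

```json
{
  "pricing_plan_ids": {
    "description": "Array of pricing plan IDs (structure simplified for illustration)",
    "type": "array",
    "items": { "type": "string" },
    "uniqueItems": true
  }
}
```

With `"uniqueItems": true`, a conforming validator would reject an array such as `["plan_1", "plan_1"]` while still accepting `["plan_1", "plan_2"]`, with no other change to the field's definition.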
Reopening this issue as more thought is needed, since this validation rule is a best practice and not explicitly written in the spec.
Problem
Currently, the JSON Schema does not raise an error when fields that are arrays of IDs contain duplicate IDs.
Even though the GBFS spec does not say anything about this, it is common practice that identifiers in an array MUST be unique.
Solution
A rule validating the uniqueness of values in arrays of IDs can be added to the JSON schemas using the `uniqueItems` keyword.
List of fields for which this rule applies:
- `vehicle_types.json#vehicle_types.pricing_plan_ids`
- `station_status.json#stations.vehicle_docks_available.vehicle_type_ids`
- `system_alerts.json#alerts.station_ids`
- `system_alerts.json#alerts.region_ids`
- `geofencing_zones.json#geofencing_zones.features.properties.rules.vehicle_type_ids`
- `geofencing_zones.json#geofencing_zones.global_rules.vehicle_type_ids`
- `station_information.json#stations.vehicle_types_capacity.vehicle_type_ids`
- `station_information.json#stations.vehicle_docks_capacity.vehicle_type_ids`