teto opened 1 year ago
Of note, this also happens if you have a constructor with two fields of the same type, e.g.
data Foo = Bar | ... | Wrong A A
because it generates
value:
  items:
    - $ref: '#/components/schemas/Day'
    - $ref: '#/components/schemas/Day'
  maxItems: 2
  minItems: 2
  type: array
instead of
value:
  items:
    $ref: '#/components/schemas/Day'
  maxItems: 2
  minItems: 2
  type: array
There is a comment in the source regarding this:
Warning: OpenAPI 3.0 does not support tuple arrays. However, OpenAPI 3.1 will, as it will incorporate Json Schema mostly verbatim.
As an example:
ghci> :m Data.OpenApi Data.OpenApi.Schema Data.Aeson.Encode.Pretty Data.Proxy Data.ByteString.Lazy.Char8
ghci> Data.ByteString.Lazy.Char8.putStrLn $ encodePretty $ toSchema $ Proxy @(Int, Bool)
{
  "items": [
    {
      "maximum": 9223372036854775807,
      "minimum": -9223372036854775808,
      "type": "integer"
    },
    {
      "type": "boolean"
    }
  ],
  "maxItems": 2,
  "minItems": 2,
  "type": "array"
}
This is valid per draft-07 JSON Schema - I've checked here.
However, the latest OpenAPI references the 2020-12 draft: https://spec.openapis.org/oas/latest.html#schema-object. And per the latter, prefixItems should be used instead.
Could you check whether the schema is accepted (and validation works correctly in both negative and positive scenarios) if the problematic items is replaced with prefixItems?
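For reference, here is what the (Int, Bool) schema above might look like in the 2020-12 / OpenAPI 3.1 form with prefixItems - a sketch based on my reading of the draft, not output from this library:

```json
{
  "prefixItems": [
    {
      "maximum": 9223372036854775807,
      "minimum": -9223372036854775808,
      "type": "integer"
    },
    { "type": "boolean" }
  ],
  "maxItems": 2,
  "minItems": 2,
  "type": "array"
}
```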
If yes, I believe one needs to bring these lines in sync with the JSON Schema spec (note that items, prefixItems, and unevaluatedItems interact). Another question is which version of OpenAPI this library targets.
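To illustrate that interaction: in 2020-12, prefixItems validates the first N positions, items then applies only to elements beyond the prefix (so "items": false forbids extras), and unevaluatedItems catches whatever neither keyword evaluated. A strict 2-tuple could be sketched as follows (again my reading of the draft, untested against OpenAPI tooling):

```json
{
  "type": "array",
  "prefixItems": [
    { "type": "integer" },
    { "type": "boolean" }
  ],
  "items": false,
  "minItems": 2,
  "maxItems": 2
}
```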
I have OpenAPI generated for my types, and Swagger Editor doesn't like having an array in "items": according to the spec, items must contain a single object. This is the message I got:
Structural error at components.schemas.AggregateMRQuery.properties.select.items.items should be object
I am ready to prepare a PR to fix this but would like some guidance. Tuples in general seem not to be a good fit for OpenAPI specs, so what should the API do? Should it error out when dealing with tuples, or emit a warning? I wish the API would tell me when it generates a possibly invalid spec instead of my discovering it via a spec validator.