We have a linter/validator for various parts of the spec, and should add third-party validators like this one to https://github.com/opensearch-project/opensearch-api-specification/blob/main/.github/workflows/validate-spec.yml.
I added a Python and Ruby workflow that uses various libraries to validate the spec in https://github.com/opensearch-project/opensearch-api-specification/pull/646 and got ... thousands of errors from the Ruby one ;) Before I go and attempt to fix some of them, I would love to hear from @nhtruong about what we should be doing here.
Yeah, I was seeing a large number of errors when using this kind of spec validator. They all look legit according to the OpenAPI 3.1.0 documentation. I would like to hear what we plan to do as well. :)
The files in the `/spec` folder are not typical OpenAPI spec files due to the way the files and components are divided (the `$ref` objects are not all in a root file like you will see in other OpenAPI specs), and many OpenAPI parsers can't handle this. The merged spec file, however, should be able to pass OpenAPI validation.
On the specific errors:

- `description` is required: the old spec had these as `''`. We can make the merger insert these blank descriptions, or simply add `OK response` for 2XX responses and `Not found` for 404.
- Component names must match `^[a-zA-Z0-9._-]+$`: this is a harder problem to resolve because the naming pattern we use identifies the responses of operation groups. Since the operation group name itself already matches `^[a-zA-Z0-9._-]+$`, we use `@`, as in `indices.create@200`, to differentiate the response codes of the same operation group. (The client code generators also rely on this to tell the responses apart.) We can change it to `indices.create.200`, at the cost of making the operation group and the response code slightly less visually distinguishable.
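For illustration, a quick check of the two naming styles against that pattern (a hypothetical snippet, not part of the spec tooling):

```python
import re

# Component-name pattern required by OpenAPI 3.1
name_pattern = re.compile(r"^[a-zA-Z0-9._-]+$")

print(bool(name_pattern.fullmatch("indices.create@200")))  # False: '@' is not allowed
print(bool(name_pattern.fullmatch("indices.create.200")))  # True: '.' is allowed
```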
One new issue I found is:
For example, for `/_bulk` we specify the `requestBodies` schema for `bulk` as an array, which conflicts with the `application/x-ndjson` format, which uses newline-delimited JSON rather than a JSON array. Whichever schema is used for `application/x-ndjson` should be specified as a string.
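To illustrate why an array schema is a mismatch, here is a hypothetical sketch (the index name and documents are made up) of what an `application/x-ndjson` bulk payload actually looks like:

```python
import json

# Hypothetical bulk actions; the index name and documents are illustrative only.
actions = [
    {"index": {"_index": "movies", "_id": "1"}},
    {"title": "Moneyball", "year": 2011},
]

# application/x-ndjson: one JSON document per line, newline-terminated.
ndjson_body = "\n".join(json.dumps(action) for action in actions) + "\n"

# The payload as a whole is not valid JSON (json.loads would raise "Extra data"),
# so a JSON-array schema cannot describe it, while a string schema can.
```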
@zelinh I think this is a separate problem; want to extract this into a new issue?
I have validation passing in https://github.com/opensearch-project/opensearch-api-specification/pull/646.
Just created a new issue for this finding: https://github.com/opensearch-project/opensearch-api-specification/issues/656. Thanks for the fix on validation.
What is the bug?
I tried to load the OpenSearch API spec YAML into the Python package openapi-spec-validator. However, when running the spec validation with the Python client, it fails on several aspects:
- `description` is required: https://spec.openapis.org/oas/v3.1.0#fixed-fields-14
- Component names must match the pattern `'^[a-zA-Z0-9._-]+$'`
How can one reproduce the bug?
1. Download the opensearch-api-spec YAML: https://github.com/opensearch-project/opensearch-api-specification/releases/download/main-latest/opensearch-openapi.yaml
2. Run this Python script against it (see the sketch after this list).
3. You will see the errors.
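A minimal reproduction sketch, assuming the pre-0.7 `openapi-spec-validator` API (`validate_spec`) and PyYAML; the exact script used in the original report may have differed:

```python
import yaml  # PyYAML
from openapi_spec_validator import validate_spec

# Load the downloaded merged spec file.
with open("opensearch-openapi.yaml") as f:
    spec = yaml.safe_load(f)

# Raises OpenAPIValidationError on the first failing rule, e.g. a response
# object that is missing its required `description`.
validate_spec(spec)
```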
After manually updating the component descriptions, we see another error about the component schema names: names such as `info@200` and `_common:Name` don't match the pattern.

What is the expected behavior?
The API spec YAML should pass validation by an OpenAPI 3.1 spec validator.
What is your host/environment?
macOS, Python 3.9.13