paulprogrammer opened 8 years ago
I very much like RAML's use of traits for this. Let's look there for design inspiration.
It is concise, but with reasonably complex specs you'll be chasing around mysterious sources of info. I personally prefer clarity over brevity.
@fehguy can you expand on that concern? I'm not sure I grasp it right away.
BTW: Love the Fry's logo comparison there. Makes me giggle every time.
I agree with @fehguy. A big advantage of OpenAPI is the ecosystem of tools/libs around it: https://github.com/swagger-api/swagger.io/blob/wordpress/integrations/open-source.md I personally wrote a lot of OpenAPI-related tools and even a RAML to OpenAPI converter, so from this side the situation looks really different.
Today I can do something useful with an OpenAPI spec in less than 100 lines of JS or any other modern language. On the other hand, to work with RAML you need a special library which goes as far as providing its own AST: https://github.com/raml-org/raml-js-parser A majority of this is related to supporting the complicated traits mechanism. It's not a problem for MuleSoft (the owners of RAML) because they build one set of enterprise tools, but the ecosystem around OpenAPI consists of many small tools written by independent developers.
I'm not against reusing some things from RAML or any other specification. Instead, let's agree that OpenAPI is a two-sided project: API owners and library/tool authors. So if the new version of OpenAPI supports features that are useful but complicated to implement, it could become a very popular format but with very few tools supporting it.
Well, for my part, the spec today is too simplistic. It works GREAT for weather APIs and simple shopping apps, but at large business scale, where there are multiple constituents, the concepts fall apart.
I need to be able to define common elements that are referenced in other parts of a definition so that I can separate the concerns of channel analytics, security, and multiple application groups. These items need to be separate from the API definitions because they change, and I would hate to have to update some `x-channel` header everywhere it is expressed because there's no way to merge two lists into a single output.
DRY concepts like traits and/or mixins let me make the most of modularity, which gives me consistency in object definitions and separation of concerns that I do not have in OAS today without a very anally-retentive manual process.
Does it need to be part of the spec? Not really -- it could be a preprocess tool, but I would hate for there to be different flavors of OAS-like descriptions. That too does not help the ecosystem.
If these preprocess tools produce specifications which are valid according to OpenAPI, I don't see any problem with it. On the contrary, it will allow people to experiment with different approaches without compromising the OpenAPI spec.
As an example of this approach, my customer asked me if I could implement a RAML-like directory structure for OpenAPI, so I created an open-source tool to support it. But I'm not sure that this approach is the best one, and I don't want to push it into the core spec.
I think we need to learn from the SOAP and CORBA failures and keep the core spec simple.
IMHO, for me the OpenAPI spec is more a lingua franca of REST than one tool to solve all problems with designing APIs.
@paulprogrammer we do this, because you basically want to share blocks of yaml, right? This is why we chose yaml in the first place. I don't think a definition in the spec is strictly necessary:
```yaml
x-ResourceCommon: &ResourceCommon
  uuid:
    type: string
    description: uuid of the resource
  name:
    type: string
    description: name of the resource

paths:
  '/blah/{id}':
    put:
      tags:
        - blah
      responses:
        '200': &RES_STATE
          description: request succeeded
          headers: &RESP_HEADERS
            X-Request-Id:
              description: the request id
              type: string
          schema:
            $ref: '#/definitions/ResourceState'
        '404': &RESP_ERROR
          description: resource not found
          headers: *RESP_HEADERS
          schema:
            $ref: '#/definitions/Error'
        '428': *RESP_ERROR
        default: *RESP_ERROR

definitions:
  ResourceGateway:
    description: An Internet Gateway resource type
    allOf:
      - $ref: '#/definitions/Resource'
      - type: object
        properties:
          <<: *ResourceCommon
          id:
            type: string
            description: id of the InetGateway
          vpcId:
            type: string
            description: id of VPC
```
Is there a JSON equivalent of this YAML << construct? I thought OAS allowed JSON or YAML, but that would imply a subset of YAML - a lowest common denominator.
We are currently developing `x-sas-traits` annotations and a preprocessor (which consumes either YAML or JSON) to inject reusable elements/traits. This allows more human-friendly and less repetitive/verbose spec files during development. We generate full Swagger 2.0 from there to use with the other tools. For example, we can use this to associate media types with schema definitions, to indicate that there is a HEAD operation for this GET (removing the otherwise redundant spec of including a HEAD operation), or to indicate that this PUT is a Conditional PUT (which injects the standard If-Match/If-Unmodified-Since request headers and 412/428 response codes). Ditto for 401/403 response codes for authenticated operations, etc.
The effect is API descriptions which are much easier to understand because they are not nearly as verbose at the time developers and others are most concerned with understanding them. It also helps us ensure everyone does Conditional PUT the same way, etc.
Others have asserted that the tools should provide these capabilities, but to be honest I've looked at swagger-editor and IMHO it is not structured or implemented in an extensible way that would allow such additions.
Also, simply adding tooling to generate content or inject additional headers will not preserve the abstraction that you can express with a 'pageable' trait or a 'conditional-put' trait.
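For illustration only, here is a minimal sketch of what such a trait annotation and its expansion could look like; the `x-sas-traits` value and the `conditional-put` trait name are assumptions, not the commenter's actual format:

```yaml
# Hypothetical authoring format: the trait name is declarative shorthand
paths:
  '/widgets/{id}':
    put:
      x-sas-traits: [conditional-put]   # assumed trait reference
      responses:
        '200':
          description: updated

# What a preprocessor could expand it to (plain Swagger 2.0):
#   put:
#     parameters:
#       - name: If-Match
#         in: header
#         type: string
#         required: true
#     responses:
#       '200':
#         description: updated
#       '412':
#         description: precondition failed
#       '428':
#         description: precondition required
```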
> I thought OAS allowed JSON or YAML, but that would imply a subset of YAML - a lowest common denominator.
YAML has way more features than JSON has, for example, comments. But that doesn't mean you can't add comments to OpenAPI files. Instead, Swagger files should be convertible to the lowest common denominator, which is JSON.
> will not preserve the abstraction
It's the same situation with comments in YAML: they are very useful but completely stripped off when you convert to JSON. And it doesn't mean that we should invent some `comment` keyword in Swagger.
Disclaimer: I don't know your use case; this is just a general warning about using vendor extensions in OpenAPI files.
> We are currently developing `x-sas-traits` annotations and a preprocessor (which consumes either YAML or JSON) to inject reusable elements/traits.
The most important thing is not to treat it as an extension to OpenAPI but as a format which is convertible to OpenAPI. Please don't add the `swagger: 2.0` keyword to such files and don't advertise them as OpenAPI files. Doing that confuses your developers: when they try to use the unprocessed spec with some 3rd-party tool, they will be surprised that nothing works.
The big thing about vendor extensions is that they were designed to add additional data to a spec, not to support high-level concepts, as is clear from their description in the spec itself:
> While the Swagger Specification tries to accommodate most use cases, additional data can be added to extend the specification at certain points.
For example, in my case, I add a logo URL to each OpenAPI spec in my catalog, based on this extension. I do this because I don't have any other way to put it inside the spec. So it's fair to all clients of my spec: without this extension they have no way to access the logo URL at all.
But at the same time, I don't try to push a `logo` keyword into the OpenAPI core specification, because I understand that not everybody needs or cares about such data.
When I add some high-level concept to an OpenAPI file and advertise it as such, I'm pushing the work of supporting such an extension onto 3rd-party libs/tools.
"additional data can be added to extend the specification at certain points" does not mean (to me) "only add additional data". It means exactly what it says, "extend the specification" (for example to allow expressing valid RESTful API abstractions and constructs that the specification (previously, Swagger 2.0, which has serious limitations) does not support. That is what "open" and "extension" means (or should mean), after all, as in the Open/Closed principle.
I'm not trying to say the spec support specific x- vendor extension, but rather that we allow certain traits and other constructs which make OAS easier for developers to use as a spec, such as code and document generation and validation. Many of them can be reduced (via tooling) to a simpler version of the spec. In many cases, traits can be inlined for example, just as the << construct in YAML, so that tools can be coded to a simpler specification. That might even be a guideline for what kinds of traits we want to add: only those that are reducible with a fixed point process.
> It means exactly what it says, "extend the specification" (for example, to allow expressing valid RESTful API abstractions and constructs that the specification, previously Swagger 2.0 with its serious limitations, does not support).
Fully agree with you :+1: If it's something that you can't express in the latest version, it's a totally valid case. Having an API spec with a vendor extension is always better than no spec at all. As an example, I need to support OAuth 1.0 and I can't do this in the current spec, so I will create a vendor extension to do it.
But supporting a vendor extension for something like traits or mixins is a completely different thing.
It means you CAN describe this API in pure Swagger 2.0 but decide not to do so.
What happens at that point: as a developer, I see your spec and try to use it in some 3rd-party lib/tool. I don't see most of the common parameters/responses, so I run a validator against the spec. Validation passes, so I open an issue saying your lib/tool is broken and doesn't work with a perfectly valid spec.
As a result, OpenAPI goes down exactly the same slippery slope as CORBA, WSDL, WADL, etc. But it can easily be avoided: just mark your high-level spec as a different format so it doesn't cause any confusion, and always provide an OpenAPI specification alongside it.
@DavidBiesack the way I see it: yaml is for humans, json is for machines. We write the specs in yaml but serve them as json and that's totally fine.
@casualjim I agree wrt YAML/JSON. I'm bilexical myself, though not fluent in YAML. But we need to be careful about using constructs such as << in YAML that have no equivalent in JSON. (I don't even know what << means; I can't find it in the 1.2 spec, for example.)
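For context, `<<` is YAML's "merge key": it copies the entries of the referenced mapping into the current mapping. It comes from a YAML 1.1-era working draft rather than the 1.2 core spec, which is why it is hard to find there, and JSON has no counterpart, so parsers have to resolve the merge before a document can be serialized as JSON. A small sketch:

```yaml
# YAML source using an anchor plus the merge key
defaults: &defaults
  type: string
name:
  <<: *defaults        # merge key; no JSON equivalent
  description: name of the resource

# After the parser resolves the merge, the document is equivalent to:
# name:
#   type: string
#   description: name of the resource
```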
The `$merge` feature of my openapi-preprocessor seems to fit the need for traits/mixins for spec authors.
AsyncAPI, which is very similar to OpenAPI, has traits. Maybe we can draw inspiration from their implementation that relies on JSON Merge Patch:
> A list of traits to apply to the operation object. Traits MUST be merged into the operation object using the JSON Merge Patch algorithm in the same order they are defined here.
https://www.asyncapi.com/docs/specifications/2.0.0/#a-name-operationobject-a-operation-object
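As a rough, illustrative sketch (the `pageable` trait name and its contents below are assumptions, not taken from the AsyncAPI docs), an AsyncAPI 2.0 operation lists traits that are merged into it via JSON Merge Patch:

```yaml
components:
  operationTraits:
    pageable:                          # illustrative trait name
      bindings:
        http:
          type: request
          query:
            type: object
            properties:
              pageSize:
                type: integer
              pageNumber:
                type: integer
channels:
  foo:
    subscribe:
      operationId: listFoo
      traits:
        - $ref: '#/components/operationTraits/pageable'
      # The trait's fields are merged into this operation, in listed order,
      # using JSON Merge Patch.
```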
> The `$merge` feature of my openapi-preprocessor seems to fit the need for traits/mixins for spec authors.

Is this `$merge` option from openapi-preprocessor still the recommended approach for implementing RAML-like traits in OpenAPI? Or is there a built-in alternative to achieve the same effect?
I know this is a very old issue, but it's been moved to the Overlays Specification repository, and I think Overlays could be a useful tool to meet the traits and/or mixins case. Check out the main spec and examples if you haven't seen the Overlays spec before, and perhaps we can share some examples to look at how these features could be implemented with Overlays as it is today.
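As one possible sketch of that idea (the target path and parameter names are illustrative, and this assumes the Overlay Specification's `actions`/`target`/`update` structure), a "pageable" trait could be applied to an operation like this:

```yaml
overlay: 1.0.0
info:
  title: Add pagination parameters to list operations
  version: 1.0.0
actions:
  - target: "$.paths['/foo'].get"      # JSONPath to the operation to modify
    update:
      parameters:
        - name: pageSize               # illustrative parameter names
          in: query
          schema:
            type: integer
        - name: pageNumber
          in: query
          schema:
            type: integer
```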
I would like to be able to add something like traits or mixins to operations. For instance, if I have a `GET /foo` that returns a list of foo objects, I could mark it as "pageable" (assuming I've defined a pageable trait), which would include the pageSize and pageNumber query parameters and force a "pagination" element in the payload object. This would allow us to use reuse concepts to provide a semantic view of the endpoint definitions.
This would result in a long-form definition as:
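A minimal sketch of what the expanded, trait-free definition could look like, assuming illustrative `pageSize`/`pageNumber` query parameters and a `pagination` property in the response (Swagger 2.0 style, to match the examples above):

```yaml
paths:
  /foo:
    get:
      parameters:
        - name: pageSize          # the trait would supply these parameters
          in: query
          type: integer
        - name: pageNumber
          in: query
          type: integer
      responses:
        '200':
          description: a page of foo objects
          schema:
            type: object
            properties:
              pagination:
                $ref: '#/definitions/Pagination'   # hypothetical shared definition
              items:
                type: array
                items:
                  $ref: '#/definitions/Foo'        # hypothetical foo schema
```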
This is obviously silly if you have only one pageable endpoint, but assuming you had many resources that were listable, you could define how pagination works across the entire suite in a DRY manner.
Open questions: I defined "is" for traits on an operation, and "has" for mixins on an object, but the resolution rules are the same. Not sure if this is really a good practice.
Other uses: Consider if there are commonly used headers -- you might have a mixin for the path object that contains headers that are merged with the list of headers defined by the implementation, and this would resolve to the operation's own headers plus the mixin's headers (a sketch of both forms follows).
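Under those assumptions, a minimal sketch; the `x-mixins` container, the `has` keyword, and the `CommonHeaders` name are hypothetical illustrations of the idea, with the resolved form shown in the trailing comments:

```yaml
# Hypothetical source: a header mixin applied to a path item
x-mixins:
  CommonHeaders:                  # hypothetical mixin name
    parameters:
      - name: X-Request-Id
        in: header
        type: string
paths:
  /foo:
    has: [CommonHeaders]          # hypothetical 'has' keyword from this proposal
    get:
      responses:
        '200':
          description: ok

# Resolved form: the mixin's headers are merged into the path item's parameters
# paths:
#   /foo:
#     parameters:
#       - name: X-Request-Id
#         in: header
#         type: string
#     get:
#       responses:
#         '200':
#           description: ok
```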