ISAITB / gitb

The Interoperability Test Bed is a service offered by the European Commission’s DIGIT for the conformance testing of IT systems. It is based on the GITB CEN Workshop Agreement and was originally forked from the GITB PoC software (available at https://github.com/srdc/gitb).
https://joinup.ec.europa.eu/solution/interoperability-test-bed/about

Question: comparison wrt Postman, Schemathesis, Pact.io #20

Closed: ioggstream closed this issue 1 year ago

ioggstream commented 2 years ago

It would be great to have a brief comparison wrt other technologies such as postman, schemathesis and pact.io.

costas80 commented 2 years ago

Thanks for the interest @ioggstream !

The Test Bed software is not really comparable to the tools you mention, which I see as being more developer-oriented. The Test Bed can sit as close to or as far from the technical implementation of systems as you want, and can be used to define test cases covering any kind of exchange between systems (possibly also involving multiple systems). In fact, the Test Bed is not a testing tool per se, but rather a full platform where both testers and "community" admins connect to respectively execute and monitor conformance tests.

For a better understanding of how the Test Bed works and how it can be used, I would invite you to check the documentation in Joinup, where you can also find a brief technical overview as well as a couple of selected user stories. From this point, you could check the developer onboarding guide, which goes over the main elements involved from a developer's perspective. You can also try out a couple of simple demos yourself on the Test Bed instance hosted by the European Commission (EC).

Regarding the instance I mention, note that this is an EC-themed instance used by (primarily) EC-driven projects, where typically different national implementations of specifications (involving APIs, message protocols, data, validation rules) are tested against common EU-wide specs. Anyone can however install the Test Bed via Docker, or even build/extend and use it from source via the current repo. In the near future we plan to include more advanced theming and localisation options to make its use more fitting also outside such EU-wide projects.

I hope this provides some extra info. If you need any more specific details let me know.

ioggstream commented 2 years ago

Hi @costas80

Thanks for your reply! In Italy we have a set of REST API guidelines and we provide a tool to check API design based on the OpenAPI spec.

https://italia.github.io/api-oas-checker

I was comparing it with our project and, if I understand correctly, they have different goals.

The OAS checker supports the design/review phase.

The ITB supports the implementation phase.

It is possible that ITB could implement features that integrate with OAS for further checks.

For validation of CSV I saw that ITB uses TableSchema: is this the frictionless data Table Schema or the w3c one? In Italy we are investigating an alternative to the w3c TableSchema, based on frictionless data and JSON Schema, to improve syntax constraints.

costas80 commented 2 years ago

Very interesting. I'm adding my comments on what you mention.

I was comparing it with our project and, if I understand correctly, they have different goals.

The OAS checker supports the design/review phase.

The ITB supports the implementation phase.

Indeed. The main way in which the Test Bed is used is to test the systems during/after their implementation by sending/receiving messages and then validating them against the overall specs and test-case-specific assertions. Having said this, one could define test cases to test the YAML or JSON OpenAPI representation of an API, although for this purpose it would be more appropriate to use a validator rather than the full Test Bed platform. Validators are simple to use and integrate with, whereas with the Test Bed you would need to define test cases, launch them, follow them up, etc. Clearly the Test Bed (the software built from this repo) would be overkill for this and is better suited to testing the eventual implementations.

For this purpose, what we offer alongside the full Test Bed software are validators for different kinds of syntaxes (which, as I understand, you are already aware of), where you define your validation rules and settings via configuration and then expose the validator through several channels (UI, SOAP API, REST API, command-line tool, etc.). For the specific case of verifying an OpenAPI definition, we could imagine an equivalent being an instance of our JSON validator configured with the base schema plus additional schemas capturing your guidelines for Italy. In fact, checking out your validator and the rulesets it offers brings to mind a couple of our hosted validators, such as the SHACL shape validator (used in the RDF world) that checks that SHACL validation rules are well-defined. This similarly offers options to test against the official spec, its extensions, best practices, etc.
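
To give a feel for the machine-to-machine usage, here is a rough sketch of what a REST call to such a validator could look like. The endpoint path and payload fields below are illustrative assumptions, not the documented API of our validators (each validator documents its own contract):

```typescript
// Illustrative sketch only: the endpoint and field names are assumptions,
// not the documented REST API of the hosted validators.
async function validateDefinition(apiDefinition: string): Promise<unknown> {
  const response = await fetch("https://validator.example.org/json/api/validate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contentToValidate: apiDefinition, // the JSON/YAML content to check
      validationType: "guidelines",     // hypothetical pre-configured validation type
    }),
  });
  if (!response.ok) {
    throw new Error(`Validation call failed with status ${response.status}`);
  }
  return response.json(); // validation report (findings with severities and locations)
}
```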

It is possible that ITB could implement features that integrate with OAS for further checks.

This is an interesting notion. Is OAS limited to Italian guidelines or can it e.g. accept the rules to test against via configuration or input? I'm wondering, for example, whether it's possible to use OAS for different countries and organisations, either verifying an API definition against the base specs or against organisation-specific extensions (as you already do for Italy).

One option you could find interesting is that the Test Bed (I'm referring again to the full conformance testing platform, not the validators) supports orchestration of external services via a standard SOAP API (the GITB validation service API for validators). Implementing this API for OAS would allow it to be called as a validation step within a potentially multi-step conformance test case. The API for validators is quite simple, essentially consisting of a validate method that receives any inputs you expect and returns a validation report as its output. All the validators we define (for XML, RDF, JSON and CSV) implement this, making it possible to use them standalone or to integrate them in conformance test scenarios. If you think such an addition to OAS would be meaningful, I'd be happy to share further info and discuss in more detail.
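
To make the shape of that API more concrete, here is a purely conceptual sketch of the contract expressed as TypeScript types. The real service is SOAP-based and defined by the GITB WSDLs, so the names below are illustrative rather than the actual generated types:

```typescript
// Conceptual sketch of the GITB validation service contract (illustrative names only;
// the actual API is a SOAP service defined by the GITB WSDLs).
interface ValidateRequest {
  sessionId?: string;              // optional test session identifier
  inputs: Record<string, string>;  // e.g. the OpenAPI definition and any rule references
}

interface ReportItem {
  level: "error" | "warning" | "info"; // severity of an individual finding
  description: string;                 // what was found
  location?: string;                   // where in the validated content
}

interface ValidateResponse {
  result: "SUCCESS" | "WARNING" | "FAILURE"; // overall verdict
  items: ReportItem[];                       // the detailed findings
}

// The single operation a validator needs to expose to be usable as a test step.
interface ValidationService {
  validate(request: ValidateRequest): Promise<ValidateResponse>;
}
```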

For validation of CSV I saw that ITB uses TableSchema: is this the frictionless data Table Schema or the w3c one?

It is the frictionless one (see details here). As you have an interest in (the frictionless) TableSchema, you might also find our TableSchema validator interesting, which is used to validate the JSON TableSchema definitions themselves.

ioggstream commented 2 years ago

This is an interesting notion. Is OAS limited to Italian guidelines or can it e.g. accept the rules to test against via configuration or input? I'm wondering, for example, whether it's possible to use OAS for different countries and organisations, either verifying an API definition against the base specs or against organisation-specific extensions (as you already do for Italy).

Our ruleset is composable and is used by other countries and companies. Currently we provide 4 rulesets:

  1. Standard Italian Ruleset https://italia.github.io/api-oas-checker/spectral.doc.html
  2. API Best Practices (without the Italian-specific rules), taken from the literature and the API community https://italia.github.io/api-oas-checker/spectral-generic.doc.html
  3. Extra Security Best Practices https://italia.github.io/api-oas-checker/spectral-security.doc.html
  4. Italian + Security Best Practices https://italia.github.io/api-oas-checker/spectral-full.doc.html

Organizations can source our ruleset and atomically disable specific rules and/or add new ones. In general, the rulesets are updated by gathering our experience from reviewing potential issues raised by implementers, and by consulting with the OAS and HTTP communities.

For validation of CSV I saw that ITB uses TableSchema: is this the frictionless data Table Schema or the w3c one?

It is the frictionless one (see details here). As you have an interest in (the frictionless) TableSchema, you might also find

Great! We are in touch with Frictionless and we are working to bridge DCAT and Frictionless (cc: @mfortini on it). Here you can see how we envision providing JSON-LD information through Frictionless Tableschema using this draft specification.

costas80 commented 2 years ago

Our ruleset is composable and is used by other countries and companies.

Interesting. Apart from the UI, Docker and CLI approaches which I saw you offer, do you also allow testing via e.g. REST API calls? It could be interesting for example to allow a user to call your validator and provide the definitions to validate as the payload. This could open up interesting integrations (of your instance and instances set up by other organisations).

Organizations can source our ruleset and atomically disable specific rules and/or add new ones.

So the approach to customise the rules is by forking/cloning the repo and adapting the rules. Do you think it would be interesting to also allow rules to be provided on-the-fly when using the validator (as e.g. additions to the "built-in" rules)? For example, we support this in our JSON validator, resulting in e.g. the TableSchema validator I linked to earlier, where the JSON Schema for TableSchema is built-in but a user can provide additional schemas with their own rules. Going further, we also provide a generic JSON validator with no pre-configured schemas, where these are expected to be provided as part of the input. It could be interesting to foresee some options like these for the OAS for users who would not want to fork repos and/or run their own validator.

Regarding conformance testing, do you think that OAS would be useful as a validation step in multi-step test cases? If yes, do give some thought to what I suggested earlier about supporting our validation service API. If there is indeed a lot of potential, we could even consider contributing such an extension to OAS (especially if OAS can be fully generic - see my comment above).

Great! We are in touch with Frictionless and we are working to bridge DCAT and Frictionless (cc: @mfortini on it). Here you can see how we envision providing JSON-LD information through Frictionless Tableschema using this draft specification.

Sounds good! I'll check this out.

ioggstream commented 2 years ago

do you also allow testing via e.g. REST API calls?

Not now. This would be easy to implement (e.g. wrapping spectral via an API), but the current implementation does not need to process third-party data. All the processing happens on the client side; agencies just need to reference our ruleset URL. Moreover, since we have many agencies using this tool, we don't need to deploy dedicated infrastructure ;)
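
Just to illustrate how small such a wrapper could be, here is an untested sketch around the Spectral CLI (the port, paths and ruleset file are placeholders):

```typescript
// Rough, untested sketch of an HTTP wrapper around the Spectral CLI.
import { createServer } from "node:http";
import { execFile } from "node:child_process";
import { writeFile, unlink } from "node:fs/promises";
import { tmpdir } from "node:os";
import { join } from "node:path";

const RULESET = "./spectral.yml"; // placeholder: a ruleset extending our published one

createServer((req, res) => {
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", async () => {
    // write the submitted OpenAPI document to a temporary file
    const file = join(tmpdir(), `oas-${Date.now()}.yaml`);
    await writeFile(file, body);
    // spectral lint <file> --ruleset <ruleset> --format json
    execFile("spectral", ["lint", file, "--ruleset", RULESET, "--format", "json"],
      async (_err, stdout) => {
        await unlink(file);
        res.setHeader("Content-Type", "application/json");
        res.end(stdout || "[]"); // Spectral prints its findings as a JSON array
      });
  });
}).listen(8080);
```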

It could be interesting for example to allow a user to call your validator and provide the definitions to validate as the payload.

As of now, we imagine achieving this depending on the use case:

  1. when agencies upload their APIs to the National API Catalogue, we'll validate them and add a red/yellow/green badge to the API. This will probably work only for providers.
  2. if a generic user wants to validate content against a specific schema, we can customize the UI to do it. For example, the current UI already validates the example content of each schema. In this case, the challenging part is to avoid processing data which is potentially personal
  3. for public APIs, we can harvest all OAS definitions and run the validator on them, or we could check that every repo has a GitHub Action that runs spectral

This could open up interesting integrations (of your instance and instances set up by other organisations).

This is something we are thinking about and we could discuss together.

Organizations can source our ruleset and atomically disable specific rules and/or add new ones.

So the approach to customise the rules is by forking/cloning the repo and adapting the rules.

Sort of. Actually you can have a one-file repo and source the rules like this

Do you think it would be interesting to also allow rules to be provided on-the-fly when using the validator (as e.g. additions to the "built-in" rules)?

Customization is something we planned but it's on the backlog. If there's a project, imho it's better to create a one-file repo and run spectral with a spectral.yml like the one shown above, so that you can track changes. If the goal is just supporting this feature in the WebUI, I think it's easy to implement, but we should finalize the new rules first:

  1. when loading the ruleset here https://github.com/italia/api-oas-checker/blob/97b708582940dececbf983502094b7fca92f2365/src/spectral/spectral_engine.js you can update the JSON object with data provided from the UI (e.g. references to new rulesets, ...)

Again, the advantage is that all processing happens on the client.

It could be interesting to foresee some options like these for the OAS for users who would not want to fork repos and/or run their own validator.

It would be very nice for general users, and I think it's something that can be tackled together with https://github.com/italia/api-oas-checker/issues/26. For agencies, I expect them to set up even a simple repository with an extended ruleset (e.g. publishing one single file), and we could maybe support referencing an external ruleset in the URL. This is something we can discuss, though.

Regarding conformance testing, do you think that OAS would be useful as a validation step in multi-step test cases?

This is non-trivial. For sure, if an API does not pass e.g. OAS zaproxy/schemathesis validation, it's not interoperable. OTOH if you need to test specific workflows, I think you need other tools (like yours). While I love fuzzing, I don't know whether there are specific tools that can help in fuzz-testing specific workflows. This topic actually requires a dedicated meeting, though.

Great! We are in touch with Frictionless and we are working to bridge DCAT and Frictionless (cc: @mfortini on it). Here you can see how we envision providing JSON-LD information through Frictionless Tableschema using this draft specification.

Sounds good! I'll check this out.

Let me know what you think.

costas80 commented 2 years ago

It could be interesting to foresee some options like these for the OAS for users who would not want to fork repos and/or run their own validator.

It would be very nice for general users, and I think it's something that can be tackled together with https://github.com/italia/api-oas-checker/issues/26.

Good to see you like this idea. In fact, from our experience our generic validators (validators for RDF, CSV, JSON and XML) that don't predefine any rules see quite a bit of use as supporting tools. Usage is, more as you suggest, by general users who are in the process of developing their own specifications. I do believe that offering a lightweight validation approach like this to anyone would be a very nice use case for the OAS validator (or a more generic configuration of it). Note that when I say lightweight, I mean an approach where you don't need to use Docker, create a GitHub repo, or even familiarise yourself with the validator's configuration.

Regarding conformance testing, do you think that OAS would be useful as a validation step in multi-step test cases?

This is non-trivial. For sure, if an API does not pass e.g. OAS zaproxy/schemathesis validation, it's not interoperable.

We have test cases for specifications where, even if an API is expected to be valid before being used to test workflows, there is a first set of sanity checks to ensure that more complex steps are not attempted if the API fails more "basic" validation. Using the OAS validator for such a sanity check would indeed be interesting. Such usage would require an integration in which the OAS validator exposes a "backend" that can be called by the orchestrator (the test engine). Given that all the work happens client-side, this is not an obvious extension. However, thinking about it, I believe this "limitation" is also easily surmountable. Instead of somehow extending the OAS validator itself, one could foresee a validation service adapter that implements the (backend) API needed for use in test cases and that internally uses the OAS validator, creating the necessary configuration on-the-fly based on the received inputs. As we would have full control here, these inputs could include both the actual API definition to validate (provided as-is or as a URI) and the rules themselves (which could be defined at the level of the specific test case, test suite or validator adapter instance).
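
To sketch the core of such an adapter (all names and report fields below are made up for illustration, and this assumes a server-side Spectral run rather than the current in-browser setup), the essential part would be mapping Spectral findings to the kind of report the test engine expects:

```typescript
// Illustrative sketch: map Spectral findings (from a server-side run) to a simple
// validation report. Field and type names are assumptions, not actual GITB or Spectral types.
interface SpectralFinding {
  code: string;
  message: string;
  severity: number;           // 0 = error, 1 = warning, 2 = info, 3 = hint
  path: (string | number)[];  // location of the finding in the document
}

interface ReportItem { level: "error" | "warning" | "info"; description: string; location: string; }
interface Report { result: "SUCCESS" | "FAILURE"; items: ReportItem[]; }

function toReport(findings: SpectralFinding[]): Report {
  const items: ReportItem[] = findings.map((f) => ({
    level: f.severity === 0 ? "error" : f.severity === 1 ? "warning" : "info",
    description: `${f.code}: ${f.message}`,
    location: f.path.join("."),
  }));
  return {
    result: items.some((i) => i.level === "error") ? "FAILURE" : "SUCCESS",
    items,
  };
}
```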

This is a first stab at a design that could support such usage, one that would need no changes to the OAS validator and would not be too difficult to implement. In any case, this is something to discuss further, as I am surely mistaken somewhere due to my currently limited understanding of the OAS validator.

Great! We are in touch with Frictionless and we are working to bridge DCAT and Frictionless (cc: @mfortini on it). Here you can see how we envision providing JSON-LD information through Frictionless Tableschema using this draft specification.

Sounds good! I'll check this out.

Let me know what you think.

I had a read through the information you shared and agree that this indeed has potential. One of the drawbacks of CSV and TableSchema is that you have limited possibilities for explaining what the data actually is (semantically). As an example from our own experience, we host a CSV validator for the Kohesio pilot project for which an additional source of documentation/metadata was needed to explain what the CSV fields mean. An approach like what you suggest would allow capturing this in the validation artefacts themselves and potentially open this up to different kinds of integrations (e.g. live discovery of datasets and conversions to/from CSV/RDF/JSON depending on what is needed).

This is something we are thinking about and we could discuss together.

Indeed there is an interesting follow-up discussion to be had here and on the other points discussed :) If you agree, I would propose you drop an email to DIGIT-ITB@ec.europa.eu and we can then go about planning this.

ioggstream commented 2 years ago

from our experience our generic validators (validators for RDF, CSV, JSON and XML) that don't predefine any rules see quite a bit of use as supporting tools

Did you consider the idea of using in-browser implementations to validate content? This not only spares users from deploying containers, but even the EC can avoid doing so.

Given that all the work happens client-side this is not an obvious extension. However, thinking about this I believe that this "limitation" is also easily surmountable.

Yes. A web UI can even validate locally and invoke the API only once the spec passes local validation, thus reducing the infrastructure load on the EC platform.

we host a CSV validator for the Kohesio pilot project for which an additional source of documentation/metadata was needed to explain what the CSV fields mean. An approach like what you suggest would allow capturing this in the validation artefacts themselves

You could even generate a human-readable description of the rules out of the (tableschema + linked data keywords) specification.
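
For instance, something along these lines (a toy sketch; the field properties follow the Frictionless Table Schema `fields` array, and `rdfType` is used here as an example of a linked-data annotation):

```typescript
// Toy sketch: render a Frictionless Table Schema (plus a linked-data annotation)
// as a human-readable field list.
interface TableSchemaField {
  name: string;
  type?: string;
  title?: string;
  description?: string;
  rdfType?: string;                     // e.g. a URI identifying the semantic concept
  constraints?: { required?: boolean };
}

function describeSchema(schema: { fields: TableSchemaField[] }): string {
  return schema.fields
    .map((f) => {
      const required = f.constraints?.required ? "required" : "optional";
      const extras = [f.title, f.description, f.rdfType ? `semantics: ${f.rdfType}` : undefined]
        .filter((p): p is string => Boolean(p))
        .join(". ");
      return `- ${f.name} (${f.type ?? "any"}, ${required})${extras ? ": " + extras : ""}`;
    })
    .join("\n");
}
```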

costas80 commented 2 years ago

Did you consider the idea of using in-browser implementations to validate content? This not only spares users from deploying containers, but even the EC can avoid doing so.

True, and this is something I find quite nice in your implementation. For the validators that we provide, this was not possible, as we started from requests primarily for machine-to-machine integration (REST and SOAP APIs). The validators' UIs, although quite popular today, were initially a secondary goal.