camaraproject / WorkingGroups

Archived repository used previously by CAMARA Working Groups.
https://wiki.camaraproject.org/display/CAM/CAMARA+Working+Groups

Concept on how to deliver test cases for contributed Camara APIs #61

Closed: shilpa-padgaonkar closed this issue 1 year ago

shilpa-padgaonkar commented 2 years ago

In what format are the test cases expected?

Kevsy commented 2 years ago

I can think of:

- **Conformance tests.** These validate the request and response syntax of an API, including response codes and exceptions.
- **Documentation conformance tests.** Could be done by parsing the .md files to ensure the markdown template is followed.
- **Penetration tests.** Probably out of scope for CAMARA, but essential for implementers.
- **Performance tests.** Out of scope.
- **Privacy-by-design and security-by-design reviews.** To be discussed; could be a manual review?
- **User acceptance testing.** In this case the user is the developer. Ideally API developers (and not the API authors) will review.

Others?

shilpa-padgaonkar commented 1 year ago

Use the suggestions and feedback provided here: contribute a first set of test cases for one of the subprojects, covering one or more of the categories listed above, and then derive a (guidelines) doc from this contribution.

jordonezlucena commented 1 year ago

We propose to use Gherkin as the DSL for the ATP. This DSL allows (via other tools, at each operator's discretion) automating the execution of test campaigns. An example of a baseline ATP using Gherkin is provided in PR #102.
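For readers unfamiliar with the format, here is a minimal sketch of what such a Gherkin scenario could look like (the operation, fields and values are hypothetical and are not taken from PR #102):

```gherkin
Feature: QoD session creation (illustrative sketch only)

  Scenario: Successful creation of a QoD session
    Given the API consumer has a valid access token
    And a device identified by a public IPv4 address
    When the consumer requests a QoD session with QoS profile "QOS_E"
    Then the response status code is 201
    And the response body contains a session id
    And the session qosStatus is "REQUESTED"
```

Each step is later bound to executable code by whatever tool an operator picks, which is what keeps the DSL itself automation-friendly.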

@Kevsy: in relation to the tests you have listed above, here's our feedback.

Kevsy commented 1 year ago

Thanks @jordonezlucena

For 'user acceptance testing': the 'user' here is the API consumer. So UAT is to make sure the API is useful and usable; otherwise we get no feedback and end up publishing the API and hoping it will be used.

An example could be that the network operators define an API using certain operations, data types, error messages, etc. that cause problems for the API consumer, in which case the UAT feedback would be used to reconsider the API definition.

I see UAT as an iterative process (before and after the conformance test) with the result that we have a useful/conformant spec.

shilpa-padgaonkar commented 1 year ago

@jordonezlucena: The proposal to use Gherkin as the DSL is fine from the DT side.

jordonezlucena commented 1 year ago

Thanks both for the feedback. @shilpa-padgaonkar: since QoD is the most advanced API as of today, do you think a first example of an ATP can be provided by DT in the near future? Would it be using Gherkin?

In the meantime, my take is to draft a (live) table where we specify: 1) the DSL for testing, and 2) the tests in scope and out of scope in CAMARA. Does that make sense?

patrice-conil commented 1 year ago

Hi @all, as the Karate framework is based on Gherkin and is independent of the programming language, I can share a first sample of BDD tests if you think it can help progress on this subject.
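To make this concrete: Karate tests are themselves Gherkin feature files whose steps are interpreted directly by the framework, so no step-definition code has to be written. A hypothetical sketch (the URL, path and payload are invented for illustration, not taken from the actual sample):

```gherkin
Feature: QoD session conformance (hypothetical Karate sketch)

  Background:
    # baseUrl would typically be provided via karate-config.js
    * url baseUrl

  Scenario: Creating a session returns 201 and a session id
    Given path 'sessions'
    And request { duration: 3600, qosProfile: 'QOS_E' }
    When method post
    Then status 201
    And match response.id == '#uuid'
```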

jordonezlucena commented 1 year ago

> Hi @all, as the Karate framework is based on Gherkin and is independent of the programming language, I can share a first sample of BDD tests if you think it can help progress on this subject.

Looking forward to it.

patrice-conil commented 1 year ago

The pull request is here: https://github.com/camaraproject/WorkingGroups/pull/148

Feel free to take a look.

mdomale commented 1 year ago

The Cucumber framework is also based on Gherkin; we are sharing samples of QoD Cucumber tests, as well as a comparison between Cucumber and Karate: cucumber.zip, Comparing+Cucumber+and+Karate.docx

patrice-conil commented 1 year ago

@mdomale, I agree with you that Cucumber is more popular than Karate; it can be used to define and test just about anything, and we use it in many projects.

For me the question is not "what is the best framework to test my implementation by myself?" but "which framework makes it easy for someone else to test their implementation for conformance using the suite I provide?"

I think that Cucumber has a few downsides that you didn't mention in your comparison.

A Karate collection can be run in VS Code without any knowledge of the programming language, and it can also mock the NEF. As a developer of a new API implementation, if I want to test its conformance using an external test suite, I expect it to be as easy as possible.

Maybe I'm trying to answer the wrong question, let me know if that's the case.
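On the mocking point: in Karate the mock itself can also be a Gherkin feature file, served by Karate's built-in mock server. A hypothetical sketch (paths and payloads invented for illustration):

```gherkin
Feature: Hypothetical NEF mock, runnable with Karate's mock server

  Scenario: pathMatches('/sessions') && methodIs('post')
    # canned response for session creation
    * def response = { id: 'mock-session-1', qosStatus: 'REQUESTED' }
    * def responseStatus = 201

  Scenario:
    # catch-all for any unmatched request
    * def responseStatus = 404
```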

sfnuser commented 1 year ago

@patrice-conil - Thanks for your PR, and @mdomale - thanks for your references. Both are a useful read.

A few comments from my side:

  1. Is the framework we choose binding on all CAMARA sub-projects?
  2. Karate seems language-agnostic, but it still depends on Java; the standalone tool is still a jar file. Which brings me back to the question above: "which framework makes it easy for someone else to test their implementation for conformance using the suite I provide?" - Is this the question this GitHub issue is addressing?

shilpa-padgaonkar commented 1 year ago

@sfnuser: As test artifacts are expected to be in the main subproject repos, there is at least a desire to align on the framework/tools to be used for them (and yes, across subprojects). We can discuss whether it makes sense to let provider implementation repos supplement this with their individual test artifacts, in which case they could also use the tools of their choice.

But for the main subproject repos, it would be useful to have a common alignment. Let us know your view.

patrice-conil commented 1 year ago

@sfnuser, Karate needs a Java runtime to run the tests, but you don't need any Java coding skills to use it. That's why I called it "language agnostic"... despite the fact that the Karate config uses JavaScript.

Are we targeting validation for API consumers or API providers or both?

sfnuser commented 1 year ago

@shilpa-padgaonkar Thanks. If there is a common framework/tooling, as part of the main repo, that targets API consumers or API providers validating their implementation, I am OK with any of the above-mentioned tools.

I believe provider implementations can add their own set of tests that suit their needs, within their own repos. I don't think we need commonalities across PIs on this aspect.

@patrice-conil OK. The Karate framework does seem to recommend Java or JS when the logic gets complex. I suppose that does not matter if it is part of a top-level suite as mentioned above. As an API provider or consumer, I can always spin up a Docker instance running the suite and test against it.

> Are we targeting validation for API consumers or API providers or both?

Good question. I think it is nice to have both. However, if we have to choose one at this point, API providers would be my pick. It would put the onus on all of us to consistently upgrade and maintain :-) and API consumer devs are free to choose PIs.

mdomale commented 1 year ago

@patrice-conil The Cucumber framework makes it easy for everyone to test their implementation for conformance, because the feature file is readable and easy to understand, and step definitions can be reused when scenarios need to be extended. As Cucumber is a well-known and widely used framework, it makes both contribution and adoption more effective for everyone.
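To illustrate the reuse argument with a hypothetical example: the same Gherkin step can appear in many scenarios while being implemented only once in the step-definition code.

```gherkin
Feature: QoD session lifecycle (hypothetical example of step reuse)

  Scenario: Create a session
    Given an authenticated API consumer
    When the consumer creates a QoD session
    Then the response status code is 201

  Scenario: Delete a session
    Given an authenticated API consumer
    And an existing QoD session
    When the consumer deletes the session
    Then the response status code is 204
```

Here "Given an authenticated API consumer" is written once as a step definition and reused by both scenarios; extending the suite with new scenarios mostly means recombining existing steps.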

patrice-conil commented 1 year ago

@mdomale, If the idea is to deliver a full project structure with features, steps, and probably a WireMock to simulate the NEF, I think we need to agree on which technical stack to work with:

mdomale commented 1 year ago

Although we don't want to restrict the use of different tools (to ensure maximum flexibility), we would prefer the stack below:

mdomale commented 1 year ago

@patrice-conil @sfnuser Can you please confirm that you are fine with the above stack, or share any concerns?

patrice-conil commented 1 year ago

@mdomale, I prefer Gradle and Kotlin, but I can live with Maven and Java. :) Since this is a full Java stack project, I think we will need to provide a Docker/k8s image so that non-Java users can run the test suite.

sfnuser commented 1 year ago

@mdomale

As @patrice-conil mentioned, if instructions to dockerize the test suite are also provided, I have no further concerns. Cheers.

mdomale commented 1 year ago

@sfnuser @patrice-conil Thanks a lot for your responses. Yes, we can provide a Docker/k8s image. @shilpa-padgaonkar @akoshunyadi

mdomale commented 1 year ago

@patrice-conil @sfnuser We have created a draft pull request with our initial contribution for the QoD APIs; the WireMock integration will be done shortly.
https://github.com/camaraproject/QualityOnDemand/pull/134 @akoshunyadi @shilpa-padgaonkar

mdomale commented 1 year ago

@patrice-conil @sfnuser @jlurien The PR has been raised for review: https://github.com/camaraproject/QualityOnDemand/pull/134 @akoshunyadi @shilpa-padgaonkar

jpengar commented 1 year ago

Providing API test cases is included as one of the mandatory items in the API Readiness minimum criteria checklist defined in Commonalities.

These checklist steps are part of the requirements for generating a stable release v1.0.0 in CAMARA, as is being discussed in Commonalities: https://github.com/camaraproject/WorkingGroups/issues/139

As @jordonezlucena mentioned above, we propose to use Gherkin as the DSL for the ATP. This DSL allows (via other tools, at each operator's discretion) automating the execution of test campaigns. And perhaps specific test implementations using Cucumber (or other frameworks) could be part of the provider implementation repos, if applicable.

@rartych @shilpa-padgaonkar @jordonezlucena @jlurien @patrice-conil @sfnuser @mdomale @hdamker To satisfy step 5 ("API test cases and documentation") of the API Readiness checklist, could we conclude that the mandatory requirement is to provide the `.feature` Gherkin file describing the test scenarios, and leave implementation specifics to implementation repositories? Is this correct? Or does this need further discussion/clarification? Do you expect the implementation of the test plan to also be a mandatory deliverable in the main API subproject?

jordonezlucena commented 1 year ago

@jpengar: thanks for picking this issue up again. Your proposal LGTM.

shilpa-padgaonkar commented 1 year ago

@jpengar and @jordonezlucena: This issue was already resolved with agreement from multiple participants (see comments https://github.com/camaraproject/WorkingGroups/issues/61#issuecomment-1467809859 and https://github.com/camaraproject/WorkingGroups/issues/61#issuecomment-1468299304).

Based on this agreement, a PR was created in QoD subproject which was reviewed, approved and merged.

I see now that @jpengar has requested two new things:

1. Extending the already-made contribution with the Gherkin feature file: I would rather open a new issue for this. DT will then create a PR in the QoD subproject to add the feature file.
2. Moving the test implementation out of the main repo: at least for QoD, all three current providers (DT, Orange and SpryFoxNetworks) are fine with having a common implementation as currently provided. If you are still keen to move this out, I would recommend starting a separate discussion and trying to reach a new consensus there.

@jpengar @jordonezlucena: Please provide feedback on whether you are OK with the recommendations above.

jpengar commented 1 year ago

@shilpa-padgaonkar In my personal defense (kidding), I will say that the issue was still open when I got to it, and it was referenced again only a few days ago... :)

Anyway, what I wanted to know is what exactly the requirement is to fulfill step 5 ("API test cases and documentation") of the API Readiness checklist, and whether providing the .feature file is good enough. As mentioned above, the .feature file is the Gherkin file that describes the test scenarios. And this is not a new request: if you check PR https://github.com/camaraproject/QualityOnDemand/pull/134, the .feature file is provided as part of the pull request as the test case definition. But apart from that, the test plan implementation is also provided, using the Cucumber framework (test runner, test step implementations in Java, etc.).

As for moving the test case implementation into provider implementation repos: it makes sense to me, but if it was agreed to include it in the main project, I would like to know whether it is a mandatory requirement to satisfy the corresponding step in the API Readiness checklist, or whether a .feature file describing the test cases is good enough as a prerequisite for generating a first stable API version v1.0.0.

shilpa-padgaonkar commented 1 year ago

@jpengar : :) no worries. You are right, the Gherkin feature file is already in, my bad.

I would say that the feature file would fulfill the requirements from the minimum readiness checklist. But this is just my personal opinion, and we could check in the group for feedback from others.

rafpas-tim commented 1 year ago

Since a Gherkin .feature file is a high-level description of a specific set of features/behaviours that the API implementation provides as user experience, @FabrizioMoggio and I are fine with adding it to the related subproject (e.g. TrafficInfluence).

In this way it will represent a useful document of the API's "internal" behaviour, which is hidden if you rely only on OpenAPI specs, and it will be easier and faster to assess the minimum criteria without being an expert in BDD or Gherkin.

Consequently, the test case implementation (Cucumber), which is tightly coupled with the reference implementation (preconditions, states, mocks, etc.), will be hosted in the implementation repo.

If we are understanding BDD correctly: first write the .feature file, then approve it, then write the implementation code.
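A hypothetical TrafficInfluence-flavoured sketch of that split, with the behaviour description living in the subproject repo and the bindings elsewhere:

```gherkin
# Lives in the API subproject repo (e.g. TrafficInfluence):
# pure behaviour description, not bound to any particular implementation.
Feature: Traffic influence resource lifecycle (hypothetical example)

  Scenario: Consumer creates a traffic influence resource
    Given the API consumer is authorized
    When a traffic influence resource is created for an application
    Then the resource state eventually becomes "active"

# The step definitions binding these lines to HTTP calls, mocks and
# preconditions (Cucumber, Karate, ...) would live in the implementation repo.
```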

shilpa-padgaonkar commented 1 year ago

Thank you @jpengar @rafpas-tim @FabrizioMoggio for your feedback.

Would like to propose the following:

  1. Everyone seems to agree that a .feature file in the main subproject repo will help with compliance with the minimum checklist criteria document.
  2. If there is a test case implementation that is aligned on and accepted in the subproject, we could host it in the main subproject repo, as we do in QoD.
  3. Provider implementers are still free to provide additional test implementations within their individual repos.

Would this work? If it is still important to you that we move the QoD test implementation to the provider implementation repo, we can also do that; I would be fine either way. But I see more chances of people wanting to contribute just the test implementation and not a full reference implementation (currently we only have one for QoD), in which case we would need to add a new provider implementation repo in CAMARA just to host the test implementation.

jpengar commented 1 year ago

> Thank you @jpengar @rafpas-tim @FabrizioMoggio for your feedback.
>
> Would like to propose the following:
>
> 1. Everyone seems to agree that a .feature file in the main subproject repo will help with compliance with the minimum checklist criteria document.
> 2. If there is a test case implementation that is aligned on and accepted in the subproject, we could host it in the main subproject repo, as we do in QoD.
> 3. Provider implementers are still free to provide additional test implementations within their individual repos.
>
> Would this work? If it is still important to you that we move the QoD test implementation to the provider implementation repo, we can also do that; I would be fine either way. But I see more chances of people wanting to contribute just the test implementation and not a full reference implementation (currently we only have one for QoD), in which case we would need to add a new provider implementation repo in CAMARA just to host the test implementation.

It makes sense to me. Thank you @shilpa-padgaonkar

shilpa-padgaonkar commented 1 year ago

@jpengar : Thanks for your feedback.

@rafpas-tim @FabrizioMoggio : Could you kindly provide your feedback? If you are ok with the proposal, we can go ahead and close this issue.

FabrizioMoggio commented 1 year ago

This is fine with me.