camaraproject / Commonalities

Repository to describe, develop, document and test the common guidelines and assets for CAMARA APIs
Apache License 2.0

Enhancement of the Testing Guidelines #158

Closed: jlurien closed this issue 4 weeks ago

jlurien commented 4 months ago

Problem description

The first version of the Testing Guidelines has to be enhanced with more detailed instructions to have consistent Test Plans across the WGs.

Possible evolution

Additional context

We include here a draft of the proposal, to trigger the discussion. When we reach enough consensus on the approach we can create a PR with the modifications.


## Proposal

Testing implementations can use the Gherkin feature files with two different approaches: as human-readable documentation guiding a manual implementation of the tests, or as executable artifacts run directly by BDD tools (Behave, Cucumber, etc.).

### Design principles

### Feature structure

A feature file will typically test an API operation and will consist of several scenarios testing the behaviour of the API operation under different conditions or input content, validating that the response complies with the expected HTTP status, and that the response body meets the expected JSON schema and that some properties have the expected values.
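As an illustrative sketch of that structure (the API name, operation and step wordings here are assumptions, not agreed steps):

```
Feature: CAMARA Example API, v1.0 - Operation createSession

  Scenario: Successful request returns 201 and a valid response body
    Given a valid request body prepared for the operation
    When the HTTP "POST" request is sent
    Then the response status code is "201"
    And the response body complies with the expected JSON schema
    And the response property "$.status" is "ACTIVE"
```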

### Configuration variables

Most scenarios will test a request and its response. Commonly, values to fill request bodies will not be known in advance, as they will be specific to the test environment, and they will have to be provided as a separate set of configuration variables.

A first stage is to identify those variables, e.g. device identifiers (phone numbers, IP addresses), the status of a testing device, etc. How those variables are set and fed into the test execution will depend on the testing tool (Postman environment, context in Behave, etc.).

In order to pass information between steps and make use of the variables, we have to agree on some syntax to refer to those variables. For example, Postman uses {{variable}} and Gherkin uses <variable> for Scenario Outlines, but this is not properly a Scenario Outline case. As a proposal, we may use something like [CONFIG:var].

Example:

  Scenario: Description
    Given the configuration variables:
      | variable  | description                           |
      | device    | Object identifying a device           |
    And the request body:
      """
      {
        "device": [CONFIG:device],
        ...
      }
      """
    When the HTTP "POST" request is sent
    Then the response status code is "200"
    And the response JSON field "$.device" is "[CONFIG:device]"

A Background section at the beginning of the feature file may set common configuration variables and attributes for all scenarios, e.g. apiServer, baseUrl, resource, etc.
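For instance, a minimal sketch (the variable names and step wordings are placeholders):

```
  Background: Common setup for all scenarios
    Given the configuration variables:
      | variable  | description                       |
      | apiServer | Base URL of the server under test |
    And the resource "/example-api/v1/operation"
    And the header "Content-Type" is set as "application/json"
```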

### Request setup

Typically the scenario will have to set up the request as part of the Given steps, filling the necessary path, query, header or body parameters. The guidelines can define a set of reusable steps for this (combined in a sketch after the list), e.g.

- the resource "{url_path}"
- the resource "{url_path_str_format}" is parameterized with values:
    | param_name | param_value |
    | ---------- | ----------- |
    | xxx        | yyy         |
- the header "{name}" is set as "{value}"
- the headers:
    | param_name | param_value |
    | ---------- | ----------- |
    | xxx        | yyy         |
- the query parameter "{name}" is set as "{value}"
- the query parameters:
    | param_name | param_value |
    | ---------- | ----------- |
    | xxx        | yyy         |
- the request body field "{json_path}" is set as "{value}"
- the request body:
    ```
    {
      "xxx": "yyy"
    }
    ```
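Combined, a request setup might then look like this (the resource, parameter and variable names are illustrative):

```
  Scenario: Retrieve an item
    Given the resource "/example-api/v1/items/{itemId}" is parameterized with values:
      | param_name | param_value |
      | itemId     | 1234        |
    And the header "Authorization" is set as "Bearer [CONFIG:accessToken]"
    And the query parameter "fields" is set as "status"
    When the HTTP "GET" request is sent
    Then the response status code is "200"
```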

### Request sending

Usually only one When step will be necessary:

When the HTTP "{method}" request is sent


For complex scenarios chaining several requests, subsequent requests will usually be included in Then steps, after the response to the first one is validated.
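For example, a chained scenario might be sketched like this (the step passing the response id into the next request is an assumption):

```
  Scenario: A created resource can be retrieved afterwards
    Given the request body "[CONFIG:validResourceBody]"
    When the HTTP "POST" request is sent
    Then the response status code is "201"
    And the resource "/example-api/v1/resources/{id}" is parameterized with the response property "$.id"
    And the HTTP "GET" request is sent
    And the response status code is "200"
```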

### Response validation

Several Then steps can validate the response. Some may be quite common, e.g.:
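For illustration, reusing the template style of the request setup steps (the response header and schema steps are assumptions, not agreed wordings):

```
  Then the response status code is "{status_code}"
  And the response header "{name}" is "{value}"
  And the response body complies with the expected JSON schema
  And the response property "{json_path}" is "{value}"
```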

Others may be quite specific to the API logic and will have to be designed ad hoc.

## Open points

Some to start with...

### Device identification

Identification of the device. There are many possible combinations that comply with the Device schema, and there are no clear guidelines about which ones are required in order to pass the certification.
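For illustration, each of the following fragments would comply with the schema (property names taken from the examples later in this thread; values are made up):

```
  { "phoneNumber": "+123456789" }
  { "ipv4Address": { "publicAddress": "84.125.93.10", "publicPort": 59765 } }
  { "phoneNumber": "+123456789", "ipv4Address": { "publicAddress": "84.125.93.10", "publicPort": 59765 } }
```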

This topic links with the discussion in https://github.com/camaraproject/Commonalities/issues/127

### Scope/reach of the testing plan

Does the Test Plan have to test only that the interface complies with the spec, or does it also have to test that the service is correctly provided? For example:

mdomale commented 3 months ago

@jlurien Below are a few thoughts from our side:
Configuration variables

- We recommend using Gherkin syntax for variables; common setup can be achieved using a Background section.
- We can consider using a separate configuration file which will have values defined for variables of multiple feature files.
- For payloads we can consider separate JSON files to be used for configuration. We can use a Scenario Outline with Examples for different variable values.

Request setup

It can be achieved as part of a Background step, but we cannot guarantee the same setup (payload) for different methods of the same API.

For Open points

We have to assume that the device object is a configuration variable and each environment will provide a valid device.

Scope of Test plan

We can have a recommendation to validate the response code and mandatory response parameters. A decision has to be taken on whether validating one possible case is enough, or all possible sets of values.

@akoshunyadi @shilpa-padgaonkar

bigludo7 commented 3 months ago

Hello. Compiled with @patrice-conil and @GuyVidal for the Orange perspective.

First thanks @jlurien for the proposal.

Configuration variables +1 to @mdomale points

Request set-up We're fine with @jlurien's proposal globally. Perhaps we could make some adjustments after first use/examples.

Request Sending OK for us

Response validation OK for us

Open Points Device Identification For us we must follow what has been described in #127. We can probably contribute a Gherkin file to describe this.

Scope/reach of the testing plan We test only the interface and not the service itself.

jlurien commented 3 months ago

Thanks for the feedback @mdomale. Some comments/questions below:

> @jlurien Below are a few thoughts from our side: Configuration variables - We recommend using Gherkin syntax for variables; common setup can be achieved using a Background section.

Do you mean using `<variable>`? The advantage of this is that Gherkin formatters work better with it, but as it is reserved for Scenario Outlines, some tools may expect an Examples section with values to be substituted, which will not be available while defining the test plan.

> We can consider using a separate configuration file which will have values defined for variables of multiple feature files.

In general, values for testing variables will be provided by the implementation to the tester for a certain environment. We may consider defining a template with the set of variables that have to be provided.

> For payloads we can consider separate JSON files to be used for configuration. We can use a Scenario Outline with Examples for different variable values.

Moving the request bodies to separate files is a possibility. The advantage is that they can be used as values for Scenario Outlines, but it will require maintaining many independent files for a test plan and defining clear file naming rules.

> Request setup: It can be achieved as part of a Background step, but we cannot guarantee the same setup (payload) for different methods of the same API.

Agree. We can only set up a generic request body in the Background, which may work for some generic scenarios testing generic errors, e.g. 401 without an Authorization header. We may define a default request body in the Background and then allow scenarios to override it.
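A sketch of that idea (the step wordings, in particular "default request body", are assumptions):

```
  Background:
    Given the resource "/example-api/v1/operation"
    And the default request body "[CONFIG:defaultRequestBody]"

  # Uses the default request body from the Background
  Scenario: Request without Authorization header
    Given the header "Authorization" is not sent
    When the HTTP "POST" request is sent
    Then the response status code is "401"

  # Overrides the default request body for this specific case
  Scenario: Request with a scenario-specific body
    Given the request body:
      """
      { "xxx": "yyy" }
      """
    When the HTTP "POST" request is sent
    Then the response status code is "200"
```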

> For Open points: We have to assume that the device object is a configuration variable and each environment will provide a valid device.

Testing of device is particularly complex and will have to be aligned with the outcome of the discussion in #127. We may need specific scenarios to test the behaviour agreed in #127, and for other scenarios that do not test device support, just assume that a valid device object is provided as a config variable.

> Scope of Test plan: We can have a recommendation to validate the response code and mandatory response parameters. A decision has to be taken on whether validating one possible case is enough, or all possible sets of values.

In the test plan we can provide different inputs to test different behaviours, but this would not test that the implementation is right or that it is able to provide all possible responses. For example, in location-verification, if an implementation always answers with verificationResult: TRUE or FALSE, but never answers with PARTIAL or UNKNOWN, that would be difficult to test, unless we add some precondition to a test scenario requiring input values that force that response.
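For instance, such a precondition might be sketched like this (the Given steps are assumptions about what the test environment can provide):

```
  Scenario: Partial match when the device is only partially inside the requested area
    # Requires the tester to choose a device and an area that overlap only partially
    Given a device whose location overlaps the border of the requested area
    And a valid request body with that device and area
    When the HTTP "POST" request is sent
    Then the response status code is "200"
    And the response property "$.verificationResult" is "PARTIAL"
```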

@akoshunyadi @shilpa-padgaonkar

jlurien commented 3 months ago

Thanks @bigludo7, please see my comments inline:

> Hello. Compiled with @patrice-conil and @GuyVidal for the Orange perspective.

> First thanks @jlurien for the proposal.

> Configuration variables +1 to @mdomale points

> - about keeping standard Gherkin syntax: no use of the specific [CONFIG:var] grammar, but use `<variable>`
> - environment values must be set in the Background. Let's keep it standard.

Commented above. Happy to keep it standard, but we'll have to figure out a way to express in our Gherkin files that the value for a certain variable will be provided separately. We may use a separate template file and refer to it, or maybe write Scenario Outlines with placeholders. Tools that automate the execution of feature files may have problems with some of these approaches. Any feedback on how to handle this is welcome.

> Request set-up We're fine with @jlurien's proposal globally. Perhaps we could make some adjustments after first use/examples.

> Request Sending OK for us

> Response validation OK for us

> Open Points Device Identification For us we must follow what has been described in #127. We can probably contribute a Gherkin file to describe this.

Agree. It is key to close #127, and I would isolate the testing of transversal device particularities from other, more API-specific logic.

> Scope/reach of the testing plan We test only the interface and not the service itself.

In the first iterations, I think that this is enough. In more mature phases we may try to test that service implementations follow the agreed implementation guidelines.

jlurien commented 3 months ago

To move the discussion further with some examples:

One of the main decisions to make is the level of detail for each scenario, especially regarding the preconditions to set up the scenario. For example, for an API with device in the input, there should be a case to test that there is compliance with the schema, but there are many possible variations for a wrong request body.

Is it enough to design something like option 1, or should we try to achieve something more similar to option 2?


  # Assuming that a default valid request body is set up in the Background

  # Option 1: High level step, not indicating how to write the test.
  # Tester can decide how to build the request body and how many cases to test
  Scenario: Validate that device complies with the schema
    Given the request body property "$.device" does not comply with the schema
    When the HTTP "POST" request is sent
    Then the response status code is 400
    And the response property "$.status" is 400
    And the response property "$.code" is "INVALID_ARGUMENT"
    And the response property "$.message" contains a user friendly text

  # Option 2: Detailed steps with explicit content to test.
  # Implementations will know in advance the level of testing
  Scenario Outline: Validate that device phoneNumber complies with the schema
    Given the request body property "$.device.phoneNumber" is set to <value>
    When the HTTP "POST" request is sent
    Then the response status code is 400
    And the response property "$.status" is 400
    And the response property "$.code" is "INVALID_ARGUMENT"
    And the response property "$.message" contains a user friendly text

    Examples:
      | value                                                   |
      | foo                                                     |
      | *092828#                                                |
      | 1234567890                                              |
      | 12ft23333                                               |
      | ""                                                      |
      | +178931297489º17249017409º70937498º73297932790723097091 |

  Scenario Outline: Validate that device ipv4Address complies with the schema
    Given the request body property "$.device.ipv4Address" is set to
      """
        { 
          "publicAddress": <publicAddress>,
          "publicPort": <publicPort>
        }
      """
    When the HTTP "POST" request is sent
    Then the response status code is 400
    And the response property "$.status" is 400
    And the response property "$.code" is "INVALID_ARGUMENT"
    And the response property "$.message" contains a user friendly text

    Examples:
      | publicAddress | publicPort |
      | 1.2.3.4.5     | 1234       |
      | foo           | 1234       |
      | 1.2.3.4       | foo        |

    # etc, there will be lots of scenarios
jlurien commented 2 months ago

Please take a look at the example in https://github.com/camaraproject/DeviceLocation/pull/189, which illustrates the proposal here.

bigludo7 commented 2 months ago

> Please take a look at the example in camaraproject/DeviceLocation#189, which illustrates the proposal here.

Hello @jlurien - From the Orange side (checked with @patrice-conil) we're OK with your proposal provided in the Device Location project. Thanks.

mdomale commented 1 month ago

> To move the discussion further with some examples:
>
> One of the main decisions to make is the level of detail for each scenario, especially regarding the preconditions to set up the scenario. [...]
>
> Is it enough to design something like option 1, or should we try to achieve something more similar to option 2?

Option 1 is preferable for us, to ensure we provide flexibility to the test implementer and do not restrict the set of input values to static ones. @akoshunyadi @shilpa-padgaonkar