cta-wave / dpctf-test-runner

Repo for the DPCTF Test Runner

include any non-default settings from OF config.ini file in the report #21

Open jpiesing opened 2 years ago

jpiesing commented 2 years ago

Although it is not encouraged, organisations running the tests can edit the configuration files. One likely reason for this is an implementation that fails tests due to dropped frames - the configuration file can be edited to permit dropped frames at the start, in the middle or at the end. Service providers may accept this. When someone submits a test report to a service provider / platform, this should include any non-default values for the configuration.

louaybassbouss commented 2 years ago

@jpiesing in the deploy project we have the option to override the default test configuration --> https://github.com/cta-wave/dpctf-deploy#mapping-new-content-into-the-container. We can update the text in that section to make it clearer. WDYT?

jpiesing commented 2 years ago

Adding @yanj-github.

This relates to the config.ini from the OF, specifically settings like these:

[TOLERANCES]
start_frame_num_tolerance = 0
end_frame_num_tolerance = 0
mid_frame_num_tolerance = 10
splice_start_frame_num_tolerance = 0
splice_end_frame_num_tolerance = 0

Perhaps the OF could send this file to the test runner, which could append it at the end of the report? Alternatively, just the differences from the defaults could be appended, which would be clearer for service providers or platforms reading the report.
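
A minimal sketch of the "differences only" idea, assuming the defaults are known to the script; the DEFAULTS values below simply mirror the excerpt above, and the function name is illustrative:

# Sketch: keep only the [TOLERANCES] settings that differ from the OF defaults.
# DEFAULTS is illustrative here; the real defaults ship with the OF.
import configparser

DEFAULTS = {
    "start_frame_num_tolerance": "0",
    "end_frame_num_tolerance": "0",
    "mid_frame_num_tolerance": "10",
    "splice_start_frame_num_tolerance": "0",
    "splice_end_frame_num_tolerance": "0",
}

def non_default_tolerances(path="config.ini"):
    config = configparser.ConfigParser()
    config.read(path)
    tolerances = config["TOLERANCES"] if "TOLERANCES" in config else {}
    # Only the values an organisation has changed would be appended to the report.
    return {key: value for key, value in tolerances.items()
            if DEFAULTS.get(key) != value}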

louaybassbouss commented 2 years ago

I propose an extension of the JSON WPT report structure to include additional parameters:

  1. Date/time when the test is run and when the observation is run
  2. OF configuration from config.ini

Something like this:

{
    "datetime_test_run": "<UTC Date Time>",
    "datetime_observation": "<UTC Date Time>",
    "observation_config": {
          "start_frame_num_tolerance": 0,
           ....
    }
}
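
For the two date/time fields, a minimal way to produce the UTC timestamps in Python is sketched below; the "YYYY-MM-DD HH:MM:SS" format matches a report excerpt later in this thread, but treat the exact format as an assumption rather than something agreed here:

# Sketch: produce a UTC timestamp for "datetime_test_run" / "datetime_observation".
from datetime import datetime, timezone

def utc_now_string():
    # Format is an assumption, e.g. "2023-10-17 16:08:28".
    return datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
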
yanj-github commented 2 years ago

@louaybassbouss One question: are these additional parameters one set per test or one set per test session (which may contain more than one test), please?

yanj-github commented 2 years ago

@jpiesing The OF already embeds the tolerances in the result message, e.g.: "message": " First frame found is 5, expected to start from 1. First frame number tolerance is 0. Mid frame number tolerance is 10. Following frames are missing: 12 Total of missing frames is 5." The relevant tolerances are presented with the related observation, such as the start/end tolerances for the every-sample-rendered observation and the startup delay tolerance for the startup delay observation.
e.g.: "Maximum permitted startup delay is 120ms. The presentation start up delay is xxxms"

Is this good enough for you, or do you still want this to be added in a different structure, please?

louaybassbouss commented 2 years ago

> @louaybassbouss One question: are these additional parameters one set per test or one set per test session (which may contain more than one test), please?

@yanj-github my idea is to add these parameters to the same JSON file you upload via the Results API. This means one set for each group of tests in a folder.

jpiesing commented 2 years ago

October 12th meeting: OK as proposed. Eurofins to implement in the OF first.

yanj-github commented 2 years ago

@louaybassbouss Will the TR include the following entries with default values in the result file, please? "datetime_observation": "", "observation_config": {}

I am not sure whether the OF will need to add the "datetime_observation" and "observation_config" entries, and if so, at which level. Or should the OF just look for "datetime_observation" and "observation_config" and change the values, please?

louaybassbouss commented 2 years ago

> @louaybassbouss Will the TR include the following entries with default values in the result file, please? "datetime_observation": "", "observation_config": {}
>
> I am not sure whether the OF will need to add the "datetime_observation" and "observation_config" entries, and if so, at which level. Or should the OF just look for "datetime_observation" and "observation_config" and change the values, please?

@yanj-github below is an example of a simple test report generated with the current test runner.

{ 
   "results": [
        {
            "test": "/avc_12.5_25_50-2021-07-07-local/playback-of-encrypted-content-https__t1-cenc.html",
            "status": "TIMEOUT",
            "message": null,
            "subtests": [
                {
                    "status": "TIMEOUT",
                    "xstatus": "SERVERTIMEOUT"
                }
            ]
        }
    ]
}

And this is the same example with the meta information you can upload via the API. This means you need to add the meta element to the JSON, since it does not exist yet, but I recommend that your implementation checks whether the meta element exists and only creates a new one if it doesn't (see the sketch after the example below).

{ 
    "results": [
        {
            "test": "/avc_12.5_25_50-2021-07-07-local/playback-of-encrypted-content-https__t1-cenc.html",
            "status": "TIMEOUT",
            "message": null,
            "subtests": [
                {
                    "status": "TIMEOUT",
                    "xstatus": "SERVERTIMEOUT"
                }
            ]
        }
    ],
    "meta": {
       "datetime_test_run": "<UTC Date Time>",
       "datetime_observation": "<UTC Date Time>",
        "observation_config": {
             "start_frame_num_tolerance": 0,
              ....
        }
    }
}
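
A minimal sketch of that check in Python, assuming the OF loads the report JSON into a dict before adding its entries; the file name and function name are illustrative:

# Sketch: add the OF entries under "meta" without overwriting anything the
# test runner has already written there (e.g. "datetime_test_run").
import json

def add_observation_meta(report, datetime_observation, observation_config):
    meta = report.setdefault("meta", {})  # create "meta" only if it doesn't exist
    meta["datetime_observation"] = datetime_observation
    meta["observation_config"] = observation_config
    return report

with open("report.json") as f:  # illustrative file name
    report = json.load(f)
add_observation_meta(report, "<UTC Date Time>",
                     {"start_frame_num_tolerance": 0, "mid_frame_num_tolerance": 10})
with open("report.json", "w") as f:
    json.dump(report, f, indent=4)
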
yanj-github commented 2 years ago

Thanks @louaybassbouss, this looks like one meta per session, not per test; is this correct? Also please note that the OF won't fill in "datetime_test_run", it will only add:

"meta": {
       "datetime_observation": "<UTC Date Time>",
        "observation_config": {
             "start_frame_num_tolerance": 0,
              ....
        }
}

when meta is missing. When meta is present, e.g.:

    "meta": {
       "datetime_test_run": "<UTC Date Time>"
    }

the OF will add the following to the "meta":

"datetime_observation": "<UTC Date Time>",
"observation_config": {
     "start_frame_num_tolerance": 0,
      ....
}

louaybassbouss commented 2 years ago

> Thanks @louaybassbouss, this looks like one meta per session, not per test; is this correct? Also please note that the OF won't fill in "datetime_test_run", it will only add:
>
> "meta": {
>        "datetime_observation": "<UTC Date Time>",
>         "observation_config": {
>              "start_frame_num_tolerance": 0,
>               ....
>         }
> }
>
> when meta is missing. When meta is present, e.g.:
>
>     "meta": {
>        "datetime_test_run": "<UTC Date Time>"
>     }
>
> the OF will add the following to the "meta":
>
> "datetime_observation": "<UTC Date Time>",
> "observation_config": {
>      "start_frame_num_tolerance": 0,
>       ....
> }

@yanj-github yes exactly this is correct :)

jpiesing commented 2 years ago

November 9th meeting - @yanj-github believes this is implemented and the data is successfully pushed to the test runner. @louaybassbouss needs to look at how this is displayed in the HTML report.

jpiesing commented 2 years ago

@louaybassbouss ?

louaybassbouss commented 2 years ago

@FritzHeiden is working on this feature.

FritzHeiden commented 2 years ago

Implementation is done. So far, there are two fields that are added by the test runner: date_session_started and date_session_finished

louaybassbouss commented 2 years ago

> Implementation is done. So far, there are two fields that are added by the test runner: date_session_started and date_session_finished

thanks @FritzHeiden. @jpiesing @yanj-github please let us know if it works for you.

gitwjr commented 1 year ago

@jpiesing Check if done.

gitwjr commented 1 year ago

@jpiesing to reread and close if complete.

jpiesing commented 8 months ago

Going back to the original problem statement. If I edit test-config.json to relax one of the criteria, this isn't obvious in the results. For example, I've changed ts_max to 1000 but it doesn't appear here:

"meta": {
    "datetime_observation": "2023-10-17 16:08:28",
    "observation_config": {
        "start_frame_num_tolerance": 0,
        "end_frame_num_tolerance": 0,
        "mid_frame_num_tolerance": 10,
        "splice_start_frame_num_tolerance": 0,
        "splice_end_frame_num_tolerance": 0
    }
}

Someone evaluating the test report has to look in the PASS statements and realise that they say:

Maximum permitted startup delay is 1000ms.

Rather than

Maximum permitted startup delay is 120ms.

yanj-github commented 8 months ago

@jpiesing "observation_config" is there to show OF configurations defined in config.ini file from OF. Observation Framework send "observation_config" together with observation result as test runner does not read config.ini file from OF. For tolerances that are defined in test-config.json can be accessed by test runner so that I dont think it need to be passed from OF. @FritzHeiden and @louaybassbouss?

jpiesing commented 8 months ago

@jpiesing "observation_config" is there to show OF configurations defined in config.ini file from OF. Observation Framework send "observation_config" together with observation result as test runner does not read config.ini file from OF. For tolerances that are defined in test-config.json can be accessed by test runner so that I dont think it need to be passed from OF. @FritzHeiden and @louaybassbouss?

What I care about is the last sentence of the original posting: "When someone submits a test report to a service provider / platform, this should include any non-default values for the configuration." If someone edits test-config.json before they run the OF, the values used should be included in the report.

Yes, the test runner can access a test-config.json, but who knows whether it's accessing the same test-config.json as was used by the OF?
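
A possible way to make that explicit, sketched under the assumption that the tolerances actually used are copied into the report meta by whichever component reads test-config.json; the "test_config" key and the helper name are illustrative and not part of any agreed structure:

# Sketch: copy the tolerances that were actually used from test-config.json into
# the report meta, so an edited value such as ts_max = 1000 is visible to reviewers.
import json

def add_test_config_to_meta(report_path, test_config_path):
    with open(test_config_path) as f:
        test_config = json.load(f)  # the values the run actually used
    with open(report_path) as f:
        report = json.load(f)
    report.setdefault("meta", {})["test_config"] = test_config
    with open(report_path, "w") as f:
        json.dump(report, f, indent=4)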