EOEPCA / resource-health


Refine use cases of Resource Health BB #12

Closed: dovydas-an closed this issue 2 weeks ago

dovydas-an commented 2 months ago

Define new and refine existing use case definitions focusing on the ones relevant to workflow and UI design.

dovydas-an commented 1 month ago

Further use case refinement has been performed with APEx and EarthCODE stakeholders (one virtual discussion and additional communication by email).

The following derived (low-level) use cases have been identified that directly relate to the high-level use cases presented at the beginning of the project; these in turn provided further clarification of the Resource Health BB requirements (see https://github.com/orgs/EOEPCA/projects/13/views/1?pane=issue&itemId=74625038).

dovydas-an commented 1 month ago

Just to reduce repetition, I am rewriting the list of use cases previously posted:

  • UC68: As a user, I want automated checks to verify the continued correct end-to-end functioning of my published workflows, so that I can receive a notification when there is a problem. (An illustrative check sketch is given after this list.)

    • Automated, regularly running check (at most monthly) to verify that hosted algorithms are visible/callable (including cases where the algorithm ought to be callable under specific conditions, such as random spatial/temporal locations). If the check fails, a notification shall be issued.
    • Check issued upon user request to verify that hosted algorithms are visible/callable (including cases where the algorithm ought to be callable under specific conditions, such as random spatial/temporal locations). If the check fails, a notification shall be issued.
    • Automated, regularly running check (at most monthly) to verify that hosted algorithms satisfy certain conventions (e.g. publish a specification that satisfies a set of linting rules). If the check fails, a notification shall be issued.
    • Check issued upon user request to verify that hosted algorithms satisfy certain conventions (e.g. publish a specification that satisfies a set of linting rules). If the check fails, a notification shall be issued.
    • Automated, regularly running check (at most monthly) to verify that the hosted environment satisfies certain conventions (e.g. publishes a specification that satisfies a set of linting rules). If the check fails, a notification shall be issued.
    • Check issued upon user request to verify that the hosted environment service satisfies certain conventions (e.g. publishes a specification that satisfies a set of linting rules). If the check fails, a notification shall be issued.
    • Automated, regularly running check (at most monthly) to verify that third-party backends successfully complete certain typical test cases (e.g. accessing data of a given size). If the check fails, a notification shall be issued.
    • Check issued upon user request to verify that third-party backends successfully complete certain typical test cases (e.g. accessing data of a given size). If the check fails, a notification shall be issued.
    • Automated, regularly running check (at most monthly) to verify that third-party backends comply with API specifications and profiles (for example, the openEO API profile). If the check fails, a notification shall be issued.
    • Check issued upon user request to verify that third-party backends comply with API specifications and profiles (for example, the openEO API profile). If the check fails, a notification shall be issued.
  • As a user, I want automated checks to verify the continued availability of my published datasets, so that I can receive a notification when there is a problem. Availability of the dataset refers to discovery in the Catalogue and access via Data Retrieval and Visualisation services. (A second sketch after this list illustrates these checks.)

    • Automated, regularly running check (at most monthly) to verify that the published dataset is discoverable in the Resource Catalogue (Resource Discovery BB). If the check fails, a notification shall be issued.
    • Check issued upon user request to verify that the published dataset is discoverable in the Resource Catalogue (Resource Discovery BB). If the check fails, a notification shall be issued.
    • Automated, regularly running check (at most monthly) to verify that the published dataset is accessible via Data Retrieval and Visualisation (Data Access BB). If the check fails, a notification shall be issued.
    • Check issued upon user request to verify that the published dataset is accessible via Data Retrieval and Visualisation (Data Access BB). If the check fails, a notification shall be issued.
  • As a user, I want KPIs to be recorded for my published service, in order to demonstrate that I have met the SLA of my NoR offering. In this context, a published service includes published datasets, workflows, ML models, Dashboards, etc. (A third sketch after this list illustrates a KPI check.)

    • Automated, regularly running check (at most monthly) to verify that KPIs for published EO data have been met.
    • Check issued upon user request to verify that KPIs for published EO data have been met.
    • Automated, regularly running check (at most monthly) to verify that KPIs for hosted algorithms have been met.
    • Check issued upon user request to verify that KPIs for hosted algorithms have been met.
    • Automated, regularly running check (at most monthly) to verify that KPIs for hosting environments have been met.
    • Check issued upon user request to verify that KPIs for hosting environments have been met.
    • Automated, regularly running check (at most monthly) to verify that KPIs for third-party backends have been met.
    • Check issued upon user request to verify that KPIs for third-party backends have been met.
    • Automated, regularly running check (at most monthly) to verify that KPIs for ML models have been met.
    • Check issued upon user request to verify that KPIs for ML models have been met.
    • Automated, regularly running check (at most monthly) to verify that KPIs for Dashboards have been met.
    • Check issued upon user request to verify that KPIs for Dashboards have been met.
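
To make the UC68 "visible/callable" checks concrete, the following is a minimal sketch, assuming the hosted algorithm is exposed through an OGC API Processes style interface. The endpoint URL, process id, test input and notify() helper are hypothetical placeholders, not part of the agreed Resource Health BB design.

```python
# Minimal sketch of the UC68 checks (hypothetical endpoint and process id).
import requests

PROCESSES_API = "https://processing.example.org/ogc-api"  # hypothetical Processing BB endpoint
PROCESS_ID = "my-published-workflow"                       # hypothetical hosted algorithm id


def notify(message: str) -> None:
    # Placeholder: the actual notification channel is a Resource Health BB design decision.
    print(f"[NOTIFICATION] {message}")


def check_visible(process_id: str) -> bool:
    """Is the hosted algorithm advertised by the processing service?"""
    resp = requests.get(f"{PROCESSES_API}/processes/{process_id}", timeout=30)
    return resp.status_code == 200


def check_callable(process_id: str, inputs: dict) -> bool:
    """Can the hosted algorithm be executed, e.g. on a random spatial/temporal location?"""
    resp = requests.post(
        f"{PROCESSES_API}/processes/{process_id}/execution",
        json={"inputs": inputs},
        timeout=300,
    )
    # OGC API Processes: 200 = synchronous result, 201 = asynchronous job accepted.
    return resp.status_code in (200, 201)


if __name__ == "__main__":
    if not check_visible(PROCESS_ID):
        notify(f"Process {PROCESS_ID} is not visible on the processing service")
    elif not check_callable(PROCESS_ID, {"bbox": [11.0, 46.0, 11.5, 46.5]}):  # hypothetical test input
        notify(f"Process {PROCESS_ID} is visible but could not be executed")
```

The same check body could be driven either by a scheduler (the automated, at most monthly variant) or by an explicit user request, which is the distinction drawn by the paired bullets above.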
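
A similar minimal sketch for the dataset availability checks, assuming the Resource Catalogue (Resource Discovery BB) exposes a STAC/OGC API Records style search endpoint and the Data Access BB serves the dataset over HTTP; all URLs and identifiers below are hypothetical.

```python
# Minimal sketch of the dataset availability checks (hypothetical URLs and ids).
import requests

CATALOGUE_SEARCH_URL = "https://catalogue.example.org/search"          # hypothetical Resource Discovery BB
DATA_ACCESS_URL = "https://data.example.org/collections/{dataset_id}"  # hypothetical Data Access BB


def notify(message: str) -> None:
    # Placeholder for the Resource Health BB notification mechanism.
    print(f"[NOTIFICATION] {message}")


def check_discoverable(dataset_id: str) -> bool:
    """Published dataset is discoverable in the Resource Catalogue."""
    resp = requests.get(CATALOGUE_SEARCH_URL, params={"ids": dataset_id}, timeout=30)
    if resp.status_code != 200:
        return False
    features = resp.json().get("features", [])
    return any(feature.get("id") == dataset_id for feature in features)


def check_accessible(dataset_id: str) -> bool:
    """Published dataset is accessible via Data Retrieval and Visualisation."""
    resp = requests.get(DATA_ACCESS_URL.format(dataset_id=dataset_id), timeout=30)
    return resp.status_code == 200


if __name__ == "__main__":
    dataset = "my-published-dataset"  # hypothetical dataset identifier
    if not check_discoverable(dataset):
        notify(f"Dataset {dataset} is not discoverable in the Resource Catalogue")
    if not check_accessible(dataset):
        notify(f"Dataset {dataset} is not accessible via the Data Access services")
```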
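
Finally, a minimal sketch of a KPI check: it derives a monthly availability figure from stored check results and compares it against an SLA target. The storage format, the 99.0% target and the use of pass rate as the availability KPI are illustrative assumptions only.

```python
# Minimal sketch of a monthly KPI check against an SLA target (illustrative assumptions).
from dataclasses import dataclass
from datetime import datetime


@dataclass
class CheckResult:
    timestamp: datetime
    passed: bool


def monthly_availability(results: list[CheckResult], year: int, month: int) -> float:
    """Availability KPI: percentage of checks that passed in the given month."""
    in_month = [r for r in results if r.timestamp.year == year and r.timestamp.month == month]
    if not in_month:
        return 0.0
    return 100.0 * sum(r.passed for r in in_month) / len(in_month)


def kpi_met(results: list[CheckResult], year: int, month: int, sla_target: float = 99.0) -> bool:
    """True if the monthly availability meets the (assumed) SLA target."""
    return monthly_availability(results, year, month) >= sla_target


if __name__ == "__main__":
    history = [
        CheckResult(datetime(2024, 7, 1, 6, 0), True),
        CheckResult(datetime(2024, 7, 15, 6, 0), True),
        CheckResult(datetime(2024, 7, 29, 6, 0), False),
    ]
    print(kpi_met(history, 2024, 7))  # False: 66.7% < 99.0%
```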


dovydas-an commented 3 weeks ago

A couple of distinct use cases were identified and added following yesterday's discussion with the other BB developers (Processing, Resource Discovery and Data Access); the meeting was also attended by the APEx and EarthCODE stakeholders.