Booz Allen's lean manufacturing approach for holistically designing, developing, and fielding AI solutions across the engineering lifecycle, from data processing to model building, tuning, and training, to secure operational deployment.
Feature: Improve testing around pipeline filestore configurations #328
Description
Now that the environment variables required to access file stores configured for pipelines are generated automatically by aiSSEMBLE, the unit testing of this behavior needs to be improved. The goal of this ticket is to automate testing around this area of functionality.
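For context, the generated variables follow the `<fileStoreName>_FS_*` naming pattern exercised in the scenarios below. The snippet that follows is a purely illustrative sketch of how pipeline code might consume those variables at runtime; the class name and printout are assumptions for illustration, not part of aiSSEMBLE's generated code.

```java
// Illustrative only: reads the environment variables that aiSSEMBLE generates
// for a file store named "s3TestModelOne" (variable names taken from the
// scenarios below; everything else here is an assumption for illustration).
public final class FileStoreEnvCheck {
    public static void main(String[] args) {
        String provider = System.getenv("s3TestModelOne_FS_PROVIDER");
        String accessKeyId = System.getenv("s3TestModelOne_FS_ACCESS_KEY_ID");
        String secretAccessKey = System.getenv("s3TestModelOne_FS_SECRET_ACCESS_KEY");

        // A real pipeline would hand these to its file store client; here we only
        // confirm that the variables are present in the container environment.
        System.out.printf("provider=%s, accessKeyId set=%b, secret set=%b%n",
                provider, accessKeyId != null, secretAccessKey != null);
    }
}
```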
DOD
Acceptance criteria required to realize the requested feature
[x] Add the following scenarios to the Foundation-MDA test suite
Test Strategy/Script
Confirm these tests pass during the build process
@pipeline-generation @code-generation
Scenario: Pipeline generates file store environment variables necessary for accessing a single configured file store
Given a project named "example"
And a "data-flow" pipeline using "data-delivery-spark" named "SparkPipeline" and a file store named "s3TestModelOne"
When the profile "data-delivery-spark-pipeline" is generated
Then the "s3TestModelOne_FS_PROVIDER", "s3TestModelOne_FS_ACCESS_KEY_ID", and "s3TestModelOne_FS_SECRET_ACCESS_KEY" configurations are generated to access the "s3TestModelOne" file store
@pipeline-generation @code-generation
Scenario: Pipeline does not generate file store environment variables when no file store is configured
Given a project named "example"
And a "data-flow" pipeline using "data-delivery-spark" named "SparkPipeline"
When the profile "data-delivery-spark-pipeline" is generated
Then the "s3TestModelOne_FS_PROVIDER", "s3TestModelOne_FS_ACCESS_KEY_ID", and "s3TestModelOne_FS_SECRET_ACCESS_KEY" configurations are not generated
@pipeline-generation @code-generation
Scenario: Pipeline generates file store environment variables necessary for accessing multiple configured file stores
Given a project named "example"
And a "data-flow" pipeline using "data-delivery-spark" named "SparkPipeline" and two file stores named "s3TestModelOne" and "s3TestModelTwo"
When the profile "data-delivery-spark-pipeline" is generated
Then the "s3TestModelOne_FS_PROVIDER", "s3TestModelOne_FS_ACCESS_KEY_ID", and "s3TestModelOne_FS_SECRET_ACCESS_KEY" configurations are generated to access the "s3TestModelOne" file store
And the "s3TestModelTwo_FS_PROVIDER", "s3TestModelTwo_FS_ACCESS_KEY_ID", and "s3TestModelTwo_FS_SECRET_ACCESS_KEY" configurations are generated to access the "s3TestModelTwo" file store
Final Testing Steps
mvn clean install -pl :foundation-mda
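During that build, the scenarios above run as part of the Foundation-MDA test suite. One common way to wire Gherkin features into `mvn test` is a JUnit-backed Cucumber runner; the sketch below is hypothetical, and the class name, feature path, glue package, and tag expression are assumptions, not the actual foundation-mda runner.

```java
import org.junit.runner.RunWith;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;

// Hypothetical runner: executes the tagged pipeline/code-generation scenarios
// when Surefire runs the module's tests during `mvn clean install`.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/resources/specifications", // assumed feature location
        glue = "org.example.mda.steps",                  // assumed glue package
        tags = "@pipeline-generation and @code-generation")
public class PipelineGenerationTestRunner {
}
```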
CI --> https://jenkins.aiops.boozallencsn.com/job/aissemble-build/1998/