Open nathanagez opened 3 years ago
Jest is used by Amplify-CLI, so I'm all for it.
Can we also make sure documentation regarding running, writing tests is included as a task? We want to make it easy for folks to use/write new tests.
Absolutely, the ticket has been updated!
@smp @wizage after some research, the easiest way to make this plugin (and the others) testable is to create a headless mode, as amplify-cli does. It will allow us to auto-answer the questions instead of just mocking `serviceQuestions` or `addResource`.
Following this approach we will be able to deploy the permutations we want, exactly the same way a user would. The new testing workflow will look like this:

`amplify init`

`amplify add video --payload $OUR_PAYLOAD_WITH_PERMUTATIONS` using headless mode (we need to clarify what kind of parameters we want to use and how to parse them, to make something generic)

a. Loop over each infrastructure permutation thanks to the headless mode
b. Execute `amplify push` to deploy the CloudFormation template
c. Launch the test suites using `npm run tests`
d. Test if the resource has been added to `backend-config.json`
e. Test if the generated CloudFormation template is valid
f. Test if the generated stack is deployed without error
To create this headless mode, I suggest using arguments and parsing them by adding the following to the question dictionaries currently passed to inquirer:

```javascript
const nameProject = [
  {
    type: inputs[0].type,
    name: inputs[0].key,
    message: inputs[0].question,
    validate: amplify.inputValidation(inputs[0]),
    default: defaults.resourceName,
    // I added the following
    when(answers) {
      // context is the object returned by amplify-cli
      if (context.parameters.options.name) {
        console.log('Using option from arguments or config');
        answers.name = context.parameters.options.name;
        return false; // the answer is already set, skip the prompt
      }
      // Prompt for input
      return true;
    },
  },
];
```
So, if we run `amplify video add --name test`, the question will automatically take the value of our argument, and so on.
This makes sense to me @nathanagez. I assume this will work with defaults as does amplify-cli and it will not require additional maintenance if folks add new prompts to questions.json?
I'll let @wizage comment on prompt/payload, but my gut says we'll just keep it simple with one payload for the entire Video resource config as there isn't a lot of upside in breaking it down by type (live/vod)
@smp with what is already implemented, if folks add new prompts to `{}-questions.json`, they will also have to edit the service-walkthroughs file related to those questions and add the new prompts to it too. But yes, if they don't specify optional parameters, we will use the default values from the `{}-questions.json` file.
What we can do is edit CONTRIBUTING.md to specify that if they edit the service-walkthroughs, they must call the `when()` method with the utility I will build to parse the payload.
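That utility could look something like the sketch below. The name `answerFromPayload` and the exact context shape are assumptions (they mirror the `when()` snippet above), not the final implementation:

```javascript
// Hypothetical helper for service-walkthroughs: reuse a value from the
// headless payload / CLI options instead of prompting for it.
// The context.parameters.options shape is an assumption.
function answerFromPayload(context, key) {
  // Returns an inquirer-style when() function for the given question key.
  return function when(answers) {
    const options = (context.parameters && context.parameters.options) || {};
    if (options[key] !== undefined) {
      answers[key] = options[key]; // take the value from the payload
      return false;                // skip the interactive prompt
    }
    return true;                   // no value supplied, ask the user
  };
}

// Usage sketch: { ...question, when: answerFromPayload(context, 'resourceName') }
const ctx = { parameters: { options: { resourceName: 'test' } } };
const answers = {};
const shouldPrompt = answerFromPayload(ctx, 'resourceName')(answers);
console.log(shouldPrompt, answers.resourceName); // false test
```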
This snippet of code will automatically build IVS files for an amplify project:

```bash
#!/bin/bash
set -e
IFS='|'
CONFIG="{\
\"service\":\"video\",\
\"serviceType\":\"ivs\",\
\"providerName\":\"awscloudformation\",\
\"resourceName\":\"test\"\
}"
# "resourceName" is optional.
# ... -> add optional parameters depending on the permutations we want to build
#        (I am using the same key names as the {}-questions.json file)
amplify video add --payload $CONFIG
```
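The same payload could also be built and sanity-checked programmatically before shelling out, rather than hand-escaping JSON in bash. This is only a sketch; the required keys are taken from the snippet above, and the validation logic is an assumption:

```javascript
// Build the headless payload programmatically instead of escaping JSON by hand.
// Keys mirror the bash snippet above; optional permutation keys would be merged in.
const payload = {
  service: 'video',
  serviceType: 'ivs',
  providerName: 'awscloudformation',
  resourceName: 'test', // optional
};

// Round-trip through JSON to make sure the payload the CLI receives is valid.
const serialized = JSON.stringify(payload);
const parsed = JSON.parse(serialized);
const required = ['service', 'serviceType', 'providerName'];
const missing = required.filter((k) => parsed[k] === undefined);
console.log(missing.length === 0 ? 'payload ok' : `missing: ${missing}`);
// The command would then be: amplify video add --payload "$serialized"
```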
I am thinking about a permutation generator based on `{}-questions.json` that would output shell scripts to deploy resources during the test run, to avoid writing each permutation manually.
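A rough sketch of such a generator could take each question's possible values and emit every combination as a headless payload. The input shape and key names below are assumptions for illustration, not the actual `{}-questions.json` schema:

```javascript
// Sketch of a permutation generator: expand { key: [values...] } into every
// combination, then render each combination as a headless CLI invocation.
function permutations(options) {
  return Object.entries(options).reduce(
    (acc, [key, values]) =>
      acc.flatMap((combo) => values.map((v) => ({ ...combo, [key]: v }))),
    [{}],
  );
}

const combos = permutations({
  serviceType: ['ivs', 'vod'],
  resourceName: ['test'],
});

// Each combination becomes one line of a generated deploy script:
const lines = combos.map(
  (c) => `amplify video add --payload '${JSON.stringify(c)}'`,
);
console.log(combos.length); // 2
```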
Is your feature request related to a problem? Please describe.
Integration tests should validate that the plugin produces cloud resources that successfully deploy and operate within the context of an Amplify project. This ticket is related to the [RFC] Amplify Video Test Suite opened here
Describe the solution you'd like
With this testing suite we want to be able to validate that the plugin produces the right CloudFormation templates and that they are successfully deployed in the cloud.
Describe alternatives you've considered
Testing workflow description:

`npm run tests`

a. Loop over each infrastructure permutation
b. Load the configuration for the infrastructure permutation
c. Execute `providerController.addResource` using the previously loaded config
d. Test if the resource has been added to `backend-config.json`
e. Test if the generated CloudFormation template is valid
f. Execute `amplify push` to deploy the CloudFormation template
g. Test if the generated stack is deployed without error

Here is a list of tasks that have to be done to implement the previous workflow: