opendatahub-io / ai-edge

ODH integration with AI at the Edge use cases
Apache License 2.0
8 stars 17 forks

RHOAIENG-6869: Support running tests based on values given to the e2e tests #266

Closed jackdelahunt closed 1 month ago

jackdelahunt commented 3 months ago

Description

This PR aims to make the Go e2e tests more configurable and more usable for cases where you want to run tests but don't have all the requirements for every type of test.

Values are passed to the e2e tests using a config.json file. Based on this config, the e2e tests may or may not run certain tests; this is controlled by the enabled field in each sub-object of the config.

{
  "namespace": "",
  "image_registry_username": "",
  "image_registry_password": "",
  "target_image_tags": [],

  "git_fetch": {
    "enabled": false,
    "container_file_repo": "",
    "container_file_revision": "",
    "container_relative_path": "",
    "model_repo": "",
    "model_relative_path": "",
    "model_revision": "",
    "model_dir": "",
    "self_signed_cert": ""
  },

  "s3_fetch": {
    "enabled": false,
    "aws_secret": "",
    "aws_access": "",
    "region": "",
    "endpoint": "",
    "bucket_name": "",
    "self_signed_cert": ""
  },

  "gitops": {
    "enabled": false,
    "token": "",
    "username": "",
    "repo": "",
    "api_server": "",
    "branch": ""
  }
}

For example, if you only want to run the main MLOps pipeline, you don't need to provide the gitops values. By setting the enabled field to false you can stop the gitops pipeline tests from running.

The config.json is structured so that all non-specific values sit at the root (namespace etc.) and all optional, test-specific values sit in their own sub-object, here labelled s3_fetch and git_fetch as they are the two options for fetching.
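For illustration, a minimal sketch of how a test might consume this config and skip a disabled suite (the type names, the field subset shown, and the skip mechanism are assumptions for the example, not necessarily the exact implementation in this PR):

```go
// Sketch only: load config.json and skip a suite when its "enabled" flag is false.
package e2e

import (
	"encoding/json"
	"os"
	"testing"
)

// GitOpsConfig mirrors the "gitops" sub-object of config.json (illustrative).
type GitOpsConfig struct {
	Enabled   bool   `json:"enabled"`
	Token     string `json:"token"`
	Username  string `json:"username"`
	Repo      string `json:"repo"`
	APIServer string `json:"api_server"`
	Branch    string `json:"branch"`
}

// Config mirrors the root of config.json (only a subset shown here).
type Config struct {
	Namespace string       `json:"namespace"`
	GitOps    GitOpsConfig `json:"gitops"`
}

func loadConfig(t *testing.T, path string) Config {
	t.Helper()
	data, err := os.ReadFile(path)
	if err != nil {
		t.Fatalf("reading config: %v", err)
	}
	var cfg Config
	if err := json.Unmarshal(data, &cfg); err != nil {
		t.Fatalf("parsing config: %v", err)
	}
	return cfg
}

func TestGitOpsPipeline(t *testing.T) {
	cfg := loadConfig(t, "config.json")
	if !cfg.GitOps.Enabled {
		t.Skip("gitops tests disabled in config.json")
	}
	// ... run the gitops pipeline checks using cfg.GitOps ...
}
```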

How Has This Been Tested?

- Update the config.json file in the e2e test folder with the values required; the same values used before should work as normal
- Run `make go-test`
- All tests should pass
- Remove the values for each fetch or the gitops fields and see how the e2e tests change their behaviour
- All tests should pass again

Merge criteria:

- [x] The commits are squashed in a cohesive manner and have meaningful messages.
- [x] Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
- [x] The developer has manually tested the changes and verified that the changes work

jackdelahunt commented 3 months ago

I requested a review but still need to update the docs and make the change I mentioned above. I'm waiting to do that as I want to get an opinion on the work so far.

openshift-ci-robot commented 3 months ago

@jackdelahunt: This pull request references RHOAIENG-6869 which is a valid jira issue.

Warning: The referenced jira issue has an invalid target version for the target branch this PR targets: expected the story to target the "4.16.0" version, but no target version was set.

jackdelahunt commented 3 months ago

@grdryn

Is it still possible to run specific tests (or sets of tests) without taking stuff out of the config file?

Maybe just passing the types of tests you want to run through an env variable.

TESTS=gitops,s3-fetch make go-test

Or define what is run in the config file as well; not exactly sure how that would work though.

JSON rather than YAML (or something else)

Go has built-in support for JSON and YAML so I just picked one.

CAPS_LOCK for the keys

This was a carry-over from when they were passed as env variables. But yeah, they can change.

grdryn commented 3 months ago

Is it still possible to run specific tests (or sets of tests) without taking stuff out of the config file?

Maybe just passing the types of tests you want to run through an env variable.

TESTS=gitops,s3-fetch make go-test

Using tags would be one possibility: https://mickey.dev/posts/go-build-tags-testing/

Another would be to have some sort of enabled field in the config, which could be set to false (so that I can just set that, rather than take out the entire gitops block, if I don't want to run the gitops-related tests)
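For reference, a build-tag-gated test file could look something like the sketch below (illustrative only; the tag name and test body are assumptions, and the PR ultimately uses the enabled field in config.json instead of tags):

```go
//go:build gitops

// Sketch of the build-tag approach: this file is only compiled when the
// "gitops" tag is passed, e.g.
//
//	go test -tags=gitops ./test/e2e-tests/...
package e2e

import "testing"

func TestGitOpsPipelineTagged(t *testing.T) {
	// gitops-specific e2e checks would go here
	t.Log("running gitops suite (selected via -tags=gitops)")
}
```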

jackdelahunt commented 3 months ago

Using tags would be one possibility: https://mickey.dev/posts/go-build-tags-testing/

I have used these before and they were a hassle to deal with, but maybe if I were to look into them again they might work well here. Separate test files have a build flag behind them. How would you pass this to the compiler from make in a way that isn't a pain?

Another would be to have some sort of enabled field in the config, which could be set to false

This is for sure something I was thinking about; in the config types I even have these already, it's just that they are inferred from the file instead of being set by it. So I think that could be okay.

Something like this is what I guess you are saying, and it would be on each sub-object in the config file, right?

"git-fetch": {
    "enabled": true,
    ...
  },
grdryn commented 3 months ago

How would you pass this to the compiler from make, in a way that isn't a pain

We don't necessarily need to require the tests be run by make, right? I'd keep the default invocation that CI would use to run all tests in the Makefile, but there's a bunch of stuff that go test can do that we might often want to do without having make cover every possibility. Pointing people to run go test for non-default cases should be fine (and probably IDEs can be configured to do that in different configurations)

jackdelahunt commented 3 months ago

Pointing people to run go test for non-default cases should be fine

Yeah, I probably agree; I just thought we were trying to do everything through the Makefile.

About using build flags: I feel like relying on compiler flags is something we wouldn't want, but now that we are saying calling go test directly is expected, maybe it would work.

openshift-ci-robot commented 3 months ago

@jackdelahunt: This pull request references RHOAIENG-6869 which is a valid jira issue.

Warning: The referenced jira issue has an invalid target version for the target branch this PR targets: expected the story to target the "4.17.0" version, but no target version was set.

MarianMacik commented 3 months ago

Separate test files have a build flag behind them. How would you pass this to the compiler from make in a way that isn't a pain?

@jackdelahunt what do you mean by that?

This would be just passing the tags variable content or similar to the go test command.

jackdelahunt commented 3 months ago

@MarianMacik

Why not have simple switches for meaningful groups of tests (s3, git, gitops...) as Makefile variables?

I guess this could work. I was thinking that we would then also need to do the validation in make, but from what you said, the idea is to pass all the variables in and run setup based on other variables set in make, like TEST_GIT_FETCH=true.

I would prefer to provide test parameters using the command line rather than using a file

My idea with this was to be able to save a configuration, or set up multiple configurations to be run. For example, keep the same config but change the bucket and git repo and use self-signed certs. But maybe this just isn't a use case.

Another reason for using a config is that it is clearer which variables are used where. I can see a case where a person gets confused and tries to give values for fields that won't be used. This could probably be helped by good docs and clear naming as well, of course.

MarianMacik commented 3 months ago

@MarianMacik

Why not have simple switches for meaningful groups of tests (s3, git, gitops...) as Makefile variables?

I guess this could work. I was thinking that we would then also need to do the validation in make, but from what you said, the idea is to pass all the variables in and run setup based on other variables set in make, like TEST_GIT_FETCH=true.

Validation can be done in Go later. I think it would be clearer to validate everything in Go rather than in the Makefile with complicated branching. If we had an expensive setup done before running Go, then it would make sense to have it in the Makefile to fail fast, but if we delegate this to Go, it won't be seconds/minutes later that we find out some parameter is not filled.

I would prefer to provide test parameters using the command line rather than using a file

My idea with this was to be able to save a configuration, or set up multiple configurations to be run. For example, keep the same config but change the bucket and git repo and use self-signed certs. But maybe this just isn't a use case.

Another reason for using a config is that it is clearer which variables are used where. I can see a case where a person gets confused and tries to give values for fields that won't be used. This could probably be helped by good docs and clear naming as well, of course.

Makes sense then. However, I would like to retain the possibility to pass variables from the command line, i.e. not having to use the config file every time. But that may be just my preference. It's true that for CI it may be possible to have a predefined config file and then we can just reference it as an argument/using a Makefile variable. This way we won't need to run sed every time to overwrite the default config, for example.

jackdelahunt commented 3 months ago

Validation can be done in Go later. I think it would be clearer to validate everything in Go rather than in the Makefile

Okay I will work on getting that done then 👍

It's true that for CI it may be possible to have a predefined config file and then we can just reference it as an argument/using a Makefile variable

That's true, but if every input is through an env variable then we could just load a .env file as our config file in the CI if we wanted.

jackdelahunt commented 3 months ago

Also, just as another question: when we are selecting which tests to run (git fetch, gitops etc.), which way is nicer for setting those flags?

  1. Set a single test type either on or off: TEST_GIT_FETCH=true, TEST_GIT_FETCH=false
  2. A list of tests with names we define and check: TESTS=gitops,fetch_s3,fetch_git
CFSNM commented 2 months ago

Also, just as another question: when we are selecting which tests to run (git fetch, gitops etc.), which way is nicer for setting those flags?

1. Set a single test type either on or off: `TEST_GIT_FETCH=true`, `TEST_GIT_FETCH=false`

2. A list of tests with names we define and check: `TESTS=gitops,fetch_s3,fetch_git`

I would go with option 1. Option 2 is a bit more elegant imo, but having one flag per test-suite is easier for the user, so you only need to enable/disable the flags related to the test-suites you want to run.
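As a sketch of what option 1 could look like on the Go side (the variable names here are hypothetical), each suite would check its own switch and skip itself when the switch is not set:

```go
// Sketch of option 1: one boolean switch per test suite, read from the
// environment. The variable names (TEST_GIT_FETCH, ...) are hypothetical.
package e2e

import (
	"os"
	"strconv"
	"testing"
)

// suiteEnabled treats an unset or unparsable variable as "disabled".
func suiteEnabled(name string) bool {
	enabled, err := strconv.ParseBool(os.Getenv(name))
	return err == nil && enabled
}

func TestGitFetchPipeline(t *testing.T) {
	if !suiteEnabled("TEST_GIT_FETCH") {
		t.Skip("set TEST_GIT_FETCH=true to run the git-fetch suite")
	}
	// ... git-fetch pipeline checks ...
}
```

Invocation would then look like `TEST_GIT_FETCH=true make go-test`, with make passing the variable through to go test.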

jackdelahunt commented 2 months ago

@MarianMacik just updated the PR to load values from the environment instead of a config file; here are my thoughts so far after using it a bit.

  1. With setup moved to Go there are now more env variables required. With the current changes that is 26 different environment variables, with more coming. Personally this is way too many env variables to set from the command line or your IDE config. It is painful to set this up, and I am the one developing it, so I am going to have more patience than most.

  2. With all of these options, and with all of them needing to be input exactly as expected, there are way too many opportunities to make mistakes. When just trying to test the git fetch step I needed to re-run the tests ~5 times just because I was getting things wrong.

  3. This is also made worse by the fact that the implementation of this is itself painful and tedious, which also leads to mistakes.

  4. I think passing values via the command line can be good, but in this case I think it is a mess. The alternatives may not be better either, but I just want to say I don't think this is a great experience.

MarianMacik commented 2 months ago

Validation can be done in Go later. I think it would be clearer to validate everything in Go rather than in the Makefile

Okay I will work on getting that done then 👍

It's true that for CI it may be possible to have a predefined config file and then we can just reference it as an argument/using a Makefile variable

That's true, but if every input is through an env variable then we could just load a .env file as our config file in the CI if we wanted.

I didn't say it would be through env variables, but through Makefile variables. It's true that these would then need to be translated into env vars for go test or into Cobra flags.

@MarianMacik just updated the PR to load values from the environment instead of a config file; here are my thoughts so far after using it a bit.

1. With setup moved to Go there are now more env variables required. With the current changes that is 26 different environment variables, with more coming. Personally this is way too many env variables to set from the command line or your IDE config. It is painful to set this up, and I am the one developing it, so I am going to have more patience than most.

Why are more env variables required when moving setup to Go? These variables/params will need to be somewhere anyway because the test suite simply needs this data.

2. With all of these options, and with all of them needing to be input exactly as expected, there are way too many opportunities to make mistakes. When just trying to test the git fetch step I needed to re-run the tests ~5 times just because I was getting things wrong.

I understand, though we don't need to pass every variable every time if we configure some defaults in the test suite.

3. This is also made worse by the fact that the implementation of this is itself painful and tedious, which also leads to mistakes.

4. I think passing values via the command line can be good, but in this case I think it is a mess. The alternatives may not be better either, but I just want to say I don't think this is a great experience.

In a previous project, the Kogito operator, we had Makefile variables which were translated by the Makefile itself into flags prefixed with --. After that they were passed to a script, which just "filtered" these parameters into framework-specific parameters prefixed with --godog. and test parameters prefixed with --tests.. The script was quite big because of the many parameters, but it worked. After that, these parameters were passed to the go test command and were further handled by Cobra. I am not saying we should do it like that; we can just take inspiration from it, e.g. use Cobra as well.

Also, what did you mean by it being messy when passing values from the command line and not messy when using a config file? Did you mean that the config file is automatically mapped to a Go struct without having to query environment variables one by one? For command line arguments this can be solved by using Cobra, which will automatically parse and map parameters to the respective Go variables, for example.

I am not against the config file, just trying to already think about how it will be used by users. If it is run locally, then a user will override the config file with custom values or use a custom config file. For CI we would either need to override the config file using sed or provide a specific CI config file, right?
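To make the command-line option concrete, here is a minimal sketch using the standard library flag package as a simpler stand-in for the Cobra-based setup described above (the flag names are hypothetical); custom flags would be passed to the test binary after -args:

```go
// Sketch: custom parameters for go test via the standard flag package.
// Example invocation (flag names are assumptions):
//
//	go test ./test/e2e-tests/... -args -gitops -aws-secret=xyz
package e2e

import (
	"flag"
	"os"
	"testing"
)

var (
	gitopsEnabled = flag.Bool("gitops", false, "run the gitops e2e suite")
	awsSecret     = flag.String("aws-secret", "", "AWS secret key for the s3-fetch suite")
)

func TestMain(m *testing.M) {
	flag.Parse() // parses the arguments forwarded to the test binary
	os.Exit(m.Run())
}

func TestGitOpsSuite(t *testing.T) {
	if !*gitopsEnabled {
		t.Skip("pass -gitops to run the gitops suite")
	}
	_ = *awsSecret // credentials would be consumed by the s3-fetch setup
	// ...
}
```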

jackdelahunt commented 2 months ago

Why are more env variables required when moving setup to go? These variables/params will need to be somewhere anyway because the test suite simply needs this data.

Sorry, I didn't actually mean there are more because of Go, just that I added ones that were missing before, which makes it much more configurable than it was.

I understand, though we don't need to pass every variable every time if we configure some defaults in the test suite.

This is true, but I think many of these defaults are going to need to be set to whatever the user needs more often than not. So many of the options, like names, credentials and build tags, are specific to the person running it, meaning the defaults won't cover many use cases. Of course, setting them to something the pr-check will use means it will be easier for that at least, and you still have some defaults there.

In the previous project Kogito operator we had Makefile variables which were then translated using Makefile itself to flags with prefix --. After that they were passed to a script, which just "filtered" these parameters to framework-specific parameters prefixed with --godog.

I think I understand what's happening, though I'm not certain, but I do get the next section.

use Cobra as well

Would go test be needed then? Could you run tests based on a given command? This would wrap the choice of test to run and its specific options into one solution, but then you might imagine we are creating the ai-edge CLI 2.0, so maybe it is too much.

ai-edge-e2e test mlops --fetch-model=s3 --aws-secret=...

Also, what did you mean by it being messy when passing values from the command line and not messy when using a config file

I meant a few things:

  1. It's messy as a developer, as you don't get to use the nice encoding packages from Go.
  2. It is messy as a user because with a config file I have an example showing the structure, naming and options in one file that is also used in the process; just looking at the config file (which you will do anyway) gives the info you need, compared to a naming-convention-only structure that has to be documented somewhere else. My guess is, though, that those Makefile flags may have autocomplete as well, so maybe it's not so different.
  3. It means there is no way to permanently save a particular config, and if you do want to retry you need to either go back through your command history or save the values manually.
  4. Passing raw credentials as flags/env vars is, I think, a security issue, but maybe it's actually okay; I am not sure, but it was a thought.

I am not against the config file, just trying to already think about how it will be used by users.

There are definitely pros and cons; I am not sure if a config file is the best idea either. Right now I just feel the current solution of passing env vars is not my favourite, but maybe some of the others mentioned could work instead of either.

If it is run locally, then a user will override the config file with custom values or use a custom config file. For CI we would either need to override the config file using sed or provide a specific CI config file, right?

Yes, that was my idea; for the CI, using sed or a custom config can both work, so either way is fine. For local use, my idea was that when you clone the repo you get the config template and then copy it (because of gitconfig and maybe leaking secrets) to create your own with all your needed fields filled in.

MarianMacik commented 2 months ago

use Cobra as well

Would go test be needed then? Could you run tests based on a given command? This would wrap the choice of test to run and its specific options into one solution, but then you might imagine we are creating the ai-edge CLI 2.0, so maybe it is too much.

ai-edge-e2e test mlops --fetch-model=s3 --aws-secret=...

go test can work with Cobra flags, the same approach was used with the Kogito operator I posted.

Also, what did you mean by it being messy when passing values from the command line and not messy when using a config file

I meant a few things:

1. It's messy as a developer, as you don't get to use the nice encoding packages from Go.

2. It is messy as a user because with a config file I have an example showing the structure, naming and options in one file that is also used in the process; just looking at the config file (which you will do anyway) gives the info you need, compared to a naming-convention-only structure that has to be documented somewhere else. My guess is, though, that those Makefile flags may have autocomplete as well, so maybe it's not so different.

3. It means there is no way to permanently save a particular config, and if you do want to retry you need to either go back through your command history or save the values manually.

4. Passing raw credentials as flags/env vars is, I think, a security issue, but maybe it's actually okay; I am not sure, but it was a thought.

I am not against the config file, just trying to already think about how it will be used by users.

There are definitely pros and cons; I am not sure if a config file is the best idea either. Right now I just feel the current solution of passing env vars is not my favourite, but maybe some of the others mentioned could work instead of either.

If it is run locally, then a user will override the config file with custom values or use a custom config file. For CI we would either need to override the config file using sed or provide a specific CI config file, right?

Yes, that was my idea; for the CI, using sed or a custom config can both work, so either way is fine. For local use, my idea was that when you clone the repo you get the config template and then copy it (because of gitconfig and maybe leaking secrets) to create your own with all your needed fields filled in.

What did you mean by "gitconfig" here?

OK, I think we can go with the config file then. We just need a way to reference a custom config file, either through an environment variable or using a Cobra flag (maybe too complex for one parameter; it would also require using TestMain instead of the normal test approach).
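A rough sketch of that TestMain approach, with the config file path taken from an environment variable (the variable name E2E_CONFIG and the fallback path are assumptions for illustration):

```go
// Sketch: TestMain loads the e2e config once before any test runs.
package e2e

import (
	"encoding/json"
	"fmt"
	"os"
	"testing"
)

// cfg holds the parsed configuration, shared by every test in the package.
var cfg map[string]any

func TestMain(m *testing.M) {
	path := os.Getenv("E2E_CONFIG") // hypothetical variable name
	if path == "" {
		path = "config.json" // fall back to the file next to the tests
	}
	data, err := os.ReadFile(path)
	if err == nil {
		err = json.Unmarshal(data, &cfg)
	}
	if err != nil {
		fmt.Fprintf(os.Stderr, "loading e2e config %q: %v\n", path, err)
		os.Exit(1)
	}
	os.Exit(m.Run())
}
```

A custom file could then be used with something like `E2E_CONFIG=/path/to/my-config.json go test ./test/e2e-tests/...`, while CI keeps the default.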

jackdelahunt commented 2 months ago

What did you mean by "gitconfig" here?

Sorry, wrong one, I meant .gitignore. It was just a comment on the fact that if we want to have an up-to-date config example we should track it, but of course we don't want people to use that file directly in case they accidentally commit secrets.

it would also require using TestMain instead of the normal test approach

Yep, that makes sense; it also looks almost exactly the same as the init function, just without the testing info passed in. I guess it is better for that, but I don't know why they both exist. If I dug deeper, maybe the scope at which each is called once is different...

https://github.com/opendatahub-io/ai-edge/blob/main/test/e2e-tests/tests/pipelines_test.go#L61-L63

LaVLaS commented 2 months ago

/lgtm

After resolving issues with my config.json, this worked for me with a fresh namespace.

One major issue is that this does not test against the actual test run created by this script, so any failed PipelineRuns that already exist cause the go test run to fail. This isn't a blocker for this implementation since we can follow up when this repo is refactored.

jackdelahunt commented 1 month ago

One major issue is that this does not test against the actual test run created by this script, so any failed PipelineRuns that already exist cause the go test run to fail. This isn't a blocker for this implementation since we can follow up when this repo is refactored.

Okay, I will make a follow-up for this; I thought I had fixed it, but I will give it another shot. I guess we will merge this anyway as the happy path is working fine.

jackdelahunt commented 1 month ago

@LaVLaS I think you need to approve as well

MarianMacik commented 1 month ago

@jackdelahunt I will have one more look today.

jackdelahunt commented 1 month ago

The issue @LaVLaS was having is fixed in #271, so you can ignore that issue for this PR at least.

jackdelahunt commented 1 month ago

I closed #271 because those fixes were not needed based on new changes here

jackdelahunt commented 1 month ago

Yes, even if an S3 bucket is used for the model, we still use Git for the container files. So these parameters need to be set for the S3 approach as well, together with containerRelativePath.

Okay, no problem, updated now.

MarianMacik commented 1 month ago

/approve

biswassri commented 1 month ago

Overall looks good to me. Adding approve since Marian also approved and to get this merged.

openshift-ci[bot] commented 1 month ago

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: biswassri, MarianMacik

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

- ~~[OWNERS](https://github.com/opendatahub-io/ai-edge/blob/main/OWNERS)~~ [MarianMacik]

Approvers can indicate their approval by writing `/approve` in a comment. Approvers can cancel approval by writing `/approve cancel` in a comment.
biswassri commented 1 month ago

/lgtm