infinispan / infinispan-operator

Infinispan Operator
https://infinispan.org/docs/infinispan-operator/main/operator.html
Apache License 2.0

Implement test filtering mechanism to allow running smoke/integration tests #1381

Open Crumby opened 2 years ago

Crumby commented 2 years ago

Currently it's only possible to filter tests by package/file name or by test regex (-run ${regex}). Neither mechanism is practical for selecting tests based on the reason they are being executed, such as smoke testing or integration testing against new K8s/OCP versions.

There are several possible solutions to overcome this issue:

1. Using build tags. This would be reasonably usable if we had a single test per file.

Pro: Simplifies test execution; running with -tags integration/smoke automatically picks the right tests. Con: It works only at the file level; it's not possible to mark only some of the tests in a file.

2. Using test/file naming conventions. Tests that belong to a particular test suite would carry its name, e.g. TestFeatureIntegration.

Pro: Overcomes the file-level limitation of build tags and makes running tests with a regex systematic (-run '.*(Integration|Smoke).*'). Con: Test naming could be confusing. E.g. with three CustomConfig tests for YAML, JSON, and XML, only one would end with Integration/Smoke.

3. Wrapping the tests in each file and selecting the tests to execute based on an env var. A master test in each file would pick which subtests to run based on env var input:

func TestSomeFeature(t *testing.T) {
  if env.Integration {
    t.Run("test1", test1)
  } else {
    t.Run("test1", test1)
    t.Run("test2", test2)
  }
}

Pro: Fine-grained solution. Con: Which tests belong to which suite needs to be manually maintained in each file.

4. Implementing a skipIf() helper called at the beginning of each test. Each test would call skipIf() first, which would decide, based on an env var, whether the test should be executed:

func TestSomeFeature(t *testing.T) {
  skipIf(t, env.Integration)
  // rest of the test
}

func skipIf(t *testing.T, skip bool) {
  if skip {
    t.Skip()
  }
}

Pro: Fine-grained solution. Con: If the call to skipIf() is forgotten, the test may be executed even though that's not desired.

Crumby commented 2 years ago

@ryanemerson @rigazilla I guess the closest-to-ideal solution is using build tags, even though that would mean having a single test per file, but I'd like to hear your opinions and I'm all ears for any other possible solutions. Ideally ones that mark which tests should be executed rather than the other way around.

ryanemerson commented 2 years ago

I guess the closest to ideal solution is using the build tags even though that would mean having single test per file

I also think this is a nice way to go. Why is it necessary to have a single file per test though? Couldn't we just have a smoke_test.go and integration_test.go each containing the tests that fall into the respective category?

Also, do we have any smoke tests? From my understanding they're all integration tests.

Crumby commented 2 years ago

So basically the smoke tests should be the set of tests you want to execute when a release addresses a CVE in the underlying image, e.g. a new Java version. Those tests might include more security tests, while they don't need to focus too much on integration points with OpenShift.

On the other hand, K8s/OpenShift integration tests should include tests that exercise the integration points between the Operator/Server and OpenShift.

Couldn't we just have a smoke_test.go and integration_test.go each containing the tests that fall into the respective category?

That would mean either duplicating the tests we have or placing the tests inconsistently (e.g. I'd like to execute one of the custom configuration tests as part of the OCP integration, but not all of them). But if you meant to use the file as a list referencing tests in other files, then that sounds great (that would be option 3, but in a separate file and with no env var needed).

ryanemerson commented 2 years ago

Thanks for the detailed explanation.

But if you meant to use the file as list referencing tests in other files then that sounds great (That would be option no 3. but in separate file and no env needed).

That's not what I meant, but I like that idea :smile:. This approach will give us a lot of flexibility and it won't impact the existing tests.