linkerd / linkerd2-conformance

A Conformance Validation tool for Linkerd 2.X
https://linkerd.io/
Apache License 2.0

Consider using `gotestsum` #8

Closed by mayankshah1607 4 years ago

mayankshah1607 commented 4 years ago

gotestsum can be used as a drop-in replacement for the `go test` CLI, and we may want to consider using that. However, I don't yet see any obvious advantages over the current workflow.
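For reference, this is roughly what the drop-in usage looks like (a sketch; the report path is illustrative, not taken from this repo):

```shell
# Install gotestsum (module path gotest.tools/gotestsum)
go install gotest.tools/gotestsum@latest

# Run the whole test suite through gotestsum instead of `go test`,
# emitting a JUnit XML report alongside the usual console output.
gotestsum --junitfile reports/unit-tests.junit.xml ./...
```

It accepts the same package patterns as `go test`, so CI pipelines can swap it in without restructuring the tests.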

alpeb commented 4 years ago

I beg to disagree :slightly_smiling_face: Keeping linkerd's testing approach makes it easier for new contributors to get started, i.e. not having to learn a testing library other than standard Go's. And being able to reuse linkerd's integration tests would be a big plus IMO. Given the gotestsum wrapper can provide the JUnit XML output that the standard go test lib lacks, what advantages would Ginkgo then give us? BDD might be appealing to some, but I think on its own it is not enough reason to introduce a new testing library.

mayankshah1607 commented 4 years ago

Thanks @alpeb. That makes sense to me. However, I still have a few more concerns over using gotestsum.

And being able to reuse linkerd's integration tests would be a big plus IMO.

Just to clarify my understanding: are we talking about directly using the integration tests for conformance validation? i.e., we'd then have to move all our conformance stuff into the upstream linkerd2 repo, right? (Because _test.go files cannot provide exportable functions the way other packages can.)

Given the gotestsum wrapper can provide the JUnit XML output that the standard go test lib lacks, what advantages would Ginkgo then give us?

Ginkgo too provides JUnit XML reports (which Sonobuoy can later aggregate into a single readable artifact). In fact, we're using this feature in our current approach (see https://github.com/linkerd/linkerd2-conformance/pull/1). Apart from that, another advantage Ginkgo provides over gotestsum or the standard testing library is the UX - the output on stdout is more nuanced and detailed. Here is an example from one of the conformance tests written using Ginkgo:

```
=== RUN   TestLifecycle
Running Suite: Lifecycle
========================
Random Seed: 1594185960
Will run 1 of 1 specs

 `linkerd install`
  can install a new control plane
  /home/mayank/Desktop/linkerd2-conformance/tests/lifecycle/specs.go:18
STEP: Installing linkerd control plane with HA: false
STEP: Running pre-installation checks
STEP: Validating `check` output
STEP: Running `linkerd install`
STEP: Applying control plane manifests
STEP: Checking resources in namespace l5d-conformance
STEP: Running post-installation checks
STEP: Validating `check` output

• [SLOW TEST:43.875 seconds]

/home/mayank/Desktop/linkerd2-conformance/tests/lifecycle/specs.go:8
  `linkerd install`
  /home/mayank/Desktop/linkerd2-conformance/tests/lifecycle/specs.go:17
    can install a new control plane
    /home/mayank/Desktop/linkerd2-conformance/tests/lifecycle/specs.go:18
------------------------------

JUnit path was configured: /home/mayank/Desktop/linkerd2-conformance/reports/01-lifecycle.junit.xml

JUnit report was created: /home/mayank/Desktop/linkerd2-conformance/reports/01-lifecycle.junit.xml

Ran 1 of 1 Specs in 43.875 seconds
SUCCESS! -- 1 Passed | 0 Failed | 0 Pending | 0 Skipped
```

I'd also like to add that Ginkgo makes structuring your tests easy, intuitive, and expressive - and hence keeps them organized - thanks to the various kinds of "containers" it offers: Describe, Context, BeforeEach, AfterEach, It, When, etc. (See here). But these may not be strong enough reasons, as they're centered around BDD testing.
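To illustrate how those containers compose, here is a minimal sketch of a Ginkgo spec (assuming Ginkgo v1 with Gomega; the spec names are illustrative, not copied from the actual suite):

```go
package lifecycle_test

import (
	"testing"

	. "github.com/onsi/ginkgo"
	. "github.com/onsi/gomega"
)

// TestLifecycle bootstraps the Ginkgo suite from `go test`.
func TestLifecycle(t *testing.T) {
	RegisterFailHandler(Fail)
	RunSpecs(t, "Lifecycle")
}

var _ = Describe("`linkerd install`", func() {
	BeforeEach(func() {
		// Setup shared by every spec in this container.
	})

	Context("with HA disabled", func() {
		It("can install a new control plane", func() {
			By("Running pre-installation checks")
			// Placeholder assertion; real specs would shell out
			// to the linkerd CLI and assert on its output.
			Expect(true).To(BeTrue())
		})
	})
})
```

The nested Describe/Context/It structure is what produces the indented spec tree and `STEP:` lines in the output above.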

I am not really sure if this significantly outweighs the problem of new contributors having to learn a new testing framework, which is where the std testing library seems like a great option. However, the question we must then address is how we would re-use the integration tests for conformance validation. We could move conformance validation to the upstream linkerd2 repo, but we may have to re-think the UX, as it would not conform to the config YAML approach that we have planned. Tackling that would mean housing the tests in a separate repo (like this one), and re-writing them with small changes (for our config YAML approach) would introduce some amount of redundancy.

alpeb commented 4 years ago

Ah I hadn't considered the impossibility of exporting tests... How much overlap do you think there will be between conformance tests and integration tests? I think that's what's gonna determine whether we'd like to do the conformance tests in the same linkerd2 repo.

mayankshah1607 commented 4 years ago

@alpeb I'd say that the main body of the tests (the logic, assertions, etc.) would mostly be the same. But if we do plan on directly using the integration tests as they are, the only major drawback I can think of is that the integration tests will not conform to our idea of having the tests be configurable via a YAML file. This would require major refactoring, because the same tests would then have to be configurable by the CI as well as by a YAML file (the conformance tool). We also need to consider that there are some tests, such as ingress, that conformance validation requires but the integration tests do not have. @Pothulapati WDYT?

alpeb commented 4 years ago

Thanks for your considerations @mayankshah1607 :slightly_smiling_face: Having this separate as initially intended would make things clearer, and Go's inability to export tests should not drive a decision to do otherwise. And I don't wanna stall this project, so please go ahead with the PR.