knative / pkg

Knative common packages
Apache License 2.0

Improving our conformance support story #1563

Closed dprotaso closed 3 years ago

dprotaso commented 3 years ago

Background

We've been piggybacking on Go's testing package along with some bash scripts as our conformance test driver. We've also built up some tooling in knative.dev/pkg/test to support this effort. We're at a point now where we have enough insight and experience with conformance testing that we can do a pass at improving the experience for both upstream authors and downstream implementors.

Requirements

Here's my first pass of the requirements we need:

  1. Build upon golang testing given the supporting tools and services that already exist
  2. Conformance Authors need the ability to mark certain features with their maturity/state (alpha, beta, stable)
  3. Conformance Authors need the ability to mark functionality with different requirement levels
  4. Conformance Authors need the ability to compose, consume & reuse common configuration (ie. environment settings)
  5. Conformance Authors need the ability to provide defaults for various test options.
  6. Downstream implementors need the ability to consume and invoke upstream conformance tests.
  7. Downstream implementors need the ability to invoke a subset of tests based on requirements levels.
  8. Downstream implementors need the ability to invoke a subset of tests based on the maturity of a feature.
  9. Downstream implementors need the ability to consistently supply test options and override defaults.

Next Steps

  1. Solicit feedback from other conformance authors & runners before I proceed.
  2. Turn the pkg POC into a new pkg/test/v2 so that we can gradually onboard different conformance tests
  3. 🎉

Solutioning on the various requirements

I've been experimenting with a POC, but I'll provide more context with respect to each requirement listed above: knative.dev/pkg poc diff, knative.dev/serving poc diff

The order in which I discuss the requirements below differs from the list above, since they build on each other. Above I partitioned by persona so it reads better.

1. Build upon golang testing given the supporting tools and services that already exist

We should use the go test package as our driver for invoking conformance tests. This lets us continue using our current infrastructure for capturing test results and viewing them in testgrid.

What we can do is build upon the testing package to offer higher level semantics for conformance authors. See below for more details.

6. Downstream implementors need the ability to consume and invoke upstream conformance tests.

When the project switched from dep to go modules we lost our ability to invoke vendored tests. Thus to support implementors we started to expose the ingress conformance as a library that can be invoked downstream in their own tests.

The tl;dr: right now conformance authors expose a function for downstream implementors. Golang subtests (t.Run) offer a nice mechanism to compose the larger test suite out of small, focused feature tests.

package conformance

// in a standard go file
func Run(t *testing.T) {
    t.Run("some feature", TestSomeFeature)
    t.Run("some other feature", TestSomeOtherFeature)
}

func TestSomeFeature(t *testing.T) {...}
func TestSomeOtherFeature(t *testing.T) {...}

Downstream implementors would author a test file as follows

// in a _test.go file
import "knative.dev/upstream/conformance"

func TestConformance(t *testing.T) {
  conformance.Run(t)
}

2. Conformance Authors need the ability to mark certain features with their maturity/state (alpha, beta, stable)

It should be easy to mark tests based on a feature's maturity/state. If we build upon golang's testing package by offering our own 'test context' we can offer some first class helpers similar to subtests (t.Run) that then feel idiomatic to the language.

// our test framework - knative.dev/pkg/test/v2
package test

type T struct {
  *testing.T
}

func (t T) Alpha(name string, f func(t *T)) {...}
func (t T) Beta(name string, f func(t *T)) {...}
func (t T) Stable(name string, f func(t *T)) {...}

Conformance authors would then utilize these methods as follows

// in a standard go file
import "knative.dev/pkg/test/v2"

func Run(t *test.T) {
    t.Alpha("some alpha feature", TestAlphaFeature)
    t.Beta("some beta feature", TestBetaFeature)
    t.Stable("some stable feature", TestStableFeature)
} 

func TestAlphaFeature(t *test.T) {}
func TestBetaFeature(t *test.T) {}
func TestStableFeature(t *test.T) {}

3. Conformance Authors need the ability to mark functionality with different requirement levels

Similar to above we extend test.T to allow conformance authors to mark different assertions with specific requirement levels.

// our test framework - knative.dev/pkg/test/v2
package test

type T struct {
  *testing.T
}

// One option is to use a similar subtest approach
func (t T) Must(name string, f func(t *T)) {...}
func (t T) Should(name string, f func(t *T)) {...}
func (t T) May(name string, f func(t *T)) {...}

// Another option is to support exposing when certain requirements are to be tested
func (t T) Must() bool {...}
func (t T) Should() bool {...}
func (t T) May() bool {...}

Conformance authors would then utilize these methods as follows

func TestSomeStableFeature(t *test.T) {
   // option 1
   t.Must("have some runtime thing", func(t *test.T) {
       // make your assertions
   })

  // option 2
  if t.Must() {
    // make your assertions
  }
}
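The second option reads naturally as plain ifs. A runnable sketch of how the booleans could consult a run-level bitmask (LevelMust etc. are illustrative names, not a settled interface):

```go
package main

import "fmt"

// RequirementLevel marks an assertion's RFC 2119 requirement level.
type RequirementLevel int

const (
	LevelMust RequirementLevel = 1 << iota
	LevelShould
	LevelMay
)

// T sketches the wrapped test context; the real type would embed *testing.T.
type T struct {
	Levels RequirementLevel // requirement levels enabled for this run
}

// Booleans that conformance authors guard assertion blocks with.
func (t *T) Must() bool   { return t.Levels&LevelMust != 0 }
func (t *T) Should() bool { return t.Levels&LevelShould != 0 }
func (t *T) May() bool    { return t.Levels&LevelMay != 0 }

func main() {
	t := &T{Levels: LevelMust | LevelShould} // MAY assertions disabled
	if t.Must() {
		fmt.Println("checking MUST assertions")
	}
	if t.May() {
		fmt.Println("checking MAY assertions")
	}
}
```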

4. Conformance Authors need the ability to compose, consume & reuse common configuration.

Various properties are shared amongst different conformance tests. Tests shouldn't have to redefine these properties. We want to have consistency in the way these properties are supplied, read and composed. The current pattern is to read command line args into global flags. Sadly we're not consistent as some functions read environment variables directly.

The conformance lib can provide a base config that tracks which feature states and requirement levels should be exercised against a specific environment:

type BaseConfig struct {
    Levels RequirementLevels
    States FeatureStates

    environment.Settings // similar to pkg/test env - k8s cluster, docker repo etc.
}

// We have enough info to build the client, so make it easier with this helper
func (c *BaseConfig) KubeClient() {...}
// ... add other helpers ...

// We can tie properties in this config to a flag set
func (c *BaseConfig) AddFlags(fs *flag.FlagSet) {...}

This config can be embedded into a specific conformance test's configuration

type Config struct {
   test.BaseConfig 
   IngressClass string
}

// We can tie properties in this instance to a flag set
func (c *Config) AddFlags(fs *flag.FlagSet) {
    c.BaseConfig.AddFlags(fs) // add all our base flags
    fs.StringVar(...) // add our additional properties
}

Our test context can carry an instance of this config that's accessible in tests. Note: go doesn't support inheritance/generics so we'll need our 'test context' to contain an interface of a config (long story)

// in our lib
package test

type T struct {
    *testing.T
    Config
}

// in a conformance test
func TestStableFeature(t *test.T) {
    endpoint := t.Config.IngressEndpoint
    // ... use endpoint ...
}

5. Conformance Authors need the ability to provide defaults for various test options.

This is pretty self-explanatory. For a concrete example, serving runs conformance in a default namespace (serving-tests) but allows overrides.

9. Downstream implementors need the ability to supply test options and override defaults in a consistent way.

We want to support different options for supplying test options & overriding defaults.

Options defined in code

Conformance authors should consume a config when their test is invoked:

func Run(tt *testing.T, config Config) {
    t := test.NewContext(tt, config)

    t.Alpha("some alpha feature", TestAlphaFeature)
    t.Beta("some beta feature", TestBetaFeature)
    t.Stable("some stable feature", TestStableFeature)
} 

This allows implementors to write their configs in Go:

func TestIngressConformance(t *testing.T) {
   config := DefaultConfig()
   config.IngressClass = "my-ingress-class"

   Run(t, config)
}

Options & overrides via flags

We'd like to support our existing scripts and allow invokers of our conformance tests to override some settings. This can be accomplished via flags on our test binary. Since we want to avoid globals in our library packages, we declare the global in the test runner and attach the flags to the default command line.

var config Config = DefaultConfig()

// To be verified: this might need to be in a TestMain
func init() {
    // allow overrides via flags
    config.AddFlags(flag.CommandLine)
}

func TestIngressConformance(t *testing.T) {
   Run(t, config)
}

7. Downstream implementors need the ability to invoke a subset of tests based on requirements levels.

This helps them know which requirements they're meeting and which they are not. This can be accomplished by running tests with different requirements enabled.

For example, here's how we could override the requirement levels via test flags:

  1. Ensure we're passing all absolute requirements: go test -requirements.must -requirements.mustnot
  2. Ensure we're passing recommended requirements: go test -requirements.recommended
  3. Ensure we're passing optional requirements: go test -requirements.may
  4. Ensure we're passing all requirements: go test -requirements.all
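Those flag combinations could collapse into a single bitmask that the framework consults per assertion. A runnable sketch; the flag names mirror the examples above, and treating -requirements.must as also covering MUST NOT is an assumption:

```go
package main

import (
	"flag"
	"fmt"
)

type RequirementLevel int

const (
	LevelMust RequirementLevel = 1 << iota
	LevelMustNot
	LevelShould
	LevelMay
)

// parseRequirements maps boolean flags onto a bitmask of enabled levels.
func parseRequirements(args []string) RequirementLevel {
	fs := flag.NewFlagSet("conformance", flag.ExitOnError)
	must := fs.Bool("requirements.must", false, "exercise MUST/MUST NOT requirements")
	should := fs.Bool("requirements.recommended", false, "exercise SHOULD requirements")
	may := fs.Bool("requirements.may", false, "exercise MAY requirements")
	all := fs.Bool("requirements.all", false, "exercise every requirement level")
	fs.Parse(args)

	var levels RequirementLevel
	if *must {
		levels |= LevelMust | LevelMustNot
	}
	if *should {
		levels |= LevelShould
	}
	if *may {
		levels |= LevelMay
	}
	if *all {
		levels = LevelMust | LevelMustNot | LevelShould | LevelMay
	}
	return levels
}

func main() {
	fmt.Println(parseRequirements([]string{"-requirements.must", "-requirements.may"}))
}
```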

8. Downstream implementors need the ability to invoke a subset of tests based on the maturity of a feature.

We want to avoid breaking downstream CIs when a new conformance test is added for an alpha feature. This provides a gradual onboarding for implementors as a feature progresses from alpha to stable.

For example, here's how we could override the enabled feature states via test flags:

  1. Ensure we're passing all stable and beta features go test -features.stable -features.beta
  2. Ensure we're passing alpha features go test -features.alpha
  3. Ensure we're passing all tests go test -features.all

/assign @dprotaso

evankanderson commented 3 years ago
  1. Conformance Authors need the ability to mark certain features with their maturity/state (alpha, beta, stable)
  2. Conformance Authors need the ability to mark functionality with different requirement levels

I was wondering if "tags" would be a solution to both of these. Looking at the current signature-based solution to 2 & 3, it seems like expressing Beta + Must would require nesting the two. What would you think of using build tags instead, like so:

// +build should,beta must,stable
// +build v1

This would indicate that these tests should run against v1 "should" tests for beta and would become a "must" for stable.

Drawback: tests for different levels of maturity end up in different files; we may end up with a large number of files.

evankanderson commented 3 years ago

6. Downstream implementors need the ability to consume and invoke upstream conformance tests.

When the project switched from dep to go modules we lost our ability to invoke vendored tests. Thus to support implementors we started to expose the ingress conformance as a library that can be invoked downstream in their own tests.

the tl;dr right now conformance authors expose a function for downstream implementors. Golang subtests (t.Run) offer a nice mechanism to compose the larger test suite of small focused feature tests.

package conformance

// in a standard go file
func Run(t *testing.T) {
    t.Run("some feature", TestSomeFeature)
    t.Run("some other feature", TestSomeOtherFeature)
}

func TestSomeFeature(t *testing.T) {...}
func TestSomeOtherFeature(t *testing.T) {...}

Downstream implementors would author test file as follows

// in a _test.go file
import "knative.dev/upstream/conformance"

func TestConformance(t *testing.T) {
  conformance.Run(t)
}

I wonder if RunConformance could be built using go generate to store a slice of all the functions and then iterate over the slice in RunConformance.
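A generated registry could be as simple as a package-level slice that RunConformance iterates. A hand-written stand-in for what go generate might emit (all names here are hypothetical):

```go
package main

import "fmt"

// testCase pairs a subtest name with its function; in the real framework the
// func would take the wrapped conformance test context.
type testCase struct {
	name string
	fn   func()
}

// cases is what a go generate step could emit by scanning for Test* functions;
// it's written by hand here for illustration.
var cases = []testCase{
	{"some feature", func() { fmt.Println("ran: some feature") }},
	{"some other feature", func() { fmt.Println("ran: some other feature") }},
}

// RunConformance iterates the generated slice instead of a hand-maintained Run.
func RunConformance() {
	for _, c := range cases {
		c.fn()
	}
}

func main() { RunConformance() }
```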

evankanderson commented 3 years ago

Additional drawback: build flags won't show up if we do a reflection-based RunConformance.

dprotaso commented 3 years ago

I thought about build tags but decided it's not a good fit for the following reasons:

1. Because we need to package our tests in a library (thanks go.mod) we need an entry point that coalesces all the tests together. Thus we need to be able to reference all the test funcs at compile time no matter the tags, and we'd need no-op tests when those tags aren't enabled. You could write a tool that mimics what go test does by looking at method prefixes, but that is an anti-goal.
2. We may want to ship a conformance binary, and enabling/disabling certain assertions dynamically via flags seems nice.
3. Tags require manipulation via go test - I'd like the option to do similar manipulation in code, ie. downstream implementors can turn on alpha features in code by default and not stress about new contributors forgetting to turn it on.

dprotaso commented 3 years ago

I wonder if RunConformance could be built using go generate to store a slice of all the functions and then iterate over the slice in RunConformance.

Maintaining the list manually isn't ideal but not troublesome at this point. Annotations or decorators in the language would be nice here.

This could be a future improvement.

dprotaso commented 3 years ago

This would indicate that these tests should run against v1 "should" tests for beta and would become a "must" for stable.

This is interesting. I think I'd prefer requirement levels to be stable across feature states; this would signal the final intent to implementors earlier on.

julz commented 3 years ago

high level comment on first reading: I like it!

small low level comment: I wonder if we can avoid changing the test signature and wrapping testing.T, 'cos that'll break tooling a bit and make the tests a little non-idiomatic-seeming, and instead do something similar to mux.Var, i.e. just take T as an argument to a package-level function. So conformance.Must(t, func() { .. }) instead of t.Must(func() {..}). Not the most important thing in the world though 🤷.

whaught commented 3 years ago

This would be really helpful for me. The current solution to turn on a feature inside of the e2e-tests.sh script means there's an odd dependency for the test to run properly (and we put the test in its own package/path to make this easier to run). https://github.com/knative/serving/blob/0633c628d3e0a3b25511217351564960bc777b42/test/e2e-tests.sh#L115

Currently I'm looking at testing Garbage Collection. The ability to toggle features by maturity would be great but also having a go library to modify configMap values in the test would also make it easier to test several varieties of the scenario.

evankanderson commented 3 years ago

What about a helper at the beginning of each function?

func TestMust_Clause123(t *testing.T) {
  conformance.Verifies(t, "Clause123", conformance.Beta("serving"), conformance.Must())
}

// in package conformance
func Verifies(t *testing.T, ref string, opts ...Options) {
  // If this clause should not be exercised in the current run:
  t.SkipNow()
}

Ref: https://golang.org/pkg/testing/#T.SkipNow
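A minimal model of how such a helper could decide to skip, assuming a set of enabled tags. Everything here (Option, shouldRun, the enabled map) is a sketch, not the real conformance package:

```go
package main

import "fmt"

// Option tags a test with a feature state or requirement level.
type Option string

func Beta(component string) Option { return Option("beta:" + component) }
func Must() Option                 { return Option("must") }

// enabled simulates which states/levels the current run exercises.
var enabled = map[Option]bool{
	Option("stable:serving"): true,
	Option("must"):           true,
}

// shouldRun reports whether a test tagged with opts is in scope; the real
// Verifies helper would call t.SkipNow() when this returns false.
func shouldRun(opts ...Option) bool {
	for _, o := range opts {
		if !enabled[o] {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(shouldRun(Beta("serving"), Must())) // beta isn't enabled in this run
}
```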

devguyio commented 3 years ago

First of all, thanks for doing this. I believe this is very needed, and I'll share my thoughts about it.

  1. I think your head is in the right place, this is the direction Knative should head towards and the requirements generally seem to capture the essence of what we need to give our users and downstream implementors.
  2. Conformance tests in eventing are tied to two aspects: a test context (i.e. given a GitHubSource CRD is installed), and an event/action (i.e. when an object is created, or when an event is sent), which both need to be provided by the downstream implementors. In this proposal I expect that Config can be used for the context, but I'm struggling to figure out how implementors can provide the events/actions.
  3. In the Eventing WG we are about to establish a testing taskforce, ideally across WGs, to explore a couple of testing approaches and hopefully mainstream that effort across Knative. These are some of the ones mentioned so far:
    1. How k8s does conformance https://github.com/cncf/k8s-conformance
    2. Sonobuoy https://github.com/vmware-tanzu/sonobuoy
    3. Using cucumber and features
    4. https://github.com/n3wscott/rigging
    5. Check out the BDD here https://github.com/n3wscott/cloudevents-observer
    6. https://github.com/knative-sandbox/sample-controller/pull/287
    7. https://github.com/knative-sandbox/sample-controller/pull/175
  4. It would be great if maybe we compare this approach and others mentioned above in the taskforce and settle on one to work on and drive it through across the project.

dprotaso commented 3 years ago

@julz @evankanderson re: package level functions & being non-idiomatic

So we're currently non-idiomatic because we're moving test funcs into non-test files. This is necessary for the tests to be consumed downstream. If we were to have package level functions then we'd need to access the test config (ie. what requirement levels & feature states are to be exercised) as a global and that's something I'd prefer to avoid.

The other bit by having our own defined T is we can augment it with new features/helpers (ie. avoid test.Setup everywhere)

devguyio commented 3 years ago

For the case of the Sources conformance here some of the known and unknowns for the conformance authors :

Known

  1. A source has a ducktype "source"
    1. A source has a specific "status" (e.g. ready and sinkUri)
  2. A source emits CEs based on a specific trigger

Unknown

  1. The actual source typemeta (each source is its own CRD)
  2. How to trigger the source to emit CEs
dprotaso commented 3 years ago

You could use the config as an extension point to provide specific functionality,

ie.

// eventing conformance

type Config struct {
   test.BaseConfig
   ...
   SourceType metav1.TypeMeta
   SourceTriggerFunc func()
}

func Run(tt *testing.T, c Config) {
    t := test.NewContext(tt, c)
    // assert config is valid

    // eventing conformance assertions   
    ... 

    // Send event
    c.SourceTriggerFunc()
} 

Downstream you could have


var sourceConfig conformance.Config

func init() {
  // parse gets called by the test binary or TestMain
  sourceConfig.AddFlags(flag.CommandLine)
}

func TestSomeSource(t *testing.T) {
   some := sourceConfig
   some.SourceType = // some metav1.Type
   some.SourceTriggerFunc = func() {
       // send the event
  }

  conformance.Run(t, some)
}
mattmoor commented 3 years ago

I mentioned this on the call, but I'd also love it if this enabled downstream to run higher iteration counts.

In net-contour I added this: https://github.com/knative-sandbox/net-contour/blob/677e434bcac7521d6884c0f11d428af40090ba14/test/conformance/ingress_test.go#L34-L39

However, this groups things on testgrid as: TestIngressConformance/4/hosts/multiple, where the grouping I really want is: TestIngressConformance/hosts/multiple/4 (see also testgrid)
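The grouping difference comes down to which t.Run owns the iteration loop. A small simulation of the resulting subtest paths; run and iterationPaths are stand-ins for nested t.Run calls, not real framework code:

```go
package main

import "fmt"

// run simulates t.Run by extending the subtest path and invoking the body.
func run(path, name string, f func(path string)) {
	f(path + "/" + name)
}

// iterationPaths returns the testgrid names produced when the repetition loop
// sits outside (outer=true) or inside (outer=false) the feature's subtest.
func iterationPaths(outer bool, n int, feature string) []string {
	var got []string
	if outer {
		// today's shape: TestIngressConformance/<n>/<feature>
		for i := 0; i < n; i++ {
			run("TestIngressConformance", fmt.Sprint(i), func(p string) {
				run(p, feature, func(p string) { got = append(got, p) })
			})
		}
		return got
	}
	// the desired grouping: TestIngressConformance/<feature>/<n>
	run("TestIngressConformance", feature, func(p string) {
		for i := 0; i < n; i++ {
			run(p, fmt.Sprint(i), func(p string) { got = append(got, p) })
		}
	})
	return got
}

func main() {
	fmt.Println(iterationPaths(true, 2, "hosts/multiple"))
	fmt.Println(iterationPaths(false, 2, "hosts/multiple"))
}
```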

aslom commented 3 years ago

@dprotaso please take a look at the initial draft of Requirements Gathering for Knative Enhanced Testing Task Force (I tried to incorporate your requirements from this issue - please add or comment if anything is missing): https://docs.google.com/document/d/1zuGSMLXGJlsjyZIpuNZXhMbbCtE2yFe8dxbQ03FZrS0/edit?usp=sharing also linked from https://github.com/knative/eventing/issues/3777#issuecomment-670644926

dprotaso commented 3 years ago

Following up: I'm doing another pass of the conformance lib. I've merged the test.T type and Config into one. This should make it easier for conformance test authors.

See: https://github.com/knative-sandbox/reconciler-test/pull/17 Serving POC is here: https://github.com/knative/serving/pull/9742

dprotaso commented 3 years ago

Originally posted by @dprotaso in https://github.com/knative-sandbox/reconciler-test/pull/17#issuecomment-705748603

Playing with this in https://github.com/knative/serving/pull/9742 leads to a few questions

1. Should test names be decorated with requirement levels & feature states?

ie. right now it's

=== RUN   TestConformance/Service/multi_container

it could be

=== RUN TestConformance/Service/ALPHA_multi_container

It might make sense to do this only for non-stable & non-must/should requirements

=== RUN TestConformance/Service/ALPHA_multi_container/MAY_do_something

2. Test Structure

If tests aren't being consumed downstream you might not want to have a single golang test as your entry point. This means all your tests will have the same prefix ie. TestConformance/

The simplest thing I can think of is authors create their own explicit copy method. Thus the package variable captures flags and copies are initialized with test.Init

Ideally if we had generics test.Init could return a copy that we wouldn't need to cast
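With generics (which landed in Go after this discussion), the copy-without-cast that test.Init wants could look like this. A sketch assuming a simple value-copy semantic; the Init API and Config type are hypothetical:

```go
package main

import "fmt"

// Config stands in for a suite's config type.
type Config struct {
	Namespace string
}

// Init returns a typed copy of a shared config, so per-test mutations don't
// leak back into the package-level value that captured the flags.
// Hypothetical API; type parameters need Go 1.18+.
func Init[C any](shared C) C {
	c := shared // plain value copy; deep copying is the caller's concern
	return c
}

func main() {
	global := Config{Namespace: "serving-tests"} // captured flags would land here
	local := Init(global)
	local.Namespace = "my-tests"
	fmt.Println(global.Namespace, local.Namespace)
}
```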

dprotaso commented 3 years ago

Work on using the framework in networking is on-going here: https://github.com/knative/networking/pull/233

github-actions[bot] commented 3 years ago

This issue is stale because it has been open for 90 days with no activity. It will automatically close after 30 more days of inactivity. Reopen the issue with /reopen. Mark the issue as fresh by adding the comment /remove-lifecycle stale.

dprotaso commented 3 years ago

/reopen
/lifecycle frozen

knative-prow-robot commented 3 years ago

@dprotaso: Reopened this issue.

In response to [this](https://github.com/knative/pkg/issues/1563#issuecomment-781772754):

> /reopen
> /lifecycle frozen

dprotaso commented 3 years ago

FYI - this has evolved into: https://github.com/knative-sandbox/reconciler-test

Please check out the README and offer feedback in that repo