Closed dprotaso closed 3 years ago
- Conformance Authors need the ability to mark certain features with their maturity/state (alpha, beta, stable)
- Conformance Authors need the ability to mark functionality with different requirement levels
I was wondering if "tags" would be a solution to both of these. Looking at the current signature-based solution to 2 & 3, it seems like expressing Beta + Must would require nesting the two. What would you think of using build tags instead, like so:
```go
// +build should,beta must,stable
// +build v1
```
This would indicate that these tests should run against v1 "should" tests for beta and would become a "must" for stable.
Drawback: tests for different levels of maturity end up in different files; we may end up with a large number of files.
6. Downstream implementors need the ability to consume and invoke upstream conformance tests.
When the project switched from `dep` to Go modules we lost the ability to invoke vendored tests. Thus, to support implementors, we started to expose the ingress conformance as a library that can be invoked downstream in their own tests.

The tl;dr: right now conformance authors expose a function for downstream implementors. Golang subtests (`t.Run`) offer a nice mechanism to compose the larger test suite out of small, focused feature tests.

```go
package conformance // in a standard go file

func Run(t *testing.T) {
	t.Run("some feature", TestSomeFeature)
	t.Run("some other feature", TestSomeOtherFeature)
}

func TestSomeFeature(t *testing.T)      {...}
func TestSomeOtherFeature(t *testing.T) {...}
```

Downstream implementors would author a test file as follows:

```go
// in a _test.go file
import "knative.dev/upstream/conformance"

func TestConformance(t *testing.T) {
	conformance.Run(t)
}
```
I wonder if `RunConformance` could be built using `go generate` to store a slice of all the functions and then iterate over the slice in `RunConformance`.
Additional drawback: build flags won't show up if we do a reflection-based `RunConformance`.
I thought about build tags but decided they're not a good fit, for the following reasons:

1) Because we need to package our tests in a library (thanks, go.mod) we need an entry point that coalesces all the tests together. Thus we need to be able to reference all the test funcs at compile time no matter the tags, and we'd need no-op tests when those tags aren't enabled. You could write a tool that mimics what `go test` does by looking at method prefixes, but that is an anti-goal.
2) We may want to ship a conformance binary, and enabling/disabling certain assertions dynamically via flags seems nice.
3) Tags require manipulation via `go test` - I'd like the option to do similar manipulation in code, i.e. downstream implementors can turn on alpha features in code by default and not stress about new contributors forgetting to turn them on.
> I wonder if RunConformance could be built using go generate to store a slice of all the functions and then iterate over the slice in RunConformance.
Maintaining the list manually isn't ideal, but it's not troublesome at this point. Annotations or decorators in the language would be nice here.
This could be a future improvement.
> This would indicate that these tests should run against v1 "should" tests for beta and would become a "must" for stable.
This is interesting. I think I'd prefer requirement levels to be stable across feature states; this would signal the final intent to implementors earlier on.
high level comment on first reading: I like it!
small low-level comment: I wonder if we can avoid changing the test signature and wrapping `testing.T`, 'cos that'll break tooling a bit and make the tests a little non-idiomatic-seeming, and instead do something similar to `mux.Var`, i.e. just take `T` as an argument to a package-level function. So `conformance.Must(t, func() { .. })` instead of `t.Must(func() {..})`. Not the most important thing in the world though 🤷.
This would be really helpful for me. The current solution to turn on a feature inside of the e2e-tests.sh script means there's an odd dependency for the test to run properly (and we put the test in its own package/path to make this easier to run). https://github.com/knative/serving/blob/0633c628d3e0a3b25511217351564960bc777b42/test/e2e-tests.sh#L115
Currently I'm looking at testing Garbage Collection. The ability to toggle features by maturity would be great, but having a Go library to modify ConfigMap values in the test would also make it easier to test several varieties of the scenario.
What about a helper at the beginning of each function?

```go
func TestMust_Clause123(t *testing.T) {
	conformance.Verifies(t, "Clause123", conformance.Beta("serving"), conformance.Must())
}

// in package conformance
func Verifies(t *testing.T, ref string, opts ...Options) {
	// If this clause should not be enabled:
	t.SkipNow()
}
```
first of all, thanks for doing this. I believe this is very needed and I'll share my thoughts about it. `Config` can be used for the context, but I'm struggling to figure out how implementors can provide the events/actions.

@julz @evankanderson re: package-level functions & being non-idiomatic

So we're currently non-idiomatic because we're moving test funcs into non-test files. This is necessary for the tests to be consumed downstream. If we were to have package-level functions then we'd need to access the test config (i.e. which requirement levels & feature states are to be exercised) as a global, and that's something I'd prefer to avoid.

The other benefit of having our own defined `T` is that we can augment it with new features/helpers (i.e. avoid `test.Setup` everywhere).
For the case of the Sources conformance, here are some of the knowns and unknowns for the conformance authors (e.g. `ready` and `sinkUri`). You could use the config as an extension point to provide specific functionality, i.e.:
```go
// eventing conformance
type Config struct {
	test.BaseConfig
	...
	SourceType        metav1.TypeMeta
	SourceTriggerFunc func()
}

func Run(tt *testing.T, c Config) {
	t := test.NewContext(tt, c)

	// assert config is valid
	// eventing conformance assertions
	...

	// Send event
	c.SourceTriggerFunc()
}
```
Downstream you could have:

```go
var sourceConfig conformance.Config

func init() {
	// parse gets called by the test binary or TestMain
	sourceConfig.AddFlags(flag.CommandLine)
}

func TestSomeSource(t *testing.T) {
	some := sourceConfig
	some.SourceType = ... // some metav1.Type
	some.SourceTriggerFunc = func() {
		// send the event
	}
	conformance.Run(t, some)
}
```
I mentioned this on the call, but I'd also love it if this enabled downstream to run higher iteration counts.
In net-contour I added this: https://github.com/knative-sandbox/net-contour/blob/677e434bcac7521d6884c0f11d428af40090ba14/test/conformance/ingress_test.go#L34-L39
However, this groups things on testgrid as `TestIngressConformance/4/hosts/multiple`, where the grouping I really want is `TestIngressConformance/hosts/multiple/4` (see also testgrid).
@dprotaso please take a look at the initial draft of Requirements Gathering for the Knative Enhanced Testing Task Force (I tried to incorporate your requirements from this issue - please add anything missing or comment): https://docs.google.com/document/d/1zuGSMLXGJlsjyZIpuNZXhMbbCtE2yFe8dxbQ03FZrS0/edit?usp=sharing also linked from https://github.com/knative/eventing/issues/3777#issuecomment-670644926
Following up: I'm doing another pass of the conformance lib. I've merged the `test.T` type and `Config` into one. This should make it easier for conformance test authors.
See: https://github.com/knative-sandbox/reconciler-test/pull/17 Serving POC is here: https://github.com/knative/serving/pull/9742
Originally posted by @dprotaso in https://github.com/knative-sandbox/reconciler-test/pull/17#issuecomment-705748603
Playing with this in https://github.com/knative/serving/pull/9742 leads to a few questions.

i.e. right now it's

```
=== RUN TestConformance/Service/multi_container
```

it could be

```
=== RUN TestConformance/Service/ALPHA_multi_container
```

It might make sense to do this only for non-stable & non-must/should requirements:

```
=== RUN TestConformance/Service/ALPHA_multi_container/MAY_do_something
```

If tests aren't being consumed downstream you might not want to have a single golang test as your entry point. This means all your tests will have the same prefix, i.e. `TestConformance/`.

The simplest thing I can think of is that authors create their own explicit copy method. Thus the package variable captures flags and copies are initialized with `test.Init`. Ideally, if we had generics, `test.Init` could return a copy so that we wouldn't need to cast.
Work on using the framework in networking is on-going here: https://github.com/knative/networking/pull/233
This issue is stale because it has been open for 90 days with no activity. It will automatically close after 30 more days of inactivity. Reopen the issue with /reopen. Mark the issue as fresh by adding the comment /remove-lifecycle stale.
/reopen /lifecycle frozen
@dprotaso: Reopened this issue.
FYI - this has evolved into: https://github.com/knative-sandbox/reconciler-test
Please check out the README and offer feedback in that repo
Background
We've been piggybacking off of golang's testing package along with some bash scripts as our conformance test driver. We've also built up some tooling in knative.dev/pkg/test to support this effort. We're at a point now where we have enough insight and experience with conformance testing that we can do a pass at improving the experience for both upstream authors and downstream implementors.
Requirements
Here's my first pass of the requirements we need:
Next Steps
- `pkg/test/v2` so that we can gradually onboard different conformance tests
- Solutioning on the various requirements

I've been experimenting with a POC, and I'll provide more context w.r.t. each requirement listed above.

- knative.dev/pkg poc diff
- knative.dev/serving poc diff

The order in which I discuss the requirements below differs from the list above, since they build on each other; above I just partitioned based on persona so it reads better.
1. Build upon golang testing given the supporting tools and services that already exist
We should use the `go test` command as our driver for invoking conformance tests. This lets us continue using our current infrastructure for capturing test results and viewing them in TestGrid. What we can do is build upon the `testing` package to offer higher-level semantics for conformance authors. See below for more details.

6. Downstream implementors need the ability to consume and invoke upstream conformance tests.

When the project switched from `dep` to Go modules we lost the ability to invoke vendored tests. Thus, to support implementors, we started to expose the ingress conformance as a library that can be invoked downstream in their own tests. The tl;dr: right now conformance authors expose a function for downstream implementors. Golang subtests (`t.Run`) offer a nice mechanism to compose the larger test suite out of small, focused feature tests. Downstream implementors would author a test file as follows:
2. Conformance Authors need the ability to mark certain features with their maturity/state (alpha, beta, stable)
It should be easy to mark tests based on a feature's maturity/state. If we build upon golang's `testing` package by offering our own 'test context', we can offer some first-class helpers, similar to subtests (`t.Run`), that feel idiomatic to the language. Conformance authors would then utilize these methods as follows:
3. Conformance Authors need the ability to mark functionality with different requirement levels
Similar to the above, we extend `test.T` to allow conformance authors to mark different assertions with specific requirement levels. Conformance authors would then utilize these methods as follows:
4. Conformance Authors need the ability to compose, consume & reuse common configuration.
Various properties are shared amongst different conformance tests. Tests shouldn't have to redefine these properties. We want to have consistency in the way these properties are supplied, read and composed. The current pattern is to read command line args into global flags. Sadly we're not consistent as some functions read environment variables directly.
The conformance lib can provide a base config that tracks which feature states and requirements should be exercised against a specific environment.
This config can be embedded into a specific conformance test's configuration
Our test context can carry an instance of this config that's accessible in tests. Note: go doesn't support inheritance/generics so we'll need our 'test context' to contain an interface of a config (long story)
5. Conformance Authors need the ability to provide defaults for various test options.
This is pretty self-explanatory. For a concrete example: serving runs conformance in a default namespace (serving-tests) but allows overrides.
9. Downstream implementors need the ability to supply test options and override defaults in a consistent way.
We want to support different options for supplying test options & overriding defaults.
Options defined in code
Conformance authors should consume their config when their test is invoked.
This allows implementors to write their configs in go
Options & overrides via flags
We'd like to support our existing scripts and allow invokers of our conformance to override some settings. This can be accomplished via flags on our test binary. Since we want to avoid globals in our library packages we make the global in the test runner and attach the flags to the default command line.
7. Downstream implementors need the ability to invoke a subset of tests based on requirements levels.
This helps them know which requirements they're meeting and which they are not. This can be accomplished by running tests with different requirements enabled.
For example, here's how we could override the requirement levels via test flags:

```
go test -requirement.must -requirement.mustnot
go test -requirements.recommended
go test -requirements.may
go test -requirements.all
```
8. Downstream implementors need the ability to invoke a subset of tests based on the maturity of a feature.
We want to avoid breaking downstream CIs when a new conformance test is added for an alpha feature. This provides a gradual onboarding for implementors as a feature progresses from alpha to stable.
For example, here's how we could override the enabled feature states via test flags:

```
go test -features.stable -features.beta
go test -features.alpha
go test -features.all
```
/assign @dprotaso