knative / eventing

Event-driven application platform for Kubernetes
https://knative.dev/docs/eventing
Apache License 2.0

Failing e2e tests for knative/eventing-v0.16.1 #4016

Closed: nailushapotnuru closed this issue 3 years ago

nailushapotnuru commented 4 years ago

TestName: e2e tests
Repository: knative/eventing
Tag: v0.16.1

Steps followed:

  1. Deployed knative-eventing on OCP4.5

    [root@rheltestmachine1 test_images]# oc get pods -n knative-eventing
    NAME                                       READY   STATUS      RESTARTS   AGE
    eventing-controller-7d864b799d-zwslf       1/1     Running     0          2d5h
    eventing-webhook-6f5bd45d85-cmp99          1/1     Running     2          2d5h
    imc-controller-548bdd5587-gdnqq            1/1     Running     0          2d5h
    imc-dispatcher-565d6df65c-h2m9b            1/1     Running     0          2d5h
    mt-broker-controller-9968b99f4-zxj2c       1/1     Running     0          2d5h
    mt-broker-filter-7bf9b6c5ff-dzdv9          1/1     Running     0          2d5h
    mt-broker-ingress-5f4488f56f-8vnjt         1/1     Running     0          2d5h
    storage-version-migration-eventing-v62s6   0/1     Completed   0          2d5h
    sugar-controller-6f5b4b68b9-hhj7w          1/1     Running     0          2d5h
    v0.16.0-broker-cleanup-mfkcr               0/1     Completed   0          2d5h
  2. Built the test-images under eventing/test/test_images (https://github.com/knative/eventing/tree/v0.16.1/test/test_images)

  3. Created image streams for these test images and replaced the image references in the respective pod.yaml files under https://github.com/knative/eventing/tree/v0.16.1/test/test_images (see the sketch after this list for the usual upstream push flow).

  4. go test -v -tags=e2e -count=1 ./test/e2e --dockerrepo image-registry.openshift-image-registry.svc:5000/knative-eventing/
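
For reference, the upstream flow builds and pushes these test images with ko via the repo's helper script rather than hand-editing pod.yaml files; a minimal sketch, assuming ko is installed and the registry path is adjusted to match your image streams:

    # Point ko at the registry the tests will pull from (path shown is illustrative)
    export KO_DOCKER_REPO=image-registry.openshift-image-registry.svc:5000/knative-eventing
    # Build and push every image under test/test_images
    ./test/upload-test-images.sh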

Please find the test output in the attachment below.

eventing_e2e_test_output.txt

Could someone help me with the above failures, or point out any additional setup/procedure I need to follow for running the e2e tests for eventing?

pierDipi commented 4 years ago

Can you check again by running the tests with the following command?

go test -v -tags=e2e ./test/e2e -brokerclass=MTChannelBasedBroker -channels=messaging.knative.dev/v1beta1:Channel,messaging.knative.dev/v1beta1:InMemoryChannel,messaging.knative.dev/v1:Channel,messaging.knative.dev/v1:InMemoryChannel -sources=sources.knative.dev/v1alpha2:ApiServerSource,sources.knative.dev/v1alpha2:ContainerSource,sources.knative.dev/v1alpha2:PingSource 

I think the missing brokerClass flag leads to Brokers not becoming ready.
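
For context, the class of a Broker is selected via the eventing.knative.dev/broker.class annotation, which is what the -brokerclass flag controls for the test-created Brokers; a minimal sketch of a Broker using the MT channel-based class (the name is illustrative):

    apiVersion: eventing.knative.dev/v1
    kind: Broker
    metadata:
      name: example-broker   # illustrative
      namespace: default
      annotations:
        # selects the multi-tenant channel-based broker implementation
        eventing.knative.dev/broker.class: MTChannelBasedBroker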

nailushapotnuru commented 4 years ago

@pierDipi , Thanks for looking into the issue.

Failures still persist even after passing the above flags. Here is the test output of the above command.

eventing_e2e_test_outputv0.1.txt

pierDipi commented 4 years ago

Have you installed the sugar-controller? https://github.com/knative/eventing/releases/download/v0.16.1/eventing-sugar-controller.yaml
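
A quick way to confirm it, for reference (deployment name taken from the pod list above):

    kubectl get deployment sugar-controller -n knative-eventing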

pierDipi commented 4 years ago

Yeah, you have.

pierDipi commented 4 years ago

Are you upgrading from 0.15.*?

pierDipi commented 4 years ago

I followed these steps and tests are working:

kubectl apply -f https://github.com/knative/eventing/releases/download/v0.16.1/eventing-core.yaml \
-f https://github.com/knative/eventing/releases/download/v0.16.1/eventing-crds.yaml \
-f https://github.com/knative/eventing/releases/download/v0.16.1/eventing-sugar-controller.yaml \
-f https://github.com/knative/eventing/releases/download/v0.16.1/in-memory-channel.yaml \
-f https://github.com/knative/eventing/releases/download/v0.16.1/mt-channel-broker.yaml

# <wait running>

go test -v -tags=e2e -timeout=30m ./test/e2e -brokerclass=MTChannelBasedBroker -channels=messaging.knative.dev/v1beta1:Channel,messaging.knative.dev/v1beta1:InMemoryChannel,messaging.knative.dev/v1:Channel,messaging.knative.dev/v1:InMemoryChannel -sources=sources.knative.dev/v1alpha2:ApiServerSource,sources.knative.dev/v1alpha2:ContainerSource,sources.knative.dev/v1alpha2:PingSource

This was on k8s 1.17; I don't have an OCP cluster available to test on.
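
One way to script the "wait running" step above is to block until all eventing deployments report Available; a sketch, assuming everything lands in the knative-eventing namespace:

    kubectl wait deployment --all -n knative-eventing --for=condition=Available --timeout=5m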

nailushapotnuru commented 4 years ago

I deployed knative-eventing on OCP from https://github.com/knative/eventing/tree/v0.16.1/config, built all the test images from https://github.com/knative/eventing/tree/v0.16.1/test/test_images, and then tried the e2e tests.

nailushapotnuru commented 4 years ago

Are you upgrading from 0.15.*?

I am working on v0.16.1 for power architecture.

matzew commented 4 years ago

Not really sure why the tests fail for you, but we have our own (OCP) CI setup, testing against 4.5 and 4.6.

For instance:

Here is how those tests on CI got invoked: https://github.com/openshift/knative-eventing/blob/release-v0.16.0/openshift/e2e-common.sh#L134

(similar file exists for 0.17.2 branch)

grantr commented 4 years ago

@nailushapotnuru For another data point, have you tried running the tests with the latest version 0.17.2?

vaikas commented 4 years ago

Did you install a broker? This one or another one? https://github.com/knative/eventing/tree/v0.16.1/config/brokers/mt-channel-broker

vaikas commented 4 years ago

Oops, sorry, I see it there in the initial list of pods. Would it be possible to get the logs for mt-broker-controller-9968b99f4-zxj2c?

nailushapotnuru commented 4 years ago

Did you install a broker? This one or another one? https://github.com/knative/eventing/tree/v0.16.1/config/brokers/mt-channel-broker

Yes, I have installed the mt-channel-broker from that same location.

nailushapotnuru commented 4 years ago

@nailushapotnuru For another data point, have you tried running the tests with the latest version 0.17.2?

No, I haven't tried version 0.17.2; I have only tried 0.16.1. Do you want me to try with the latest one?

nailushapotnuru commented 4 years ago

Oops, sorry, I see it there in the initial list of pods. Would it be possible to get the logs for mt-broker-controller-9968b99f4-zxj2c?

Please find the mt-broker-controller logs attached in the txt file below: mt-broker-controller.log

vaikas commented 4 years ago

You can certainly try the 0.17.x releases to see if things have been fixed there. What happens if you just create a Broker like this:

    kubectl create -f - <<EOF
    apiVersion: eventing.knative.dev/v1
    kind: Broker
    metadata:
      name: default
      namespace: reprobroker
    EOF

Does that become ready? If not, can you send the logs again?

Then if it does work, maybe just run a single test like:

    go test -v -tags=e2e -count=1 ./test/e2e -run ^TestDefaultBrokerWithManyTriggers$
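
To answer the "does it become ready" question quickly, something like this should print True for a healthy Broker (the jsonpath query is illustrative):

    kubectl get broker default -n reprobroker -o jsonpath='{.status.conditions[?(@.type=="Ready")].status}'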

nailushapotnuru commented 4 years ago

@vaikas, thanks for your response! Even with the latest eventing version, 0.17.2, the e2e tests are still failing on the Power architecture. Broker pods are not getting created when I try the above. Please find the mt-broker-controller logs attached and let me know your suggestions.

mt-broker-controller-6dd59d6fc8-25gb7.log

matzew commented 4 years ago

@nailushapotnuru hello,

have you tried applying these yamls?

https://raw.githubusercontent.com/openshift/knative-eventing/release-v0.17.2/openshift/release/knative-eventing-ci.yaml
https://raw.githubusercontent.com/openshift/knative-eventing/release-v0.17.2/openshift/release/knative-eventing-mtbroker-ci.yaml

nailushapotnuru commented 4 years ago

@nailushapotnuru hello, have you tried applying these yamls:

https://raw.githubusercontent.com/openshift/knative-eventing/release-v0.17.2/openshift/release/knative-eventing-ci.yaml
https://raw.githubusercontent.com/openshift/knative-eventing/release-v0.17.2/openshift/release/knative-eventing-mtbroker-ci.yaml

@matzew, yes, I have installed eventing using the above yamls as well, but the e2e tests still fail with the error logs below: eventingci.log

vaikas commented 4 years ago

Can you create a broker like this:

    kubectl create -f - <<EOF
    apiVersion: eventing.knative.dev/v1
    kind: Broker
    metadata:
      name: brokertest
      namespace: default
    EOF

And then see what state it goes into. I'm just trying to simplify things and see if anything works :)

Then dump it: kubectl get brokers brokertest -oyaml

After waiting a little bit to give it time to reconcile.
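
Rather than sleeping, kubectl can block until the Broker reports Ready; a sketch:

    kubectl wait broker/brokertest -n default --for=condition=Ready --timeout=2m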

nailushapotnuru commented 4 years ago

@vaikas , Here is the output of "kubectl get brokers brokertest -oyaml"

[root@nailuknative1 OCP45]#  oc apply -f broker.yaml
broker.eventing.knative.dev/brokertest created
[root@nailuknative1 OCP45]# kubectl get brokers brokertest -oyaml
apiVersion: eventing.knative.dev/v1
kind: Broker
metadata:
  annotations:
    eventing.knative.dev/broker.class: MTChannelBasedBroker
    eventing.knative.dev/creator: kube:admin
    eventing.knative.dev/lastModifier: kube:admin
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"eventing.knative.dev/v1","kind":"Broker","metadata":{"annotations":{},"name":"brokertest","namespace":"default"}}
  creationTimestamp: "2020-09-22T05:52:51Z"
  generation: 1
  managedFields:
  - apiVersion: eventing.knative.dev/v1
    fieldsType: FieldsV1
    fieldsV1:
      f:metadata:
        f:annotations:
          .: {}
          f:kubectl.kubernetes.io/last-applied-configuration: {}
    manager: oc
    operation: Update
    time: "2020-09-22T05:52:51Z"
  - apiVersion: eventing.knative.dev/v1
    fieldsType: FieldsV1
    fieldsV1:
      f:status:
        f:address:
          f:url: {}
        f:annotations:
          .: {}
          f:knative.dev/channelAPIVersion: {}
          f:knative.dev/channelAddress: {}
          f:knative.dev/channelKind: {}
          f:knative.dev/channelName: {}
        f:conditions: {}
        f:observedGeneration: {}
    manager: mtchannel_broker
    operation: Update
    time: "2020-09-22T05:52:52Z"
  name: brokertest
  namespace: default
  resourceVersion: "23019232"
  selfLink: /apis/eventing.knative.dev/v1/namespaces/default/brokers/brokertest
  uid: 7b09e6de-3663-43e2-8b25-fac30613f287
spec:
  config:
    apiVersion: v1
    kind: ConfigMap
    name: config-br-default-channel
    namespace: knative-eventing
status:
  address:
    url: http://broker-ingress.knative-eventing.svc.cluster.local/default/brokertest
  annotations:
    knative.dev/channelAPIVersion: messaging.knative.dev/v1
    knative.dev/channelAddress: http://brokertest-kne-trigger-kn-channel.default.svc.cluster.local
    knative.dev/channelKind: InMemoryChannel
    knative.dev/channelName: brokertest-kne-trigger
  conditions:
  - lastTransitionTime: "2020-09-22T05:52:51Z"
    status: "True"
    type: Addressable
  - lastTransitionTime: "2020-09-22T05:52:51Z"
    status: "True"
    type: FilterReady
  - lastTransitionTime: "2020-09-22T05:52:51Z"
    status: "True"
    type: IngressReady
  - lastTransitionTime: "2020-09-22T05:52:52Z"
    status: "True"
    type: Ready
  - lastTransitionTime: "2020-09-22T05:52:52Z"
    status: "True"
    type: TriggerChannelReady
  observedGeneration: 1

Please find the pod logs of mt-broker-controller: mt-broker-controller-589f57b6f6-m7m5s.log

nailushapotnuru commented 4 years ago

Could someone help me with the test errors for the test suite "TestPingSourceV1Alpha2"? Most of my test cases are failing with the same error.

tracker.go:152: Waiting for all KResources to become ready
    event_info_store.go:238: Timeout waiting for at least 1 matches.
        Error: FAIL MATCHING: saw 0/1 matching events.
        Recent events: 
        0 events seen, last 0 events:
        Match errors: 

        knative.dev/eventing/test/lib/recordevents.(*EventInfoStore).AssertAtLeast
            /root/OCP45/0.16.1/eventing/test/lib/recordevents/event_info_store.go:238
        knative.dev/eventing/test/lib/recordevents.(*EventInfoStore).AssertInRange
            /root/OCP45/0.16.1/eventing/test/lib/recordevents/event_info_store.go:246
        knative.dev/eventing/test/lib/recordevents.(*EventInfoStore).AssertExact
            /root/OCP45/0.16.1/eventing/test/lib/recordevents/event_info_store.go:274
        knative.dev/eventing/test/e2e.TestPingSourceV1Alpha2
            /root/OCP45/0.16.1/eventing/test/e2e/source_ping_test.go:75
        testing.tRunner
            /usr/local/go/src/testing/testing.go:909
        runtime.goexit
            /usr/local/go/src/runtime/asm_ppc64x.s:884

Architecture: ppc64le
OCP version: 4.5
knative-eventing: v0.16.1

[root@nailuknative1 ~]# oc get pods -n knative-eventing
NAME                                     READY   STATUS    RESTARTS   AGE
eventing-controller-bdb496b6d-mjm8t      1/1     Running   0          6d6h
eventing-webhook-558c787798-rv4td        1/1     Running   1          6d6h
imc-controller-fd87cb9cd-cbjrv           1/1     Running   0          6d6h
imc-dispatcher-76b74c6789-mp9sq          1/1     Running   0          6d6h
mt-broker-controller-5b5c785d87-hfttg    1/1     Running   0          6d6h
mt-broker-filter-7c6ddff4b6-hh6vt        1/1     Running   0          6d6h
mt-broker-ingress-564b8545db-7rjpj       1/1     Running   0          6d6h
pingsource-mt-adapter-6f5c79f96c-hbg6v   1/1     Running   0          6d5h
sugar-controller-687b4b47fb-pvk85        1/1     Running   0          6d6h
[root@nailuknative1 ~]#

Command executed:

go test -v -tags=e2e -count=1 ./test/e2e -run ^TestPingSourceV1Alpha2$ --dockerrepo image-registry.openshift-image-registry.svc:5000/local-images

Please find the test logs attached below: TestPingSourceV1Alpha2.log

lionelvillard commented 4 years ago

Can you dump the pingsource-mt-adapter-6f5c79f96c-hbg6v logs?
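
For reference, something like this should grab them without needing the exact pod name (assuming the deployment is named pingsource-mt-adapter, as the pod name suggests):

    kubectl logs -n knative-eventing deployment/pingsource-mt-adapter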

nailushapotnuru commented 4 years ago

Please find the ping source pod logs:

pingsource-mt-adapter-6f5c79f96c-hbg6v.log

lionelvillard commented 4 years ago

@nailushapotnuru thanks! Looks like a permission issue. Can you check that this clusterrole exists in your cluster: https://github.com/knative/eventing/blob/v0.16.1/config/core/roles/pingsource-mt-adapter-clusterrole.yaml?
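
A quick way to check, for reference:

    kubectl get clusterrole knative-eventing-pingsource-mt-adapter -o yaml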

lionelvillard commented 4 years ago

How did you install Knative?

nailushapotnuru commented 4 years ago

@lionelvillard, thanks for looking into the issue. I have installed eventing using the yamls below:

    kubectl apply -f https://github.com/knative/eventing/releases/download/v0.16.1/eventing-core.yaml \
      -f https://github.com/knative/eventing/releases/download/v0.16.1/eventing-crds.yaml \
      -f https://github.com/knative/eventing/releases/download/v0.16.1/eventing-sugar-controller.yaml \
      -f https://github.com/knative/eventing/releases/download/v0.16.1/in-memory-channel.yaml \
      -f https://github.com/knative/eventing/releases/download/v0.16.1/mt-channel-broker.yaml

nailushapotnuru commented 4 years ago

@nailushapotnuru thanks! Looks like a permission issue. Can you check that this clusterrole exists in your cluster: https://github.com/knative/eventing/blob/v0.16.1/config/core/roles/pingsource-mt-adapter-clusterrole.yaml?

Now I have tried applying this yaml, but it failed with the error below:

    [root@nailuknative1 ~]# oc apply -f  pingsource-mt-adapter-clusterrole.yaml
    error: error parsing pingsource-mt-adapter-clusterrole.yaml: error converting YAML to JSON: yaml: line 128: mapping values are not allowed in this context

lionelvillard commented 4 years ago

Please check the knative-eventing-pingsource-mt-adapter ClusterRole. You should see this:

    - apiGroups:
      - sources.knative.dev
      resources:
      - pingsources
      - pingsources/status
      verbs:
      - get
      - list
      - watch
      - patch
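
One way to verify the permissions end to end is to impersonate the adapter's service account with kubectl auth can-i; a sketch, assuming the service account is named pingsource-mt-adapter in knative-eventing (adjust if your install differs):

    kubectl auth can-i list pingsources.sources.knative.dev \
      --as=system:serviceaccount:knative-eventing:pingsource-mt-adapter
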
nailushapotnuru commented 4 years ago

Please find the ClusterRole below:

[root@nailuknative1 ~]# oc get ClusterRole | grep knative-eventing-pingsource-mt-adapter
knative-eventing-pingsource-mt-adapter                                 2020-09-24T06:52:03Z
[root@nailuknative1 ~]# oc describe ClusterRole knative-eventing-pingsource-mt-adapter
Name:         knative-eventing-pingsource-mt-adapter
Labels:       eventing.knative.dev/release=devel
Annotations:  kubectl.kubernetes.io/last-applied-configuration:
                {"apiVersion":"rbac.authorization.k8s.io/v1","kind":"ClusterRole","metadata":{"annotations":{},"labels":{"eventing.knative.dev/release":"d...
PolicyRule:
  Resources                   Non-Resource URLs  Resource Names  Verbs
  ---------                   -----------------  --------------  -----
  events                      []                 []              [create patch]
  leases.coordination.k8s.io  []                 []              [get list watch create update patch]
  configmaps                  []                 []              [get list watch]
[root@nailuknative1 ~]#

lionelvillard commented 4 years ago

pingsource-mt-adapter-clusterrole.yaml does not have 128 lines. What's its content?

nailushapotnuru commented 4 years ago

Sorry, I tried to download the file and apply it, but it didn't download correctly. Now I am able to apply the yaml file:

[root@nailuknative1 roles]# oc apply -f pingsource-mt-adapter-clusterrole.yaml
clusterrole.rbac.authorization.k8s.io/knative-eventing-pingsource-mt-adapter configured
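
To avoid download glitches entirely, the manifest can also be applied straight from the raw GitHub URL of the linked file, for example:

    oc apply -f https://raw.githubusercontent.com/knative/eventing/v0.16.1/config/core/roles/pingsource-mt-adapter-clusterrole.yaml
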
nailushapotnuru commented 3 years ago

Most of my test cases are failing with the error below.

Architecture: PPC64LE
Eventing: v0.16.1

Command executed:

go test -v -tags=e2e -timeout=0m ./test/e2e -brokerclass=MTChannelBasedBroker -channels=messaging.knative.dev/v1beta1:Channel,messaging.knative.dev/v1beta1:InMemoryChannel,messaging.knative.dev/v1:Channel,messaging.knative.dev/v1:InMemoryChannel -sources=sources.knative.dev/v1alpha2:ApiServerSource,sources.knative.dev/v1alpha2:ContainerSource,sources.knative.dev/v1alpha2:PingSource --dockerrepo image-registry.openshift-image-registry.svc:5000/local-images

Error Message:

event_info_store.go:238: Timeout waiting for at least 1 matches.
                Error: FAIL MATCHING: unexpected error during find: error getting events error getting MinMax http get error: Get http://localhost:35493/minmax: dial tcp 127.0.0.1:35493: connect: connection refused
                knative.dev/eventing/test/lib/recordevents.(*EventInfoStore).AssertAtLeast
                    /root/OCP45/0710/eventing/test/lib/recordevents/event_info_store.go:238
                knative.dev/eventing/test/e2e/helpers.TestBrokerWithManyTriggers.func1
                    /root/OCP45/0710/eventing/test/e2e/helpers/broker_test_helper.go:324
                testing.tRunner
                    /usr/local/go/src/testing/testing.go:909
                runtime.goexit
                    /usr/local/go/src/runtime/asm_ppc64x.s:884

Could someone help me resolve this issue?

grantr commented 3 years ago

This seems related to https://github.com/knative/eventing/issues/3570. We think that's fixed in https://github.com/knative/eventing/pull/4171.

@nailushapotnuru Could you test in your environment with the current master branch or a recent nightly release?

Instructions for installing a nightly release are in the pre-release docs at https://knative.dev/development/install/any-kubernetes-cluster/.
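
For testing against master specifically, the eventing development flow installs the manifests from a checked-out tree with ko; a minimal sketch, assuming ko is installed and KO_DOCKER_REPO points at a registry your cluster can pull from:

    # from a checkout of knative/eventing at master
    export KO_DOCKER_REPO=image-registry.openshift-image-registry.svc:5000/knative-eventing   # illustrative
    ko apply -f config/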

lberk commented 3 years ago

@nailushapotnuru after the release tomorrow, upstream will be dropping support for 0.16-era releases. Please try this out with a newer release (0.17+). If you're still hitting the issue, please feel free to reopen!