Closed: vincent1chen closed this issue 1 year ago
@vincent1chen: Thank you for submitting this issue!
The issue is currently awaiting triage. Please make sure you have given us as much context as possible.
If the maintainers determine this is a relevant issue, they will remove the needs-triage label and assign an appropriate priority label.
We want your feedback! If you have any questions or suggestions regarding our contributing process/workflow, please reach out to us at container.storage.modules@dell.com.
Hi @vincent1chen, can you please let us know which protocol you are using?
I am using iSCSI.
I tried with iSCSI and the test is successful for me. Below are the logs:
[root@master-1-QY58amanSV377 cert-csi]# ./cert-csi --debug functional-test multi-attach-vol --sc powerstore-ext4 --access-mode ReadOnlyMany [2023-08-16 04:00:41] INFO Starting cert-csi; ver. 0.8.1 [2023-08-16 04:00:41] INFO Using EVENT observer type [2023-08-16 04:00:41] INFO Using default config [2023-08-16 04:00:41] INFO Successfully loaded config. Host: https://10.225.111.25:6443 [2023-08-16 04:00:41] DEBUG Creating new KubeClient [2023-08-16 04:00:41] INFO Created new KubeClient [2023-08-16 04:00:41] DEBUG Created PersistentVolumeClaim client in functional-test namespace [2023-08-16 04:00:41] DEBUG Created Pod client in functional-test namespace [2023-08-16 04:00:41] DEBUG Created VA client in functional-test namespace [2023-08-16 04:00:41] DEBUG Created Metrics client in functional-test namespace [2023-08-16 04:00:41] DEBUG Created NodeClient [2023-08-16 04:00:41] INFO Using default number of pods [2023-08-16 04:00:41] INFO Creating Volume [2023-08-16 04:00:41] DEBUG Pod Observer started watching [2023-08-16 04:00:41] DEBUG PersistentVolumeClaimObserver started watching [2023-08-16 04:00:41] DEBUG ContainerMetricsObserver started watching [2023-08-16 04:00:41] DEBUG VolumeAttachmentObserver started watching [2023-08-16 04:00:41] DEBUG Created PVC vol-multi-pod-test-x7pr2 [2023-08-16 04:00:41] INFO Attaching Volume to original pod [2023-08-16 04:00:41] DEBUG Created Pod iowriter-test-qxhkz [2023-08-16 04:00:41] INFO Waiting for all pods in functional-test to be READY [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-djr6h to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-jpsc8 to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-kdww9 to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-khrd9 to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-lcqdv to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-lkb7p to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod 
iowriter-test-m9jqs to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-mk8pq to be ready [2023-08-16 04:00:41] DEBUG Waiting for pod iowriter-test-qxhkz to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-djr6h to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-jpsc8 to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-kdww9 to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-khrd9 to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-lcqdv to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-lkb7p to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-m9jqs to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-mk8pq to be ready [2023-08-16 04:00:43] DEBUG Waiting for pod iowriter-test-qxhkz to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-djr6h to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-jpsc8 to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-kdww9 to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-khrd9 to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-lcqdv to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-lkb7p to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-m9jqs to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-mk8pq to be ready [2023-08-16 04:00:45] DEBUG Waiting for pod iowriter-test-qxhkz to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-djr6h to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-jpsc8 to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-kdww9 to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-khrd9 to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-lcqdv to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-lkb7p to 
be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-m9jqs to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-mk8pq to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-qxhkz to be ready [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-t4xxz to be ready [2023-08-16 04:00:47] INFO All pods are ready in 6.020017993s [2023-08-16 04:00:47] INFO Creating new pods with original Volume attached to them [2023-08-16 04:00:47] DEBUG Created Pod iowriter-test-j9ksz [2023-08-16 04:00:47] DEBUG Created Pod iowriter-test-4p72n [2023-08-16 04:00:47] INFO Waiting for all pods in functional-test to be READY [2023-08-16 04:00:47] DEBUG Waiting for pod iowriter-test-4p72n to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-4p72n to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-djr6h to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-j9ksz to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-jpsc8 to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-kdww9 to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-khrd9 to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-lcqdv to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-lkb7p to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-m9jqs to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-mk8pq to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-qxhkz to be ready [2023-08-16 04:00:49] DEBUG Waiting for pod iowriter-test-t4xxz to be ready [2023-08-16 04:00:49] INFO All pods are ready in 2.019203442s [2023-08-16 04:00:49] INFO Writing to Volume on 1st originalPod [2023-08-16 04:00:49] INFO Executing command: [dd if=/dev/urandom of=/data0/blob.data bs=1M count=128 oflag=sync] [2023-08-16 04:00:49] DEBUG Executing command: [dd if=/dev/urandom of=/data0/blob.data bs=1M count=128 oflag=sync] 
[2023-08-16 04:00:51] INFO Writer originalPod: iowriter-test-qxhkz [2023-08-16 04:00:51] DEBUG 128+0 records in 128+0 records out 134217728 bytes (134 MB, 128 MiB) copied, 2.01924 s, 66.5 MB/s
[2023-08-16 04:00:51] INFO Executing command: [/bin/bash -c sha512sum /data0/blob.data > /data0/blob.sha512] [2023-08-16 04:00:51] DEBUG Executing command: [/bin/bash -c sha512sum /data0/blob.data > /data0/blob.sha512] [2023-08-16 04:00:52] INFO Checking hash sum on all of the other pods [2023-08-16 04:00:52] INFO Executing command: [/bin/bash -c sha512sum -c /data0/blob.sha512] [2023-08-16 04:00:52] DEBUG Executing command: [/bin/bash -c sha512sum -c /data0/blob.sha512] [2023-08-16 04:00:52] INFO Hashes match [2023-08-16 04:00:52] INFO Executing command: [/bin/bash -c sha512sum -c /data0/blob.sha512] [2023-08-16 04:00:52] DEBUG Executing command: [/bin/bash -c sha512sum -c /data0/blob.sha512] [2023-08-16 04:00:53] INFO Hashes match [2023-08-16 04:00:53] DEBUG VolumeAttachmentObserver finished watching [2023-08-16 04:00:53] DEBUG PersistentVolumeClaimObserver finished watching [2023-08-16 04:00:53] DEBUG Pod Observer finished watching [2023-08-16 04:00:56] DEBUG ContainerMetricsObserver finished watching [2023-08-16 04:00:56] INFO SUCCESS: MultiAttachSuite in 15.041563582s [2023-08-16 04:00:56] INFO Trying to connect to cluster... [2023-08-16 04:00:56] DEBUG Creating new KubeClient [2023-08-16 04:00:56] INFO Created new KubeClient [2023-08-16 04:00:56] INFO During this run 100.0% of suites succeeded [root@master-1-QY58amanSV377 cert-csi]#
@sakshi-garg1, what concerns me is the code logic below. If I understand the code correctly, it calculates the checksum and stores it in the file "sum" on the original pod. Then, on each new pod, it calculates the hash of the sum file; it doesn't do any comparison on the new pod.
log.Info("Checking hash sum on all of the other pods")
for _, p := range newPods {
	writer := bytes.NewBufferString("")
	if err := podClient.Exec(ctx, p.Object, []string{"/bin/bash", "-c", "sha512sum -c " + sum}, writer, os.Stderr, false); err != nil {
		return delFunc, err
	}
	if strings.Contains(writer.String(), "OK") {
		log.Info("Hashes match")
	} else {
		return delFunc, fmt.Errorf("hashes don't match")
	}
}
Hi @vincent1chen, I did some research on this and found that sha512sum -c is what performs the comparison. Your concern also made me wonder where we recalculate the sha and compare it, so I performed the steps below to verify whether the implemented logic is correct.
[root@master-1-eS6qj1hngsjdO ~]# cd adarsh/
[root@master-1-eS6qj1hngsjdO adarsh]# ls
auto.sh fc-sc.yaml main.go pod-pvc-fc.yaml pod.yaml sc.yaml secret.yaml
ch.yaml iscsi-sc.yaml pod2.yaml pod-pvc.yaml pw-snapcalss.yaml secondpvcfromsnap.yaml volumeSnapshot.yaml
// Generate the sha and store it
[root@master-1-eS6qj1hngsjdO adarsh]# sha512sum auto.sh > sha
[root@master-1-eS6qj1hngsjdO adarsh]# cat sha
78543f7e296750f2a6841dc7038c4dc525f9f36c9241869e9758466bd0ed97dfec95b79ad7a3a29fb48dc7462eeb4c20113ac38635919e909c1f1a8792053664 auto.sh
// Now try to copy this file to some diff folder/place
[root@master-1-eS6qj1hngsjdO ~]# mkdir test
[root@master-1-eS6qj1hngsjdO ~]# cp adarsh/auto.sh ./test/.
[root@master-1-eS6qj1hngsjdO ~]# cd test/
[root@master-1-eS6qj1hngsjdO test]# ls
auto.sh
[root@master-1-eS6qj1hngsjdO test]# sha512sum -c ../adarsh/sha
auto.sh: OK
// I have changed the content here
[root@master-1-eS6qj1hngsjdO test]# vi auto.sh
[root@master-1-eS6qj1hngsjdO test]# sha512sum -c ../adarsh/sha
auto.sh: FAILED
sha512sum: WARNING: 1 computed checksum did NOT match
[root@master-1-eS6qj1hngsjdO test]#
As per my understanding, the code attempts the same procedure by running the sha512sum -c command within the mount-path directory; that command then performs the comparison against the digest stored in the SHA hash file.
If this explanation isn't sufficient, we can schedule a call to go over it together and ensure a clearer understanding of the question.
Good explanation, I understand it clearly now. "sha512sum -c" reads the file name from the "sum" file and performs the comparison against the stored hash value. The code has no problem. Thanks.
Bug Description
I am confused by code lines 2650~2661 in perf-suites.go. By design, it needs to compare the checksum between the original pod and the other pods, but the code section below doesn't seem to run any comparison. As I understand it, the logic calculates the checksum of the sum file, and if that file exists, the verification passes.
Logs
[root@registry ~]# ./cert-csi --debug functional-test multi-attach-vol --sc powerstore-ext4 --access-mode ReadOnlyMany [2023-08-06 23:21:20] INFO Starting cert-csi; ver. 0.8.1 [2023-08-06 23:21:20] INFO Using EVENT observer type [2023-08-06 23:21:20] INFO Using default config [2023-08-06 23:21:20] INFO Successfully loaded config. Host: https://192.168.30.2:6443 [2023-08-06 23:21:20] DEBUG Creating new KubeClient [2023-08-06 23:21:20] INFO Created new KubeClient [2023-08-06 23:21:20] DEBUG Created PersistentVolumeClaim client in functional-test namespace [2023-08-06 23:21:20] DEBUG Created Pod client in functional-test namespace [2023-08-06 23:21:20] DEBUG Created VA client in functional-test namespace [2023-08-06 23:21:20] DEBUG Created Metrics client in functional-test namespace [2023-08-06 23:21:20] DEBUG Created NodeClient [2023-08-06 23:21:20] INFO Using default number of pods [2023-08-06 23:21:20] DEBUG VolumeAttachmentObserver started watching [2023-08-06 23:21:20] DEBUG PersistentVolumeClaimObserver started watching [2023-08-06 23:21:20] DEBUG Pod Observer started watching [2023-08-06 23:21:20] DEBUG ContainerMetricsObserver started watching [2023-08-06 23:21:20] INFO Creating Volume [2023-08-06 23:21:20] DEBUG Created PVC vol-multi-pod-test-rkww5 [2023-08-06 23:21:20] INFO Attaching Volume to original pod [2023-08-06 23:21:20] DEBUG Created Pod iowriter-test-lbg6r [2023-08-06 23:21:20] INFO Waiting for all pods in functional-test to be READY [2023-08-06 23:21:20] DEBUG Waiting for pod iowriter-test-c7tgz to be ready [2023-08-06 23:21:20] DEBUG Waiting for pod iowriter-test-lbg6r to be ready ... ... 
[2023-08-06 23:21:30] DEBUG Waiting for pod iowriter-test-sxs87 to be ready [2023-08-06 23:21:30] DEBUG Waiting for pod iowriter-test-tfb62 to be ready [2023-08-06 23:21:30] INFO All pods are ready in 10.013648481s [2023-08-06 23:21:30] INFO Creating new pods with original Volume attached to them [2023-08-06 23:21:30] DEBUG Created Pod iowriter-test-8khl5 [2023-08-06 23:21:30] DEBUG Created Pod iowriter-test-v2qvn [2023-08-06 23:21:30] INFO Waiting for all pods in functional-test to be READY [2023-08-06 23:21:30] DEBUG Waiting for pod iowriter-test-8khl5 to be ready [2023-08-06 23:21:32] DEBUG Waiting for pod iowriter-test-8khl5 to be ready [2023-08-06 23:21:34] DEBUG Waiting for pod iowriter-test-8khl5 to be ready ... ...
[2023-08-06 23:22:04] DEBUG Waiting for pod iowriter-test-v2qvn to be ready [2023-08-06 23:22:04] INFO All pods are ready in 34.022962375s [2023-08-06 23:22:04] INFO Writing to Volume on 1st originalPod [2023-08-06 23:22:04] INFO Executing command: [dd if=/dev/urandom of=/data0/blob.data bs=1M count=128 oflag=sync] [2023-08-06 23:22:04] DEBUG Executing command: [dd if=/dev/urandom of=/data0/blob.data bs=1M count=128 oflag=sync] [2023-08-06 23:22:06] INFO Writer originalPod: iowriter-test-lbg6r [2023-08-06 23:22:06] DEBUG 128+0 records in 128+0 records out 134217728 bytes (134 MB, 128 MiB) copied, 1.86752 s, 71.9 MB/s
[2023-08-06 23:22:06] INFO Executing command: [/bin/bash -c sha512sum /data0/blob.data > /data0/blob.sha512] [2023-08-06 23:22:06] DEBUG Executing command: [/bin/bash -c sha512sum /data0/blob.data > /data0/blob.sha512] [2023-08-06 23:22:06] INFO Checking hash sum on all of the other pods [2023-08-06 23:22:06] INFO Executing command: [/bin/bash -c sha512sum -c /data0/blob.sha512] [2023-08-06 23:22:06] DEBUG Executing command: [/bin/bash -c sha512sum -c /data0/blob.sha512] [2023-08-06 23:22:07] ERROR Suite MultiAttachSuite failed; error=command terminated with exit code 1 [2023-08-06 23:22:07] DEBUG PersistentVolumeClaimObserver finished watching [2023-08-06 23:22:07] DEBUG Pod Observer finished watching [2023-08-06 23:22:07] DEBUG VolumeAttachmentObserver finished watching [2023-08-06 23:22:10] DEBUG ContainerMetricsObserver finished watching [2023-08-06 23:22:10] INFO FAILURE: MultiAttachSuite in 50.093522846s [2023-08-06 23:22:10] INFO Trying to connect to cluster... [2023-08-06 23:22:10] DEBUG Creating new KubeClient [2023-08-06 23:22:10] INFO Created new KubeClient [2023-08-06 23:22:10] FATAL During this run 0.0% of suites succeeded
Screenshots
No response
Additional Environment Information
No response
Steps to Reproduce
Review the code (lines 2650~2661 in perf-suites.go).
Expected Behavior
The code should calculate the data checksum on the new pods and then compare whether the sum matches the value stored in blob.sha512.
CSM Driver(s)
CSM for powerstore
Installation Type
helm 3.0
Container Storage Modules Enabled
No response
Container Orchestrator
K8s 1.24.4
Operating System
debian 11