Closed: @bramdehart closed this PR 11 months ago.
@microsoft-github-policy-service agree
Hi @bramdehart, thanks so much for this PR!
Would you mind moving the input check to the `annotateResources` function in `src/strategyHelpers/deploymentHelper.ts` (it's called by `annotateAndLabelResources`)? That way, the input is supported any other time the `annotateResources` function is called.
Moreover, would you mind adding tests for this functionality? Our integration tests (workflows located in `.github/workflows`; the testing script is `test/integration/k8s-deploy-test.py`) support annotations but don't currently exercise them, so one option would be to add a test there. Unit tests would also be great; you can add them to a new file called `deploymentHelper.test.ts` (following the convention of the other test files in the repo). Check out `src/types/kubectl.test.ts` for how we mock input to the action with Jest.
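For context, here is a minimal sketch of how mocking an action input works, assuming the `INPUT_*` environment-variable convention that `@actions/core` uses (the input name `annotate-resources` is illustrative, not necessarily the action's real input name):

```typescript
// Sketch only: @actions/core reads each input from an INPUT_* env var,
// with spaces replaced by underscores and the name upper-cased.
// Setting the variable before the code under test runs mocks the input.
process.env["INPUT_ANNOTATE-RESOURCES"] = "false";

// Simplified stand-in for core.getInput from @actions/core.
function getInput(name: string): string {
  return process.env[`INPUT_${name.replace(/ /g, "_").toUpperCase()}`] ?? "";
}

console.log(getInput("annotate-resources")); // "false"
```

In the repo's Jest tests the same effect is usually achieved by spying on `core.getInput` rather than setting the variable directly.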
Hi @jaiveerk,
This is because this PR will hopefully resolve our problems with the `pods.stdout` dump that is executed during the assignment of `allPods`. This assignment takes a lot of resources and sometimes hits our limits, which kills the dump mid-write, leaving corrupt JSON in the dump and producing an `Unexpected end of JSON input` error.
By moving the input check to another method, this assignment would still take place and we would still run into this error. We could increase our CPU and memory limits, but that would only serve to work around the error, nothing more. In our case the result of the assignment is never used.
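To illustrate the failure mode: a dump that is killed partway through leaves truncated JSON, and `JSON.parse` then throws `Unexpected end of JSON input`. A rough sketch (the helper `safeParsePods` is hypothetical, not code from the repo):

```typescript
// Hypothetical guard: parse a kubectl JSON dump, falling back to an
// empty list when the output was truncated (e.g. killed at a resource limit).
function safeParsePods(stdout: string): unknown[] {
  try {
    return JSON.parse(stdout).items ?? [];
  } catch {
    // Truncated output throws "Unexpected end of JSON input".
    return [];
  }
}

const truncated = '{"items":[{"metadata":{"name":"pod-a"}'; // cut off mid-dump
const complete = '{"items":[{"metadata":{"name":"pod-a"}}]}';
console.log(safeParsePods(truncated).length); // 0
console.log(safeParsePods(complete).length); // 1
```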
So, with that said, do you mind if I move the assignment of `allPods` to the `annotateChildPods` method in `kubectlUtils.ts`? The pods are currently passed as parameters through multiple methods, but in the end they are only used in `annotateChildPods`. I think it is cleaner to avoid unnecessary parameters.
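A sketch of the proposed shape, with simulated data; the function names mirror the ones discussed, but the bodies are illustrative only:

```typescript
// Illustrative stand-in for fetching pods (the action would run
// `kubectl get pods -o json` and parse stdout instead).
async function getAllPods(): Promise<string[]> {
  return ["pod-a", "pod-b"];
}

// Proposed shape: fetch the pods inside annotateChildPods, the only
// place they are used, instead of threading them through parameters
// from several call sites.
async function annotateChildPods(ownerName: string): Promise<string[]> {
  const pods = await getAllPods();
  return pods.map((p) => `${p}: annotated (owner ${ownerName})`);
}

annotateChildPods("deploy/web").then((r) => console.log(r.length)); // 2
```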
Also, do you mind if I wrap the input check around the `annotateResources` call within the `annotateAndLabelResources` method instead of moving it into the `annotateResources` method itself? I think it is better practice to wrap conditional statements around the call site, and I don't see the `annotateResources` method being used anywhere else.
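The placement being proposed can be sketched like this (the input flag and all bodies are illustrative, not the action's real code):

```typescript
// Stand-in for the action input that enables/disables annotation.
const shouldAnnotate = false;

// The helper stays unconditional...
function annotateResources(resources: string[]): string[] {
  return resources.map((r) => `${r} [annotated]`);
}

// ...and the caller decides whether to invoke it, rather than the
// helper checking the input internally.
function annotateAndLabelResources(resources: string[]): string[] {
  return shouldAnnotate ? annotateResources(resources) : resources;
}

console.log(annotateAndLabelResources(["deploy/web"])[0]); // "deploy/web"
```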
@jaiveerk
In my latest three commits I implemented the solution I proposed in the previous message. Let me know whether you agree; if so, I will write additional tests. Thanks! :)
Hey @bramdehart, thanks for the new implementation. As for the change to the `allPods` assignment: that certainly looks cleaner.
The new approach looks good to me, so you're good to go ahead and add testing.
Thanks again for this contribution!
This PR is idle because it has been open for 14 days with no activity.
Hi @bramdehart,
Just wanted to check in to see if you're still interested in merging this PR. For the integration test, you'd need to format your annotations like `testkey1:testvalue1,testkey2:testvalue2`.
Since this test checks the removal of annotations, it may be a bit tricky and require a change in the testing script itself to verify that, when no annotations are provided from the workflow, the deployment does indeed end up with none.
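For reference, the comma-separated format above parses naturally into a key/value map. A hedged sketch (the helper is illustrative, not the testing script's actual code):

```typescript
// Parse "k1:v1,k2:v2" into a map; assumes keys and values contain
// no commas or colons, which holds for the simple test annotations.
function parseAnnotations(input: string): Record<string, string> {
  const out: Record<string, string> = {};
  for (const pair of input.split(",")) {
    const [key, value] = pair.split(":");
    out[key.trim()] = (value ?? "").trim();
  }
  return out;
}

console.log(JSON.stringify(parseAnnotations("testkey1:testvalue1,testkey2:testvalue2")));
// {"testkey1":"testvalue1","testkey2":"testvalue2"}
```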
Additionally, some unit tests for this change would be great. I'd suggest adding them in a new file called `deploymentHelper.test.ts`, but that's up to you!
Thanks again for this contribution.
Hi @jaiveerk, thanks for the input. I'm still interested in merging it. I've been very busy the last few days but will take a look in the coming days.
This PR is idle because it has been open for 14 days with no activity.
@jaiveerk,
It has been a while, but I made time to add some more contributions, including integration tests.
Since the annotations on a resource include more than just the ones we set, for example `deployment.kubernetes.io/revision` and `kubectl.kubernetes.io/last-applied-configuration`, it was not sufficient to check that no annotations are present when we skip annotating the resources. Some standard annotations will always be there.
Also, testing the annotations in the integration test on key:value pairs in the format `testkey1:testvalue1,testkey2:testvalue2` is hard to realise, since annotation values, especially on child pods, can contain full dumps of objects. Objects with variable content are hard to test this way. That's why I chose to test on keys only, using the format `testkey1,testkey2,testkey3`.
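To illustrate why comparing keys only is more robust (the sample metadata below is made up, though the standard annotation names are real Kubernetes ones):

```typescript
// Example metadata: standard annotations carry variable values
// (revision numbers, full object dumps), so comparing values is brittle.
const annotations: Record<string, string> = {
  "deployment.kubernetes.io/revision": "3",
  "kubectl.kubernetes.io/last-applied-configuration": '{"spec":"...variable dump..."}',
  testkey1: "testvalue1",
};

// Comparing the sorted key list sidesteps the variable values entirely.
const keys = Object.keys(annotations).sort().join(",");
console.log(keys.includes("testkey1")); // true
```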
I also found a little bug in the `k8-deploy-delete.py` script. It tried to delete with the parameter `all` instead of `--all`, which failed because no pod named 'all' exists. The script also prepended `test-` to the namespace, but the GitHub workflow environments already add that prefix, so a namespace like `test-test-github_run_id` could not be found. These issues affected all the integration tests; I fixed them.
I hope these contributions are enough to get the PR merged.
Thank you!
This PR is idle because it has been open for 14 days with no activity.
@jaiveerk any updates on this? @OliverMKing FYI :)
Thanks for the approval @jaiveerk, one maintainer approval left to go :) However, one integration test is failing (Minikube Integration Tests - canary SMI), and I feel it is not due to my code changes. Is that possible?
@bramdehart hmm, this is strange; I agree that it's most likely not related to your code change. I'm not sure if this will fix it, but I noticed the other workflows run on `ubuntu-latest`, whereas the failing test runs on `ubuntu-20.04`.
Would you mind updating the canary SMI test to run on `ubuntu-latest` as well?
@jaiveerk nice find! That did the trick. I updated the ubuntu version and the test now succeeds.
This PR is idle because it has been open for 14 days with no activity.
@jaiveerk @OliverMKing we need one more approval from your colleagues. Could you request someone to take a look at this PR?
This PR is idle because it has been open for 14 days with no activity.
@bramdehart would you mind signing your commits? This guide should help out with that
Hi @jaiveerk, new signed commits have now been added.
@davidgamero Do you have an indication of when this will be released? :) I also see that v4.10 is not yet marked as the latest release, and that release dates from May.
@davidgamero @jaiveerk @OliverMKing I don't want to come across as annoying, but I need clarity about the expected release schedule. I have to say that for an open source project that contributors spend valuable time on, the follow-up on contributions and questions could be improved :) Thanks for letting me know!
I am sorry for the delay. We just merged the new release workflow (https://github.com/Azure/k8s-deploy/pull/297) and now need to bring the dependencies up to date to be able to cut a release with a successful build. The goal of the new workflow is to support nightly releases, which will prevent this from happening in the future.
We would like to run the deployment action without annotating the resources afterwards, so I made a change to pass this as an optional parameter.