avivek opened this issue 3 years ago
Can you provide more info?
Are you talking about 2 or 3?
Are you using git or GitHub authentication for the shared library?
Related PR which fixed it for most users: https://github.com/jenkinsci/github-checks-plugin/pull/70
Specifics, especially your configuration-as-code, would be helpful for getting a fix; this likely belongs in github-checks-plugin.
So we have a global, implicitly loaded library, after which the pipeline code checks out as follows:
```groovy
stage('SCM') {
    dir('xxxx') {
        // Use double quotes so $REPO_URL is interpolated; single quotes would pass the literal string
        git branch: 'AMCC-287-Test', credentialsId: 'JenkinsGithubApp', url: "$REPO_URL"
    }
}
stage('Docker-Image-Scan') {
    publishChecks conclusion: 'NONE', detailsURL: "${JOB_URL}", name: 'Build',
        status: 'IN_PROGRESS', summary: 'Building Docker image', title: 'Build'
    // do some work here
    publishChecks detailsURL: "${JOB_URL}", name: 'Build',
        summary: 'Building Docker image', title: 'Build'
}
```
We are using the github app for the credentials - both the shared library and this repo. There are also cases when we might use another repo in the same pipeline.
We are using githubNotify right now and were exploring the Checks API.
The question is: since the context and credentials are picked implicitly, what logic is used?
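If the checks-api plugin's withChecks step fits this case, the explicit in-progress publish might not be needed; a sketch, assuming the check name 'Build' from the snippet above:

```groovy
stage('Docker-Image-Scan') {
    // withChecks publishes an in-progress check named 'Build' and sets the
    // check-name context for any publishers running inside the block
    withChecks('Build') {
        // do some work here
        publishChecks detailsURL: "${JOB_URL}", name: 'Build',
            summary: 'Building Docker image', title: 'Build'
    }
}
```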
I feel that finding the correct repository and commit is among the most difficult parts of implementing a checks publisher plugin. In multibranch pipeline projects, I think SCMSource.SourceByItem.findSource(run.getParent()) and SCMRevisionAction.getRevision(scmSource, run) are generally the right choice, but maybe not if submodules are involved. It might make sense to move all that logic into checks-api-plugin, and then have an SCM selection override as part of withChecks, so that it could be used without tying the job or pipeline to a specific publisher. There is a similar feature in https://github.com/jenkinsci/forensics-api-plugin/pull/246.
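For illustration, the lookup described above might be sketched like this (script-console-style Groovy against the scm-api classes; the helper name is mine):

```groovy
import jenkins.scm.api.SCMRevision
import jenkins.scm.api.SCMRevisionAction
import jenkins.scm.api.SCMSource

// Sketch: resolve the revision a checks publisher should report against.
// Works for multibranch pipeline jobs, where each branch job has a source.
SCMRevision resolveRevision(hudson.model.Run run) {
    SCMSource source = SCMSource.SourceByItem.findSource(run.getParent())
    if (source == null) {
        // e.g. a plain (non-multibranch) Pipeline job: no branch source to consult
        return null
    }
    return SCMRevisionAction.getRevision(source, run)
}
```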
Yes, while a default behaviour is good in most cases, having the ability to override would be really powerful. But even then: the pipeline has a default check which carries information about what is happening in the pipeline stages to GitHub. Where do you think that will go, and how will we override that information?
The default check should be correct at least with multibranch pipelines; the PR I linked above fixed that in all cases I know of.
Pipelines doing custom notifications like yours above likely won't work with this, though the suggestion above does look good.
So the default check was going to the wrong repository, i.e. the shared library in my case. Is that something that is fixed, and if I am still having the issue, can I reopen the bug on the GitHub checks end?
It was fixed for multibranch pipelines but are you using the non-multibranch "Pipeline" project type instead?
Yes, I am using a regular pipeline, NOT a multibranch one.
I see. Making that work will require changes in github-checks-plugin then, and preferably in checks-api-plugin as well.
Thanks a lot for quick responses. We will watch this ticket.
Can SCM.getModuleRoots(FilePath workspace, AbstractBuild build) be used for checking whether the SCM checked out something to the workspace or was only referenced for a pipeline library?

> SCM.getModuleRoots(FilePath workspace, AbstractBuild build)

I don't think so; anything with AbstractBuild is normally only for freestyle builds.
The global pipeline library checkout apparently happens by SCMSourceRetriever.doRetrieve calling SCMStep.checkout, which in turn calls SCM.checkout. I don't see any obvious way to mark the SCM as not to be used by Checks API. But if the withChecks or publishChecks step in the pipeline could specify a string that Checks API would compare to SCM.getKey(), that should work. Could then have another parameter for the credentials ID as requested in https://github.com/jenkinsci/checks-api-plugin/issues/103.
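In pipeline terms, the idea above might look like this; the scmKey parameter is hypothetical and does not exist today:

```groovy
// Hypothetical parameter (not implemented): tell Checks API which SCM the
// check belongs to by matching this string against SCM.getKey().
// The URL and check name here are placeholders.
withChecks(name: 'Build', scmKey: 'git https://github.com/example/repo.git') {
    publishChecks name: 'Build', summary: 'Building Docker image'
}
```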
Need to check how the checks publisher plugins compute the API endpoint URL from the SCM.
I have a pipeline that runs a tool on several hundred files in the same Git repository as the pipeline itself, but also on six extra files that it imports from four other Git repositories. To import a file, it makes an HTTP request with credentials and saves the response.
Now, though, if the tool reports warnings on any of the imported files, I'd like to publish the warnings to the repository from which that file came. Checks API does not currently support that, as ChecksPublisherFactory (source) creates a ChecksPublisher instance for a whole Job or Run, not for an individual SCM (javadoc) and SCMRevision (javadoc). Even if such a feature were added to Checks API, it would not work with my pipeline, because Jenkins has no idea about the repositories from which the pipeline imports the files.
I tried changing the pipeline to use Jenkins checkout steps to shallow clone the large repositories and make sparse checkouts. This would create a WorkflowRun.SCMCheckout (source) instance for each repository, and the SCM instances would then be available to Checks API. However, because the other repositories also contain approximately 12000 files that the pipeline doesn't need, the checkout steps would cost 57 seconds, while the HTTP downloads cost only 13 seconds. Partial clone might speed that up, but JENKINS-64844 says it is not going to be implemented; it recommends instead running git commands, which would not produce SCM instances for Checks API.
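A checkout step along those lines might look like this (repository URL, credentials ID, branch, and paths are placeholders):

```groovy
checkout([$class: 'GitSCM',
    branches: [[name: 'refs/heads/main']],
    userRemoteConfigs: [[url: 'https://github.com/example/other-repo.git',
                         credentialsId: 'example-creds']],
    extensions: [
        // shallow clone: fetch only the tip commit, skip tags
        [$class: 'CloneOption', shallow: true, depth: 1, noTags: true],
        // sparse checkout: materialize only the files the pipeline needs
        [$class: 'SparseCheckoutPaths',
         sparseCheckoutPaths: [[path: 'needed/file.txt']]]
    ]])
```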
So, if a multi-repository feature is ever added to Checks API, then I wonder if it would also make sense to implement one of:

- a GitSCMExtension that prevents the checkout step from fetching any commits from the remote, but still allows a WorkflowRun.SCMCheckout instance to be created;
- a checkout variant that uses SCMFileSystem (source) instead of fetching commits;
- a way for the pipeline to declare the origins of files without SCM instances, something like SARIF-v2.1.0 §3.14.13, but with credential IDs and information about the type of the version control software. This would be usable even if the pipeline copies individual files to a directory not in the SCM checkout.
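The last option might be declared in the pipeline roughly like this; the provenance parameter is purely imagined (modeled on SARIF's versionControlProvenance) and all values are placeholders:

```groovy
// Hypothetical, not an existing API: declare where imported files came from,
// so a checks publisher could route warnings to the right repository.
publishChecks name: 'Lint',
    provenance: [   // imagined parameter
        [file: 'imports/schema.xsd',
         repositoryUri: 'https://github.com/example/other-repo.git',
         revisionId: 'abc1234',
         credentialsId: 'example-creds']
    ]
```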