rseeton opened 8 months ago
Hi @rseeton, thank you for reporting this issue. We would greatly appreciate a more detailed understanding of the underlying causes of these issues. Could you please explain how you are using our SARIF output in GitHub? What is the primary use case? Which technology are you scanning? If it is possible for you to attach the SARIF file, it would also assist us in conducting a more thorough investigation.
Thank you very much for your cooperation. We will address this matter as soon as possible and keep you updated.
Hello @omerzi ,
Our organization uses a self-hosted GitHub Enterprise service for source control/CI/CD processing and a self-hosted JFrog Enterprise solution for Security (XRay) & Binary package management.
GitHub's Advanced Security solution (GHAS) provides SAST analysis (CodeQL) and allows users to upload third-party Static Code Analysis results using SARIF files (https://docs.github.com/en/code-security/code-scanning/integrating-with-code-scanning/sarif-support-for-code-scanning).
We are looking to load XRay SCA results into the GHAS solution using the SARIF output to provide a centralized reporting/review interface.
Our products use a variety of languages, including Java, C++ & Rust.
I will review the SARIF file and will attach a sanitized version later today.
Let me know if more details are required.
Thanks for your attention.
@omerzi - SARIF file as requested jfrog_scan_artifact-5.3.0-2.sarif.gz
@omerzi @rseeton
I am also seeing the missing-location issue. I reverted to version 2.48.0; it appears the location info was removed in 2.49.0, causing a regression, possibly introduced by this change: https://github.com/jfrog/jfrog-cli/issues/2135
"locations": [
{
"physicalLocation": {
"artifactLocation": {
**"uri": " Package Descriptor"**
}
}
}
]
Unfortunately, attempts to run the older JFrog CLI 2.48.0 are failing:
jf version 2.48.0
15:11:46 [Info] JFrog Xray version is: 3.65.2
15:11:46 [Info] JFrog Xray Indexer 3.65.2 is not cached locally. Downloading it now...
Error: [Error] failed while attempting to download Xray indexer: failed while attempting to download 'https://artifactory.ssnc.dev/xray/api/v1/indexer-resources/download/linux/386': server response: 400 Bad Request
Error: Process completed with exit code 1.
After follow-up with GitHub, their requirements are laid out here:
`locations[]`: The set of locations where the result was detected, up to a maximum of 10. Only one location should be included unless the problem can only be corrected by making a change at every specified location. Note: At least one location is required for code scanning to display a result. Code scanning will use this property to decide which file to annotate with the result. Only the first value of this array is used. All other values are ignored.
(Which makes sense as we will need to know where the offending artifact is to determine how to resolve the problem).
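To make GitHub's rule concrete, here is a minimal sketch (in Python, for illustration) of a result entry that satisfies it. The rule ID and jar path are taken loosely from the scan output quoted in this thread and are illustrative, not a definitive format:

```python
import json

# Minimal sketch of a SARIF result that satisfies GitHub's ingestion rule:
# every result needs at least one location whose
# physicalLocation.artifactLocation.uri points at a real file.
result = {
    "ruleId": "CVE-2023-43642_org.xerial.snappy:snappy-java_1.1.10.1",
    "level": "error",
    "message": {"text": "[CVE-2023-43642] org.xerial.snappy:snappy-java 1.1.10.1"},
    "locations": [
        {
            "physicalLocation": {
                "artifactLocation": {
                    # A relative, space-free URI; code scanning uses only the
                    # first entry of "locations" to pick the file to annotate.
                    "uri": "third-party-libs/snappy-java-1.1.10.1.jar"
                }
            }
        }
    ],
}
print(json.dumps(result, indent=2))
```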
Hi @rseeton ,
Thank you for bringing this issue to our attention. You can track the progress of the fix by following this link: https://github.com/jfrog/jfrog-cli-core/pull/1021.
We appreciate your cooperation.
Hello @attiasas,
I see that the artifactLocation has been added to the 2.51.1 package, producing the following in each results element:
"locations": [
{
"physicalLocation": {
"artifactLocation": {
"uri": "file://Package Descriptor"
}
}
}
]
The file does pass the SARIF Validation, including the GitHub Ingestion rules, which is good. However, none of these entries are populated, resulting in the default "Package Descriptor" entry. There are two issues with this:
Error: Code Scanning could not process the submitted SARIF file:
an invalid URI was provided as a SARIF location: parse "file://Package Descriptor": invalid character " " in host name
(It looks like the 'space' character in Package Descriptor is the issue; replacing it with Package_Descriptor resolves the URI validation error.)
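The failure makes sense when you consider how the URI is parsed: in "file://Package Descriptor", everything between "//" and the next "/" is treated as a host name, and a space is not a valid host-name character. A hypothetical sketch of the space-replacement workaround discussed above (the helper name is mine, not from any JFrog tool):

```python
# Hypothetical sketch: replace spaces after the scheme so the URI's
# authority/path components parse cleanly, mirroring the underscore
# workaround described above.
def sanitize_location_uri(uri: str) -> str:
    scheme, sep, rest = uri.partition("://")
    if not sep:
        # No scheme present; leave the value untouched.
        return uri
    return scheme + sep + rest.replace(" ", "_")

print(sanitize_location_uri("file://Package Descriptor"))
# file://Package_Descriptor
```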
Hi @rseeton,
Thank you for your feedback!
Regarding the issue you raised:
For some technologies, this feature (connecting a file to the issue) is not yet supported. In such cases, the default value (Package Descriptor) is shown.
I believe that in your case you are referring to a result of a Docker scan, am I right? We currently do not support it, so please feel free to open a feature request, and we will address it when prioritized.
I will modify the default value to replace the space character, as suggested. You can follow https://github.com/jfrog/jfrog-cli-core/pull/1028/files for the fix.
@attiasas - Thanks for looking into the URI formatting issue.
For the general "Missing path information" problem, these scans are being run against the tar file that we ship for on-premise installations. (As opposed to a tarball of a docker image as discussed here: https://jfrog.com/help/r/jfrog-cli/scanning-image-tarballs-on-the-local-file-system - these also don't provide path details).
I've attached a test tar file with a known issue (CVE-2023-43642) for testing.
Using jf 2.51.1, we get the following 'results' section in the SARIF file:
"results": [ { "ruleId": "CVE-2023-43642_org.xerial.snappy:snappy-java_1.1.10.1", "ruleIndex": 0, "level": "error", "message": { "text": "[CVE-2023-43642] org.xerial.snappy:snappy-java 1.1.10.1" }, "locations": [ { "physicalLocation": { "artifactLocation": { "uri": "file://Package Descriptor" } } } ] } ]
For comparison, using an open-source solution against the same file, we get a populated location URI:
"results": [ { "ruleId": "CVE-2023-43642", "level": "warning", "message": { "text": "CVE-2023-43642 - snappy-java is a Java port of the snappy, a fast C++ compresser/decompresser developed by Google. The SnappyInputStream was found to be vulnerable to Denial of Service (DoS) attacks when decompressing data with a too large chunk size. Due to missing upper bound check on chunk length, an unrecoverable fatal error can occur. All versions of snappy-java including the latest released version 1.1.10.3 are vulnerable to this issue. A fix has been introduced in commit `9f8c3cf74` which will be included in the 1.1.10.4 release. Users are advised to upgrade. Users unable to upgrade should only accept compressed data from trusted sources." }, "partialFingerprints": { "vulnerabilityHash": "f13459376482773b4656c86e512a17a4" }, "locations": [ { "physicalLocation": { "artifactLocation": { "uri": "file:///mnts/homes/rseeton/sarif_scan_test.tar/rhel-x86_64-8.5/third-party-libs/snappy-java-1.1.10.1.jar", "index": 0 } }, "logicalLocations": [ { "fullyQualifiedName": "pkg:maven/org.xerial.snappy/snappy-java@1.1.10.1" } ] } ] } ]
Describe the bug
GitHub Advanced Security requires a 'physicalLocation.artifactLocation.uri' value. GHAS will fail to load the SARIF results if this data is not provided.
The SARIF file from jfrog-cli 2.50.2 scans is missing this value (previous releases have loaded, but I don't have any of them available to confirm).
Each result location must provide the property 'physicalLocation.artifactLocation.uri'. GitHub Advanced Security code scanning will not display a result whose location does not provide the URI of the artifact that contains the result.
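The ingestion rule above can be pre-checked before uploading. A hypothetical pre-upload check (not part of any official tool) that flags results missing `physicalLocation.artifactLocation.uri` might look like this:

```python
import json

def missing_location_results(sarif_text: str) -> list:
    """Return indices of results lacking physicalLocation.artifactLocation.uri.

    A sketch mirroring the GHAS ingestion rule quoted above; assumes a
    standard SARIF 2.1.0 document with top-level "runs".
    """
    sarif = json.loads(sarif_text)
    bad = []
    for run in sarif.get("runs", []):
        for i, result in enumerate(run.get("results", [])):
            uris = [
                loc.get("physicalLocation", {})
                   .get("artifactLocation", {})
                   .get("uri")
                for loc in result.get("locations", [])
            ]
            # Flag the result if no location carries a URI at all.
            if not any(uris):
                bad.append(i)
    return bad

sample = '{"runs": [{"results": [{"ruleId": "X", "locations": []}]}]}'
print(missing_location_results(sample))  # [0]
```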
Current behavior
SARIF file generated by jfrog-cli 2.50.2 fails the validation using the GHAS ingestion rules ( https://sarifweb.azurewebsites.net/Validation )
Reproduction steps
Generate SARIF file from 2.50.2 Run through the https://sarifweb.azurewebsites.net/Validation page with the GitHub ingestion rules enabled
Expected behavior
Clean results from SARIF Web validation
JFrog CLI version
jf version 2.50.2
Operating system type and version
CentOS 8
JFrog Artifactory version
Enterprise Plus 7.55.10 rev 75510900
JFrog Xray version
{"xray_version":"3.65.2","xray_revision":"bca527a"}