Open prabhu opened 1 year ago
Hi @prabhu, do we need to compare against multi-language SBOM generation tools only, or should we also include language-specific SBOM generation tools?
Both.
I was shown yet another benchmark that merely ran cdxgen and a bunch of other tools with their default commands. The ironic part was that, with such a default invocation, only cdxgen tried to generate a "build" lifecycle SBOM, while the other tools settled for "pre-build". cdxgen must have encountered numerous errors, which it does not report unless the `CDXGEN_DEBUG_MODE=debug` environment variable is set, so these new users would have no idea what had happened.
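To make the failure mode above concrete, here is a minimal sketch of the difference between a default invocation and a debug invocation, assuming `cdxgen` is on the PATH and that `-o` is its usual output flag:

```shell
# Default run: errors (missing build tools, failed installs, etc.) are
# swallowed, and the resulting SBOM may be silently incomplete.
cdxgen -o bom.json .

# Debug run: the CDXGEN_DEBUG_MODE variable mentioned above surfaces the
# errors, so the user can see why components are missing.
CDXGEN_DEBUG_MODE=debug cdxgen -o bom.json .
```

Any benchmark that only runs the first form has no way of knowing whether the comparison was even valid.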
I honestly do not know what else to do, or what magic to add to the tool, other than to ask users to get in touch, or to actually try the tool and its arguments to get a grasp of what it does. Merely running tens of SBOM tools with default settings and producing an academic paper out of such incorrect data is of no real use to this project.
https://www2.cose.isu.edu/~minhazzibran/resources/MyPapers/SBOM_SAC24_Published.pdf
Here the authors took some npm projects with no package-lock.json file, such as this, ran an older version of cdxgen that does not automatically invoke `npm install`, and concluded that some unknown tool is better than cdxgen. There are zero matches for the word "lifecycle" in the paper, so it is possible they are not even comparing SBOMs generated for the same lifecycle, such as build or pre-build. The number of npm components shown in Table 2 is also quite low, which suggests they are only counting the dependencies referenced in package.json and not the full list, including transitive dependencies from a lock file!
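The lock-file point above can be demonstrated in two commands. This is a sketch, assuming a recent npm and a recent cdxgen; `--package-lock-only` is npm's standard way to resolve the dependency tree without populating node_modules:

```shell
# Without a lock file, package.json lists only direct dependencies.
# Generating one resolves the full transitive tree, which is what a
# "build" lifecycle SBOM should reflect.
npm install --package-lock-only

# Newer cdxgen versions perform this resolution automatically; the older
# version used in the paper did not, so it produced a shallower SBOM.
cdxgen -t js -o bom.json .
```

Comparing a tool that counts only direct dependencies against one that resolves transitive dependencies, without normalizing for this, measures nothing meaningful.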
The paper makes zero attempt to showcase the research profile, with the automatic occurrence and call-stack evidence that cdxgen generates. Nor did the authors run with `FETCH_LICENSE=true` to produce an SBOM suitable for compliance.
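For completeness, a sketch of the two modes mentioned above, assuming the `--profile` flag available in recent cdxgen versions:

```shell
# Research profile: enriches the SBOM with occurrence and call-stack
# evidence, which is what a tooling comparison ought to evaluate.
cdxgen --profile research -t js -o bom.research.json .

# Compliance use case: the FETCH_LICENSE variable from the docs makes
# cdxgen retrieve license data for the identified components.
FETCH_LICENSE=true cdxgen -t js -o bom.compliance.json .
```

A fair benchmark would compare tools per use case (evidence, compliance, lifecycle) rather than per default invocation.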
We need some docs on evaluating and benchmarking cdxgen against other tools.