DependencyTrack / dependency-track

Dependency-Track is an intelligent Component Analysis platform that allows organizations to identify and reduce risk in the software supply chain.
https://dependencytrack.org/
Apache License 2.0

BOM upload fails without feedback due to field max length #1665

Open ecaisse opened 2 years ago

ecaisse commented 2 years ago

A BOM file that contains a component whose "publisher" field is longer than 255 characters fails to import due to the database column constraint on that field. However, there is no feedback or any other way to know that the BOM upload failed, short of figuring it out from the log file.

Current Behavior:

When uploading a broken BOM file (see bom-broken.xml in bom-broken.zip), no components are loaded, and there is no way to see that the BOM processing failed outside of the log file. See the "Additional Details" section for the stacktrace.

I manually created the BOM file to include only the broken component; however, this is how https://github.com/CycloneDX/cyclonedx-dotnet generates the component for Hangfire.PostgreSql@1.9.6.

Steps to Reproduce:

To confirm that the issue is with the publisher field, simply truncate the field in the XML file and re-upload it to Dependency-Track. The component should then appear correctly in the Components tab.
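As a stopgap for the reproduction above, the offending field can also be truncated programmatically before upload. This is only a sketch of a workaround, not anything Dependency-Track provides; the namespace URI assumes a CycloneDX 1.4 XML BOM and would need adjusting for other spec versions:

```python
# Hypothetical pre-upload fixup: truncate overlong <publisher> values in a
# CycloneDX XML BOM so the upload no longer trips the 255-character column.
import xml.etree.ElementTree as ET

NS = "http://cyclonedx.org/schema/bom/1.4"  # assumption: spec version 1.4
MAX_LEN = 255  # column size reported in the stack trace below

def truncate_publishers(bom_xml: str, max_len: int = MAX_LEN) -> str:
    # Keep the default namespace on output instead of an ns0: prefix.
    ET.register_namespace("", NS)
    root = ET.fromstring(bom_xml)
    for pub in root.iter(f"{{{NS}}}publisher"):
        if pub.text and len(pub.text) > max_len:
            pub.text = pub.text[:max_len]
    return ET.tostring(root, encoding="unicode")
```

The same approach works for any other overlong text field, at the cost of silently losing whatever data sits past the cutoff.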

Expected Behavior:

I expect one of two behaviors:

  1. The BOM processing should succeed; limits on field lengths should be removed unless the CycloneDX specification explicitly requires them.
  2. There should be a way to see in the frontend that BOM processing failed, especially for manually uploaded BOMs.

I think option 1 is better than option 2. The restriction is a problem for automation, as there is no way to predict whether a component will break Dependency-Track. Since the specification does not restrict field lengths, Dependency-Track should not enforce arbitrary ones.

Environment:

Additional Details:

DT stacktrace

2022-05-27 16:37:25,861 [] ERROR [org.dependencytrack.tasks.BomUploadProcessingTask] Error while processing bom
javax.jdo.JDOFatalUserException: Attempt to store value "Frank Hommers and others (Burhan Irmikci (barhun), Zachary Sims(zsims), kgamecarter, Stafford Williams (staff0rd), briangweber, Viktor Svyatokha (ahydrax), Christopher Dresel (Dresel), Vytautas Kasparavičius (vytautask), Vincent Vrijburg, David Roth (davidroth)." in column "PUBLISHER" that has maximum length of 255. Please correct your data!
        at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:615)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:720)
        at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:740)
        at alpine.persistence.AbstractAlpineQueryManager.persist(AbstractAlpineQueryManager.java:417)
        at org.dependencytrack.persistence.ComponentQueryManager.createComponent(ComponentQueryManager.java:320)
        at org.dependencytrack.persistence.QueryManager.createComponent(QueryManager.java:379)
        at org.dependencytrack.tasks.BomUploadProcessingTask.processComponent(BomUploadProcessingTask.java:170)
        at org.dependencytrack.tasks.BomUploadProcessingTask.inform(BomUploadProcessingTask.java:124)
        at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:99)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
        at java.base/java.lang.Thread.run(Unknown Source)
Caused by: org.datanucleus.exceptions.NucleusUserException: Attempt to store value "Frank Hommers and others (Burhan Irmikci (barhun), Zachary Sims(zsims), kgamecarter, Stafford Williams (staff0rd), briangweber, Viktor Svyatokha (ahydrax), Christopher Dresel (Dresel), Vytautas Kasparavičius (vytautask), Vincent Vrijburg, David Roth (davidroth)." in column "PUBLISHER" that has maximum length of 255. Please correct your data!
        at org.datanucleus.store.rdbms.mapping.column.CharColumnMapping.setString(CharColumnMapping.java:253)
        at org.datanucleus.store.rdbms.mapping.java.SingleFieldMapping.setString(SingleFieldMapping.java:183)
        at org.datanucleus.store.rdbms.fieldmanager.ParameterSetter.storeStringField(ParameterSetter.java:158)
        at org.datanucleus.state.StateManagerImpl.providedStringField(StateManagerImpl.java:1853)
        at org.dependencytrack.model.Component.dnProvideField(Component.java)
        at org.dependencytrack.model.Component.dnProvideFields(Component.java)
        at org.datanucleus.state.StateManagerImpl.provideFields(StateManagerImpl.java:2528)
        at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:352)
        at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:162)
        at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:138)
        at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:4569)
        at org.datanucleus.state.StateManagerImpl.makePersistent(StateManagerImpl.java:4546)
        at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2026)
        at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1869)
        at org.datanucleus.ExecutionContextImpl.persistObject(ExecutionContextImpl.java:1724)
        at org.datanucleus.ExecutionContextThreadedImpl.persistObject(ExecutionContextThreadedImpl.java:219)
        at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:715)
        ... 10 common frames omitted
ringerl commented 1 year ago

Any updates on this one? - also the PURL field/column is affected:

2023-02-01 12:49:33,255 ERROR [BomUploadProcessingTask] Error while processing bom
javax.jdo.JDOFatalUserException: Attempt to store value "pkg:npm/%40types/testing-library__jest-dom@5.14.5?download_url=https%3A%2F%2Fartifactory.power.inet%3A443%2Fartifactory%2Fapi%2Fnpm%2Fnpm-viopt%2F%40types%2Ftesting-library__jest-dom%2F-%2Ftesting-library__jest-dom-5.14.5.tgz#types/testing-library__jest-dom" in column ""PURL"" that has maximum length of 255. Please correct your data!

nscuro commented 1 year ago

No progress so far.

If you're using the cyclonedx-node-npm module to generate your BOMs, it supports the --short-PURLs flag for exactly this purpose: https://github.com/CycloneDX/cyclonedx-node-npm#usage

esnible commented 1 year ago

Similar problem if the metadata.component.name field is long.

2023-06-12 18:10:25,423 ERROR [GlobalExceptionHandler] Uncaught internal server error
javax.jdo.JDOFatalUserException: Attempt to store value "GH_mydb_v11571_linuxamd64_image, GH_mydb_v1158, GH_oemtools_v1158, mydb/v1158, mydb_main_test, mydb_master, mydb_test, mydb_test2, mydb_v1158, oemtools/v1158, oemtools_master, oemtools_test2, oemtools_v1158, pipeline_mydb_scans, pipeline_oemtools_scans, sbom_scan, tracker_20261" in column ""NAME"" that has maximum length of 255. Please correct your data!
    at org.datanucleus.api.jdo.JDOAdapter.getJDOExceptionForNucleusException(JDOAdapter.java:678)
    at org.datanucleus.api.jdo.JDOPersistenceManager.jdoMakePersistent(JDOPersistenceManager.java:702)
    at org.datanucleus.api.jdo.JDOPersistenceManager.makePersistent(JDOPersistenceManager.java:722)
    at alpine.persistence.AbstractAlpineQueryManager.persist(AbstractAlpineQueryManager.java:427)
    at org.dependencytrack.persistence.ProjectQueryManager.createProject(ProjectQueryManager.java:431)
...
Caused by: org.datanucleus.exceptions.NucleusUserException: Attempt to store value "GH_mydb_v11571_linuxamd64_image, GH_mydb_v1158, GH_oemtools_v1158, mydb/v1158, mydb_main_test, mydb_master, mydb_test, mydb_test2, mydb_v1158, oemtools/v1158, oemtools_master, oemtools_test2, oemtools_v1158, pipeline_mydb_scans, pipeline_oemtools_scans, sbom_scan, tracker_20261" in column ""NAME"" that has maximum length of 255. Please correct your data!

Here is an SBOM that triggers the problem:

{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "version": 1,
  "metadata": {
    "tools": [
      {
        "vendor": "MyCompany",
        "name": "MyTool SBOM Generator",
        "version": "0.0.1"
      }
    ],
    "component": {
      "type": "application",
      "name": "GH_mydb_v11571_linuxamd64_image, GH_mydb_v1158, GH_oemtools_v1158, mydb/v1158, mydb_main_test, mydb_master, mydb_test, mydb_test2, mydb_v1158, oemtools/v1158, oemtools_master, oemtools_test2, oemtools_v1158, pipeline_mydb_scans, pipeline_oemtools_scans, sbom_scan, tracker_20261",
      "version": "0.0.0.0"
    }
  }
}

The D-T API returns a 500 when I attempt an upload using github.com/DependencyTrack/client-go. A 4xx would be more appropriate. The D-T UI didn't complain at all.
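For what it's worth, the metadata.component.name in the SBOM above is indeed well past the 255-character limit of the NAME column; a quick check (the string literal is copied verbatim from the SBOM):

```python
# The metadata.component.name from the SBOM above; Dependency-Track's
# NAME column is limited to 255 characters, so persisting this fails.
name = (
    "GH_mydb_v11571_linuxamd64_image, GH_mydb_v1158, GH_oemtools_v1158, "
    "mydb/v1158, mydb_main_test, mydb_master, mydb_test, mydb_test2, "
    "mydb_v1158, oemtools/v1158, oemtools_master, oemtools_test2, "
    "oemtools_v1158, pipeline_mydb_scans, pipeline_oemtools_scans, "
    "sbom_scan, tracker_20261"
)
print(len(name))  # longer than the 255-character column limit
```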

savek-cc commented 10 months ago

We just ran into this issue with a Node purl that includes repository names in the purl string. Is there an actual upper bound for the length of a purl? Crashing and importing only an incomplete SBOM doesn't seem like the best strategy. (There is no way for a product to notice that the BOM import was incomplete, because the failing database insert isn't handled gracefully; it just truncates the processed BOM.)

ataraxus commented 9 months ago

Yup, just crashed into this issue as well. PURL is too long...

sfmcgee commented 3 months ago

We have a customer-required RPM installed on our RHEL 8 hosts with a very long name (81 characters). The generated PURL in the SBOM is more than 255 characters (276 characters), so ingestion into Dependency-Track breaks for all of our systems. We would be happy with either truncating the PURL to 255 characters on ingest or allowing longer fields.

We have also now enabled alerts for BOM consumption and processing failures, so that we at least have visibility when an ingest fails.
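Until the server either relaxes the limits or rejects such uploads with a proper error, a client-side lint can catch these before upload. A minimal sketch; the field list (publisher, purl, name) and the 255-character limit are assumptions based purely on the columns reported in this thread:

```python
MAX_LEN = 255  # column size reported in the stack traces in this thread

def overlong_fields(bom: dict, max_len: int = MAX_LEN):
    """Return (json-path, length) pairs for values a 255-char column would reject.

    `bom` is a parsed CycloneDX JSON document (a plain dict).
    """
    findings = []
    meta_name = bom.get("metadata", {}).get("component", {}).get("name", "")
    if len(meta_name) > max_len:
        findings.append(("metadata.component.name", len(meta_name)))
    for i, comp in enumerate(bom.get("components", [])):
        for field in ("publisher", "purl", "name"):
            value = comp.get(field) or ""
            if len(value) > max_len:
                findings.append((f"components[{i}].{field}", len(value)))
    return findings
```

Run it in CI before POSTing the BOM and fail the pipeline (or truncate the offending fields) whenever it returns anything.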