Current Behavior

I do something like this:

1. clone project via API (includeAuditHistory=true, includeComponents=true)
2. check token (/events...) until processing=false
3. upload BOM
4. check token (/events...) until processing=false
5. check results
6. delete project
Quite often, the upload of the BOM fails with an error like:
Insert of object "org.dependencytrack.model.ProjectMetadata@6cb89346" using statement "INSERT INTO "PROJECT_METADATA" ("AUTHORS","PROJECT_ID","SUPPLIER") VALUES (?,?,?)" failed : ERROR: duplicate key value violates unique constraint "PROJECT_METADATA_PROJECT_ID_IDX"
Detail: Key ("PROJECT_ID")=(215) already exists.
Stacktrace
```
javax.jdo.JDODataStoreException: Insert of object "org.dependencytrack.model.ProjectMetadata@27963218" using statement "INSERT INTO "PROJECT_METADATA" ("AUTHORS","PROJECT_ID","SUPPLIER") VALUES (?,?,?)" failed : ERROR: duplicate key value violates unique constraint "PROJECT_METADATA_PROJECT_ID_IDX"
Detail: Key ("PROJECT_ID")=(269) already exists.
at org.datanucleus.api.jdo.JDOAdapter.getJDOExceptionForNucleusException(JDOAdapter.java:605)
at org.datanucleus.api.jdo.JDOPersistenceManager.flush(JDOPersistenceManager.java:2057)
at org.dependencytrack.tasks.BomUploadProcessingTaskV2.processProject(BomUploadProcessingTaskV2.java:369)
at org.dependencytrack.tasks.BomUploadProcessingTaskV2.lambda$processBom$0(BomUploadProcessingTaskV2.java:297)
at org.dependencytrack.persistence.QueryManager.lambda$runInTransaction$0(QueryManager.java:1433)
at org.dependencytrack.persistence.QueryManager.runInTransaction(QueryManager.java:1464)
at org.dependencytrack.persistence.QueryManager.runInTransaction(QueryManager.java:1432)
at org.dependencytrack.tasks.BomUploadProcessingTaskV2.processBom(BomUploadProcessingTaskV2.java:296)
at org.dependencytrack.tasks.BomUploadProcessingTaskV2.processEvent(BomUploadProcessingTaskV2.java:187)
at org.dependencytrack.tasks.BomUploadProcessingTaskV2.inform(BomUploadProcessingTaskV2.java:162)
at alpine.event.framework.BaseEventService.lambda$publish$0(BaseEventService.java:110)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.base/java.lang.Thread.run(Unknown Source)
Caused by: org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "PROJECT_METADATA_PROJECT_ID_IDX"
Detail: Key ("PROJECT_ID")=(269) already exists.
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2725)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2412)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:371)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:502)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:419)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:194)
at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:155)
at com.zaxxer.hikari.pool.ProxyPreparedStatement.executeUpdate(ProxyPreparedStatement.java:61)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeUpdate(HikariProxyPreparedStatement.java)
at org.datanucleus.store.rdbms.SQLController.doExecuteStatementUpdate(SQLController.java:463)
at org.datanucleus.store.rdbms.SQLController.executeStatementUpdateDeferRowCountCheckForBatching(SQLController.java:413)
at org.datanucleus.store.rdbms.request.InsertRequest.execute(InsertRequest.java:532)
at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObjectInTable(RDBMSPersistenceHandler.java:235)
at org.datanucleus.store.rdbms.RDBMSPersistenceHandler.insertObject(RDBMSPersistenceHandler.java:211)
at org.datanucleus.state.StateManagerImpl.internalMakePersistent(StateManagerImpl.java:4614)
at org.datanucleus.state.StateManagerImpl.flush(StateManagerImpl.java:5848)
at org.datanucleus.flush.FlushOrdered.execute(FlushOrdered.java:96)
at org.datanucleus.ExecutionContextImpl.flushInternal(ExecutionContextImpl.java:4050)
at org.datanucleus.ExecutionContextImpl.flush(ExecutionContextImpl.java:3996)
at org.datanucleus.api.jdo.JDOPersistenceManager.flush(JDOPersistenceManager.java:2040)
... 12 common frames omitted
```
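Until the root cause is found, one possible client-side stop-gap for an intermittent failure like this is to wrap the upload-and-poll step in a retry with backoff. This is a generic sketch (the function name, backoff values, and the `op` callable are made up, not part of Dependency-Track), not a fix for whatever race is happening server-side:

```python
import time


def with_retries(op, attempts: int = 3, base_delay: float = 2.0, sleep=time.sleep):
    """Run `op` up to `attempts` times, sleeping base_delay * 2**n between tries.

    `sleep` is injectable so the backoff schedule can be tested without waiting.
    """
    last = None
    for attempt in range(attempts):
        try:
            return op()
        except Exception as exc:   # in practice: whatever signals the failed upload
            last = exc
            if attempt < attempts - 1:
                sleep(base_delay * (2 ** attempt))
    raise last


# usage (hypothetical): with_retries(lambda: upload_bom_and_wait(pr_uuid, bom))
```

Since the clone is deleted at the end anyway, re-running the whole clone/upload sequence on failure is cheap.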
This is logged by `BomUploadProcessingTaskV2`. The error message is somewhat similar to https://github.com/DependencyTrack/dependency-track/issues/3324.

It also seems to depend on the project: some projects work without issues while others fail most of the time, and I cannot really spot a difference between them (apart from their components/suppressions).

You may wonder why I clone the project just to delete it right after :) I'm working on integrating Dependency-Track analysis into our pipelines. For each PR I want to analyse policy violations and vulnerabilities on the basis of the audit history of the "main" version. I think there was once an issue for this feature, but I cannot find it anymore.

Steps to Reproduce

1. Clone a project via the API (includeAuditHistory=true, includeComponents=true)
2. Check the token (/events...) until processing=false
3. Upload a BOM to the clone
4. Check the token (/events...) until processing=false
5. Check the results
6. Delete the cloned project
Expected Behavior
No error
Dependency-Track Version
4.11.4
Dependency-Track Distribution
Container Image
Database Server
PostgreSQL
Database Server Version
15.7
Browser
Google Chrome