conductor-oss / conductor

Conductor is an event-driven orchestration platform
https://conductor-oss.org
Apache License 2.0

Postgres indexing problem with task_type length #111

Closed. vinize closed this issue 6 months ago

vinize commented 7 months ago

Describe the bug
When changing the indexing backend from Elasticsearch to Postgres, an error occurs related to the length of the task name.

Caused by: com.netflix.conductor.core.exception.NonTransientException: ERROR: value too long for type character varying(32)
    at com.netflix.conductor.postgres.dao.PostgresBaseDAO.getWithRetriedTransactions(PostgresBaseDAO.java:148) ~[conductor-postgres-persistence.jar!/:?]
    at com.netflix.conductor.postgres.dao.PostgresBaseDAO.queryWithTransaction(PostgresBaseDAO.java:210) ~[conductor-postgres-persistence.jar!/:?]
    at com.netflix.conductor.postgres.dao.PostgresIndexDAO.indexTask(PostgresIndexDAO.java:151) ~[conductor-postgres-persistence.jar!/:?]
    at com.netflix.conductor.core.dal.ExecutionDAOFacade.updateTask(ExecutionDAOFacade.java:517) ~[conductor-core.jar!/:?]
    ... 18 more

It turns out that the problem is the length of the task_type field. According to the database schema it can hold at most 32 characters, but for SIMPLE tasks the task_name value is written into task_type instead of the task type itself.
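
For illustration only, a minimal sketch of the mismatch: the 32-character limit comes from the error message above, and the 46-character task name from the workflow definition below is what ends up in task_type for a SIMPLE task.

public class TaskTypeLengthCheck {
    // 32 is the column width reported by the Postgres error above
    private static final int TASK_TYPE_COLUMN_WIDTH = 32;

    public static void main(String[] args) {
        // For SIMPLE tasks the task name is indexed as task_type instead of the literal "SIMPLE"
        String taskName = "TaskWithLongNameInTestWorkflowWithLongTaskName";
        String taskType = "SIMPLE";

        System.out.printf("task name: %d chars, fits varchar(32): %b%n",
                taskName.length(), taskName.length() <= TASK_TYPE_COLUMN_WIDTH);
        System.out.printf("task type: %d chars, fits varchar(32): %b%n",
                taskType.length(), taskType.length() <= TASK_TYPE_COLUMN_WIDTH);
    }
}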

Details
Conductor version: 3.18
Persistence implementation: Postgres
Queue implementation: Redis
Lock: Redis
Workflow definition:

{
  "accessPolicy": {},
  "name": "TestWorkflowWithLongTaskName",
  "description": "Test Workflow to Index long task name in postgres",
  "version": 1,
  "tasks": [
    {
      "name": "TaskWithLongNameInTestWorkflowWithLongTaskName",
      "taskReferenceName": "TaskWithLongNameInTestWorkflowWithLongTaskName",
      "inputParameters": {},
      "type": "SIMPLE",
      "startDelay": 0,
      "optional": false,
      "asyncComplete": false,
      "permissive": false
    }
  ],
  "inputParameters": [],
  "outputParameters": {},
  "schemaVersion": 2,
  "restartable": true,
  "workflowStatusListenerEnabled": false,
  "ownerEmail": "example@email.com",
  "timeoutPolicy": "ALERT_ONLY",
  "timeoutSeconds": 0,
  "variables": {},
  "inputTemplate": {}
}

To Reproduce
Steps to reproduce the behavior (a sketch of these steps in code follows the list):

  1. Use the Postgres index module
  2. Create a workflow containing a task whose name is longer than 32 characters
  3. Run the workflow and terminate it (no worker is polling, so it never completes)
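
A rough sketch of those steps against a local Conductor server, assuming the usual Conductor REST API is exposed at http://localhost:8080/api (POST /metadata/workflow to register, POST /workflow/{name} to start, DELETE /workflow/{id} to terminate) and that the workflow definition above is saved locally as workflow.json; adjust for your deployment. The "value too long for type character varying(32)" error then shows up in the server logs when the task is indexed.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReproduceLongTaskName {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String base = "http://localhost:8080/api"; // assumption: local Conductor server

        // Register the workflow definition shown above (saved locally as workflow.json)
        String workflowJson = Files.readString(Path.of("workflow.json"));
        HttpRequest register = HttpRequest.newBuilder()
                .uri(URI.create(base + "/metadata/workflow"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(workflowJson))
                .build();
        System.out.println("register: "
                + client.send(register, HttpResponse.BodyHandlers.ofString()).statusCode());

        // Start an execution; with no worker polling, the SIMPLE task just stays scheduled
        HttpRequest start = HttpRequest.newBuilder()
                .uri(URI.create(base + "/workflow/TestWorkflowWithLongTaskName"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{}"))
                .build();
        String workflowId = client.send(start, HttpResponse.BodyHandlers.ofString()).body();
        System.out.println("workflowId: " + workflowId);

        // Terminate the workflow; indexing its task triggers the error in the server logs
        HttpRequest terminate = HttpRequest.newBuilder()
                .uri(URI.create(base + "/workflow/" + workflowId + "?reason=no+worker"))
                .DELETE()
                .build();
        System.out.println("terminate: "
                + client.send(terminate, HttpResponse.BodyHandlers.ofString()).statusCode());
    }
}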

Expected behavior
For SIMPLE tasks the value 'SIMPLE' should be inserted into the task_type column, or the task_type column should be extended to length 255 (like task_name).
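
A sketch of the second option as a one-off migration, assuming the index table is named task_index, a local Postgres instance with database/user/password all set to "conductor", and the PostgreSQL JDBC driver on the classpath; verify the table and column names against your schema, since the official fix may ship as a proper migration script.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class WidenTaskTypeColumn {
    public static void main(String[] args) throws Exception {
        // Assumption: local Postgres with the Conductor index schema
        String url = "jdbc:postgresql://localhost:5432/conductor";
        try (Connection conn = DriverManager.getConnection(url, "conductor", "conductor");
             Statement stmt = conn.createStatement()) {
            // Widen task_type to 255 characters so it matches task_name
            stmt.execute("ALTER TABLE task_index ALTER COLUMN task_type TYPE varchar(255)");
            System.out.println("task_type widened to varchar(255)");
        }
    }
}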

Additional context
Bug initially reported at: https://github.com/Netflix/conductor-community/issues/252

Prebiusta commented 7 months ago

I have faced exactly the same issue and had to revert to ES indexing.

bjpirt commented 7 months ago

OK - should be pretty straightforward to change the column length in the database schema for this

Prebiusta commented 6 months ago

I think this issue can be closed

vinize commented 6 months ago

looks good in version 3.19.0

Robban1980 commented 5 months ago

@vinize was the code updated to insert the type instead of the name in the task_type column? I only see a PR for making the field length longer.

vinize commented 5 months ago

In version 3.19 only the column size was increased, but that change is enough to let you run a workflow with a long task name.
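
If you want to confirm what your upgraded schema actually contains, a quick sketch that reads the column width from information_schema (same connection assumptions as the migration sketch above, and task_index is again an assumed table name):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class CheckTaskTypeWidth {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/conductor"; // assumption: local Conductor DB
        String sql = "SELECT character_maximum_length FROM information_schema.columns "
                + "WHERE table_name = 'task_index' AND column_name = 'task_type'";
        try (Connection conn = DriverManager.getConnection(url, "conductor", "conductor");
             PreparedStatement ps = conn.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            if (rs.next()) {
                System.out.println("task_type max length: " + rs.getInt(1));
            } else {
                System.out.println("task_index.task_type not found");
            }
        }
    }
}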