
COPPER - a high performance Java workflow engine
http://www.copper-engine.org/
Apache License 2.0

Duplicate Execution of Workflow Parts in Persistent Workflows when using distributed Copper Engines #118

Closed: dareckf closed this issue 2 months ago

dareckf commented 3 months ago

Description: We run the Copper Engine in a cloud environment and use MariaDB (DBaaS) for persistent workflows. Our workflows set savepoints. To meet disaster recovery requirements, we now run our pods in parallel in a second cloud location, against the same database. Since then, we observe that parts of a started workflow are executed twice. This contradicts the "High Availability / Load Distribution" section of the documentation, which suggests that such duplicate execution should not occur.

Environment:

- Copper Engine version: 5.4.2
- Database: MariaDB (DBaaS)
- Cloud environment: private cloud based on Kubernetes

Possible workaround: We have found a workaround that appears to work based on our tests: we assign each instance its own processor pool ID and configure every instance to process only its own pool.

Unfortunately, I cannot share the specific code as it is not public, but the configuration looks roughly like the sketch below.
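A simplified sketch, not our production code: class and constructor signatures are from memory and may differ slightly, the workflow class name and the COPPER_POOL_ID variable are placeholders, and the transaction controller plus the rest of the engine wiring (storage, repository, id factory) are left out.

```java
import java.util.Collections;

import org.copperengine.core.WorkflowInstanceDescr;
import org.copperengine.core.common.DefaultProcessorPoolManager;
import org.copperengine.core.persistent.PersistentPriorityProcessorPool;
import org.copperengine.core.persistent.PersistentProcessorPool;
import org.copperengine.core.persistent.PersistentScottyEngine;
import org.copperengine.core.persistent.txn.TransactionController;

public class PerSitePoolConfig {

    // Each cloud location gets its own pool id, e.g. "SITE_A" / "SITE_B",
    // injected via the pod spec (COPPER_POOL_ID is just an example name).
    static final String POOL_ID = System.getenv("COPPER_POOL_ID");

    static PersistentScottyEngine configureEngine(TransactionController txnController) {
        // One processor pool per instance, named after this instance's pool id.
        PersistentPriorityProcessorPool pool =
                new PersistentPriorityProcessorPool(POOL_ID, txnController);

        DefaultProcessorPoolManager<PersistentProcessorPool> poolManager =
                new DefaultProcessorPoolManager<>();
        poolManager.setProcessorPools(Collections.singletonList(pool));

        PersistentScottyEngine engine = new PersistentScottyEngine();
        engine.setProcessorPoolManager(poolManager);
        // ... dbStorage, wfRepository, idFactory, dependencyInjector as usual ...
        return engine;
    }

    static void startWorkflow(PersistentScottyEngine engine, String payload) throws Exception {
        // New instances are queued for this instance's own pool, so the
        // engine running in the other location never picks them up.
        WorkflowInstanceDescr<String> descr = new WorkflowInstanceDescr<>("MyWorkflowClass");
        descr.setProcessorPoolId(POOL_ID);
        descr.setData(payload);
        engine.run(descr);
    }
}
```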

Is it supported to use the same database instance across different containers for persistent workflows?

Keymaster65 commented 3 months ago

Which EngineIdProvider do you use in your PersistentScottyEngine? Using a different engineId in each container should help in this context.
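For example, something like this (just a sketch; deriving the id from HOSTNAME, which Kubernetes sets to the pod name by default, is only one way to obtain a per-container unique value):

```java
import org.copperengine.core.EngineIdProviderBean;
import org.copperengine.core.persistent.PersistentScottyEngine;

// Give every container its own engine id, derived here from the pod name.
PersistentScottyEngine engine = new PersistentScottyEngine();
engine.setEngineIdProvider(new EngineIdProviderBean(System.getenv("HOSTNAME")));
```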