Current State
Right now, if a demo needs to be analyzed (for example, after being uploaded), an entry is added to a queue inside the process itself. This works for the moment, but the queue does not persist across restarts and does not allow for any scaling or for separating the API process from the analysis process.
Proposal
For now we could use a very basic table in PostgreSQL for handling tasks. There are several examples online of using Postgres as a task queue, which is exactly what we want. This has the added benefit of allowing restarts without losing the queue, separating the API from the analysis, and in general decoupling the different parts.
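A minimal sketch of what such a table and a dequeue query could look like, assuming the common `FOR UPDATE SKIP LOCKED` pattern; the table and column names here are illustrative, not a final schema:

```sql
-- Hypothetical tasks table; names and columns are illustrative.
CREATE TABLE tasks (
    id          bigserial   PRIMARY KEY,
    demo_id     bigint      NOT NULL,
    status      text        NOT NULL DEFAULT 'pending',  -- 'pending' | 'running' | 'done' | 'failed'
    created_at  timestamptz NOT NULL DEFAULT now()
);

-- Claim one pending task atomically. SKIP LOCKED lets several
-- analysis workers poll concurrently without blocking each other
-- or claiming the same row twice.
UPDATE tasks
SET status = 'running'
WHERE id = (
    SELECT id
    FROM tasks
    WHERE status = 'pending'
    ORDER BY created_at
    LIMIT 1
    FOR UPDATE SKIP LOCKED
)
RETURNING id, demo_id;
```

A worker would run the claim query in a loop (or use LISTEN/NOTIFY to avoid busy polling) and update the row to 'done' or 'failed' once the analysis finishes.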
Future considerations
Depending on the pressure this puts on the DB and how active the queue becomes, we should keep in mind that we may need to switch to something different/purpose-built (like Kafka) for the task queue. For now, however, Postgres should work just fine and keeps things simpler by requiring only a single external dependency.