Script "worker.py" that can be run as a standalone worker client, point it to the redis
Functions that may register a job (worker.run('task', **kw)) with the queue or execute the task immediately in a subprocess (depending on what mode is set, for test servers we can just run everything on the same machine)
Use BRPOPLPUSH in REDIS to take a task from the available tasks and put it on the processing tasks list. The task must be updated with worker info.
Each worker has a heartbeat on REDIS periodically so that we can tell if it died or if it still running a job.
Server does a blocking pop on the result key, so when the worker puts the data there it is returned to the client.
If a worker died while it is processing a job, for now we can just remove the job and return an error to the client.
A simple job queue that has
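A minimal sketch of the client-side run() call, assuming redis-py and hypothetical key names (tasks:pending, result:<job_id>) that are placeholders, not confirmed. In queue mode it registers the job and then blocks on the per-job result key until a worker pushes the result there; in immediate mode it runs the task in a local subprocess, which is enough for test servers. The --job flag on worker.py is also an assumption.

```python
import json
import subprocess
import sys
import uuid

import redis

r = redis.Redis()  # point this at the queue's Redis instance

def run(task, mode="queue", timeout=60, **kw):
    """Register a job with the queue, or execute it immediately in a subprocess."""
    job_id = str(uuid.uuid4())
    job = {"id": job_id, "task": task, "kwargs": kw}
    if mode == "immediate":
        # Test-server mode: skip Redis entirely and run the task on this machine.
        # Assumes worker.py accepts a JSON-encoded job via --job (hypothetical flag).
        proc = subprocess.run(
            [sys.executable, "worker.py", "--job", json.dumps(job)],
            capture_output=True, text=True, timeout=timeout,
        )
        return json.loads(proc.stdout)
    # Queue mode: register the job, then do a blocking pop on the result key
    # until a worker puts the result there.
    r.lpush("tasks:pending", json.dumps(job))
    popped = r.blpop(f"result:{job_id}", timeout=timeout)
    if popped is None:
        raise TimeoutError(f"no result for job {job_id} within {timeout}s")
    _key, raw = popped
    return json.loads(raw)
```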
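A sketch of the worker's claim loop under the same assumed key names. BRPOPLPUSH atomically moves the job from the available list to the processing list; updating the claimed task with worker info is done here as an LREM/LPUSH swap, which is simple but not atomic (acceptable for a sketch). The task registry and execute() stand in for whatever dispatch worker.py actually uses.

```python
import json
import os
import socket

import redis

r = redis.Redis()  # point this at the queue's Redis instance
WORKER_ID = f"{socket.gethostname()}-{os.getpid()}"  # assumed id scheme

TASKS = {}  # hypothetical task registry: name -> callable

def execute(name, **kw):
    return TASKS[name](**kw)

def claim_loop():
    while True:
        # Atomically take one task from the available list and put it
        # on the processing list.
        raw = r.brpoplpush("tasks:pending", "tasks:processing", timeout=5)
        if raw is None:
            continue  # timed out with no work; loop and wait again
        job = json.loads(raw)
        # Update the claimed task with this worker's info so cleanup can
        # match it against a heartbeat (LREM + LPUSH swap; not atomic).
        job["worker"] = WORKER_ID
        stamped = json.dumps(job)
        r.lrem("tasks:processing", 1, raw)
        r.lpush("tasks:processing", stamped)
        try:
            result = {"ok": True, "value": execute(job["task"], **job["kwargs"])}
        except Exception as exc:
            result = {"ok": False, "error": str(exc)}
        # Push the result where the server is blocking, then drop the job.
        r.lpush(f"result:{job['id']}", json.dumps(result))
        r.lrem("tasks:processing", 1, stamped)
```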
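A heartbeat sketch: each worker periodically refreshes a key with a TTL, so checking liveness reduces to EXISTS on heartbeat:<worker_id>. The key name, interval, and TTL values are assumptions; the point is that a dead worker's key expires on its own.

```python
import threading
import time

import redis

r = redis.Redis()

def start_heartbeat(worker_id, interval=5, ttl=15):
    """Refresh a TTL'd key on a timer; a missing key means the worker
    missed several consecutive beats and is presumed dead."""
    def beat():
        while True:
            r.setex(f"heartbeat:{worker_id}", ttl, int(time.time()))
            time.sleep(interval)
    thread = threading.Thread(target=beat, daemon=True)
    thread.start()
    return thread
```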
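A cleanup sketch matching the policy above: any job on the processing list whose worker has no live heartbeat is removed, and an error result is pushed so the client's blocking pop returns instead of hanging forever. Where this runs (on the server, on a periodic timer) is an assumption.

```python
import json

import redis

r = redis.Redis()

def reap_dead_workers():
    """Remove jobs whose worker stopped heartbeating and fail them back."""
    for raw in r.lrange("tasks:processing", 0, -1):
        job = json.loads(raw)
        worker = job.get("worker")
        if worker and r.exists(f"heartbeat:{worker}"):
            continue  # worker is alive; leave its job alone
        # Worker died mid-job: remove the job and return an error to the client.
        r.lrem("tasks:processing", 1, raw)
        error = {"ok": False, "error": f"worker {worker!r} died while processing"}
        r.lpush(f"result:{job['id']}", json.dumps(error))
```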