(I think) this will facilitate the save-all-strava-activities functionality, which currently fails on the server with the message "EOF", suggesting the action ran out of RAM. It will also let the user do something else while the action proceeds in the background, instead of locking them to the page until everything is done.
Notes
Operating locally
command line 1: ./run_redis.sh
command line 2: celery -A celery_worker.celery worker -l INFO
Any output from Celery tasks appears in this stdout, not in the Flask app's stdout.
command line 3: run the Flask app
Need a broker (separate service used by celery to send/receive msgs)
Redis (preferred because of other uses for app)
RabbitMQ
Celery worker server (still unclear to me how this is used)
Runs in the background as a daemon in production
celery -A tasks worker
Results backend
Optional if I don't need to store results or state (i.e. if I just perform the operations)
Similar to broker service?
Can use redis here too. (Or SQLAlchemy, RabbitMQ, ...)
Examples
from tasks import add
result = add.delay(4, 4)
result.ready() # False
Can rate-limit too
e.g. 10 tasks of a certain type can be processed per minute.
apscheduler.schedulers.background.BackgroundScheduler
Then edit the line in /etc/redis/redis.conf to read "supervised systemd". Finally, create:
/etc/systemd/system/celery.service
/etc/conf.d/celery
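A sketch of what the celery.service unit could contain, loosely modeled on the Celery daemonization docs (paths, user, and app module are all assumptions for this project):

```ini
# /etc/systemd/system/celery.service -- sketch; paths and user are assumptions.
[Unit]
Description=Celery worker
After=network.target redis.service

[Service]
Type=forking
User=celery
EnvironmentFile=/etc/conf.d/celery
WorkingDirectory=/srv/flaskapp
ExecStart=/srv/flaskapp/venv/bin/celery -A celery_worker.celery multi start worker1 \
    --pidfile=/run/celery/%n.pid --logfile=/var/log/celery/%n%I.log
ExecStop=/srv/flaskapp/venv/bin/celery multi stopwait worker1 \
    --pidfile=/run/celery/%n.pid

[Install]
WantedBy=multi-user.target
```

/etc/conf.d/celery then holds environment variables (concurrency, log level, etc.) read via EnvironmentFile.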