I've tried deploying Wooey on a Raspberry Pi. It should basically be the same as the 'digital ocean' deployment, right? Scripts work on localhost, but not on the web (you can check it here). I thought it was because of celery, so I installed RabbitMQ as a broker and edited user_settings.py accordingly, but it still doesn't work at all. I followed this tutorial. Any advice?
I believe you have 2 separate brokers set up. Is the Raspberry Pi also hosting your webserver? Your rabbit broker is currently localhost, instead of a shared rabbit broker that can pass messages between the frontend/backend.
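For illustration, a minimal sketch of what that looks like, assuming a hypothetical host 192.168.0.10 that both the webserver and the worker machines can reach (the user, password, and vhost are placeholders):

CELERY_BROKER_URL = 'amqp://myuser:mypassword@192.168.0.10:5672/myvhost'  # identical on every machine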
@Chris7 I don't get it. Yes, the Raspberry Pi hosts the webserver via nginx. How can I share the RabbitMQ broker with Wooey?
Could you post your user_settings.py?
Here is my file. I just changed the AMQP settings.
import errno
import os
from .wooey_settings import *
INSTALLED_APPS += (
)
# Whether to allow anonymous job submissions, set False to disallow 'guest' job submissions
WOOEY_ALLOW_ANONYMOUS = True
## Celery related options
INSTALLED_APPS += (
'django_celery_results',
'kombu.transport.filesystem',
)
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'amqp://myuser:mypassword@localhost:5672/myvhost'
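# NOTE: 'localhost' is only reachable from the machine running this process;
# a broker shared between the webserver and the workers needs a host both can reach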
# create a directory if it does not already exist
def ensure_path(path):
    try:
        os.makedirs(path)
    except OSError as e:
        if e.errno == errno.EEXIST:
            pass
        else:
            raise
    return path
broker_dir = ensure_path(os.path.join(BASE_DIR, '.broker'))
CELERY_BROKER_TRANSPORT_OPTIONS = {
"data_folder_in": ensure_path(os.path.join(broker_dir, "out")),
"data_folder_out": ensure_path(os.path.join(broker_dir, "out")),
"data_folder_processed": ensure_path(os.path.join(broker_dir, "processed")),
}
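# NOTE: the data_folder_* options above belong to kombu's filesystem transport
# ('kombu.transport.filesystem'); an amqp broker does not use them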
CELERY_TRACK_STARTED = True
WOOEY_CELERY = True
CELERY_SEND_EVENTS = True
CELERY_IMPORTS = ('wooey.tasks',)
# A cache interface. This provides realtime updates for scripts and should definitely be changed
# to use something like redis or memcached in production
WOOEY_REALTIME_CACHE = 'default'
CACHES = {
'default': {
'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
'LOCATION': 'wooey_cache_table',
}
}
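# For reference, a commented-out sketch of the production-style cache suggested above
# (assumes the django-redis package and a local redis server -- both hypothetical here):
# CACHES = {
#     'default': {
#         'BACKEND': 'django_redis.cache.RedisCache',
#         'LOCATION': 'redis://127.0.0.1:6379/1',
#     }
# }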
# Things you most likely do not need to change
# the directory for uploads (physical directory)
MEDIA_ROOT = os.path.join(BASE_DIR, 'user_uploads')
# the url mapping
MEDIA_URL = '/uploads/'
# the directory to store our webpage assets (images, javascript, etc.)
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
# the url mapping
STATIC_URL = '/static/'
## Here is a setup example for production servers
## A better celery broker -- using RabbitMQ (these defaults are from two free rabbitmq providers)
#
#
#CELERY_BROKER_URL = os.environ.get('AMQP_URL') or \
# os.environ.get('RABBITMQ_BIGWIG_TX_URL') or \
# os.environ.get('CLOUDAMQP_URL', 'amqp://guest:guest@localhost:5672/')
#CELERY_BROKER_POOL_LIMIT = 1
#CELERYD_CONCURRENCY = 1
#CELERY_TASK_SERIALIZER = 'json'
#CELERY_TASK_ACKS_LATE = True
AUTHENTICATION_BACKENDS = ['django.contrib.auth.backends.ModelBackend']  # Django's setting is plural and takes a list
These are the results:
$ celery -A biohack worker -l info
-------------- celery@raspberrypi v4.2.0 (windowlicker)
---- **** -----
--- * *** * -- Linux-4.14.34-v7+-armv7l-with-debian-9.4 2018-06-27 12:59:28
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: biohack:0x75efb1d0
- ** ---------- .> transport: amqp://partrita:**@58.79.113.145:5672/myvhost
- ** ---------- .> results:
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. biohack.wooey_celery_app.debug_task
. wooey.tasks.cleanup_dead_jobs
. wooey.tasks.cleanup_wooey_jobs
. wooey.tasks.submit_script
[2018-06-27 12:59:29,093: INFO/MainProcess] Connected to amqp://partrita:**@58.79.113.145:5672/myvhost
[2018-06-27 12:59:29,188: INFO/MainProcess] mingle: searching for neighbors
[2018-06-27 12:59:30,325: INFO/MainProcess] mingle: all alone
[2018-06-27 12:59:30,422: INFO/MainProcess] celery@raspberrypi ready.
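For reference, one quick way to confirm both sides agree on the broker (a minimal sketch, assuming a standard Django setup) is to compare the 'transport:' line in the worker banner above with what the web process sees from a manage.py shell:

from django.conf import settings
print(settings.CELERY_BROKER_URL)  # should match the worker's 'transport:' line

If the two URLs differ, the frontend queues jobs on a broker the worker never consumes from.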
Thanks for your time :+1: