wooey / Wooey

A Django app that creates automatic web UIs for Python scripts.
http://wooey.readthedocs.org
BSD 3-Clause "New" or "Revised" License

how to setup wooey for my raspberry Pi? #256

Closed partrita closed 6 years ago

partrita commented 6 years ago

I've tried deploying Wooey on a Raspberry Pi. It should basically be the same as the DigitalOcean deployment, right? The scripts work on localhost, but not on the web (you can check it here). I thought it was because of Celery, so I installed RabbitMQ as the broker and edited user_settings.py accordingly.

$ celery -A biohack worker -l info
-------------- celery@raspberrypi v4.2.0 (windowlicker)
---- **** -----
--- * ***  * -- Linux-4.14.34-v7+-armv7l-with-debian-9.4 2018-06-26 07:39:23
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         biohack:0x75ee9fd0
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . biohack.wooey_celery_app.debug_task
  . wooey.tasks.cleanup_dead_jobs
  . wooey.tasks.cleanup_wooey_jobs
  . wooey.tasks.submit_script

[2018-06-26 07:39:24,408: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2018-06-26 07:39:24,490: INFO/MainProcess] mingle: searching for neighbors
[2018-06-26 07:39:25,603: INFO/MainProcess] mingle: all alone
[2018-06-26 07:39:25,689: INFO/MainProcess] celery@raspberrypi ready.

But it still doesn't work at all. I followed this tutorial. Any advice?

$ pip freeze
amqp==2.3.2
billiard==3.5.0.3
celery==4.2.0
clinto==0.2.1
Django==1.11.13
django-autoslug==1.9.3
django-celery-results==1.0.1
jsonfield==2.0.2
kombu==4.2.1
pkg-resources==0.0.0
pytz==2018.4
six==1.11.0
uWSGI==2.0.17
vine==1.1.4
wooey==0.10.0
Chris7 commented 6 years ago

I believe you have two separate brokers set up. Is the Raspberry Pi also hosting your webserver? Your RabbitMQ broker is currently on localhost, rather than a shared broker that can pass messages between the frontend and backend.

partrita commented 6 years ago

@Chris7 I don't get it. Yes, the Raspberry Pi hosts the webserver via nginx. How can I share the RabbitMQ broker with Wooey?
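Not Wooey-specific, but "sharing" the broker just means every process (the uWSGI/Django frontend and the Celery worker) points at the same RabbitMQ URL in the same settings module. A sketch, with the one-time RabbitMQ setup shown as comments; the user, password, and vhost names match the posted settings, everything else is an assumption:

```python
# One-time RabbitMQ setup on the Pi (shell, run as root -- shown here as
# comments since this file is Python):
#   rabbitmqctl add_user myuser mypassword
#   rabbitmqctl add_vhost myvhost
#   rabbitmqctl set_permissions -p myvhost myuser ".*" ".*" ".*"

# Both the web process and the worker import this same settings module, so
# a single URL is enough for them to share the broker:
CELERY_BROKER_URL = 'amqp://myuser:mypassword@localhost:5672/myvhost'

# localhost is fine only while web server and worker run on the SAME machine;
# if the worker moves to another box, use the broker host's reachable address:
# CELERY_BROKER_URL = 'amqp://myuser:mypassword@192.168.1.10:5672/myvhost'
```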

Chris7 commented 6 years ago

Could you post your user_settings.py?

partrita commented 6 years ago

Here is my file. I just changed the amqp settings.

import errno
import os
from .wooey_settings import *
INSTALLED_APPS += (
)

# Whether to allow anonymous job submissions, set False to disallow 'guest' job submissions
WOOEY_ALLOW_ANONYMOUS = True

## Celery related options
INSTALLED_APPS += (
    'django_celery_results',
    'kombu.transport.filesystem',
)
CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'amqp://myuser:mypassword@localhost:5672/myvhost'

def ensure_path(path):
    try:
        os.makedirs(path)
    except OSError as e:
        if e.errno == errno.EEXIST:
            pass
        else:
            raise
    return path
broker_dir = ensure_path(os.path.join(BASE_DIR, '.broker'))
CELERY_BROKER_TRANSPORT_OPTIONS = {
    "data_folder_in": ensure_path(os.path.join(broker_dir, "out")),
    "data_folder_out": ensure_path(os.path.join(broker_dir, "out")),
    "data_folder_processed": ensure_path(os.path.join(broker_dir, "processed")),
}

CELERY_TRACK_STARTED = True
WOOEY_CELERY = True
CELERY_SEND_EVENTS = True
CELERY_IMPORTS = ('wooey.tasks',)

# A cache interface. This provides realtime updates for scripts and should definitely be changed
# to use something like redis or memcached in production
WOOEY_REALTIME_CACHE = 'default'
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'wooey_cache_table',
    }
}

# Things you most likely do not need to change

# the directory for uploads (physical directory)
MEDIA_ROOT = os.path.join(BASE_DIR, 'user_uploads')
# the url mapping
MEDIA_URL = '/uploads/'

# the directory to store our webpage assets (images, javascript, etc.)
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
# the url mapping
STATIC_URL = '/static/'
## Here is a setup example for production servers
## A better celery broker -- using RabbitMQ (these defaults are from two free RabbitMQ hosting services)
#
#
#CELERY_BROKER_URL = os.environ.get('AMQP_URL') or \
#              os.environ.get('RABBITMQ_BIGWIG_TX_URL') or \
#              os.environ.get('CLOUDAMQP_URL', 'amqp://guest:guest@localhost:5672/')
#CELERY_BROKER_POOL_LIMIT = 1
#CELERYD_CONCURRENCY = 1
#CELERY_TASK_SERIALIZER = 'json'
#CELERY_TASK_ACKS_LATE = True

AUTHENTICATION_BACKEND = 'django.contrib.auth.backends.ModelBackend'
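One thing that stands out in this file: it carries two broker configurations at once. `'kombu.transport.filesystem'` in `INSTALLED_APPS` and the `data_folder_*` keys in `CELERY_BROKER_TRANSPORT_OPTIONS` belong to kombu's filesystem transport, while `CELERY_BROKER_URL` points at amqp. The leftovers may not be the root cause, but removing them makes the intended broker unambiguous. A hypothetical amqp-only cleanup of the file above (same names and values, just with the filesystem pieces dropped):

```python
# user_settings.py -- amqp-only sketch; credentials are the placeholders
# from the thread and should be replaced with real ones.
import os
from .wooey_settings import *

INSTALLED_APPS += (
    'django_celery_results',
    # 'kombu.transport.filesystem' removed: only needed for filesystem://
)

WOOEY_ALLOW_ANONYMOUS = True

CELERY_RESULT_BACKEND = 'django-db'
CELERY_BROKER_URL = 'amqp://myuser:mypassword@localhost:5672/myvhost'
# CELERY_BROKER_TRANSPORT_OPTIONS with data_folder_* removed: those keys
# only apply to the filesystem transport, not to amqp.

CELERY_TRACK_STARTED = True
WOOEY_CELERY = True
CELERY_SEND_EVENTS = True
CELERY_IMPORTS = ('wooey.tasks',)
```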

Here is the result:

$ celery -A biohack worker -l info

 -------------- celery@raspberrypi v4.2.0 (windowlicker)
---- **** ----- 
--- * ***  * -- Linux-4.14.34-v7+-armv7l-with-debian-9.4 2018-06-27 12:59:28
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         biohack:0x75efb1d0
- ** ---------- .> transport:   amqp://partrita:**@58.79.113.145:5672/myvhost
- ** ---------- .> results:     
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . biohack.wooey_celery_app.debug_task
  . wooey.tasks.cleanup_dead_jobs
  . wooey.tasks.cleanup_wooey_jobs
  . wooey.tasks.submit_script

[2018-06-27 12:59:29,093: INFO/MainProcess] Connected to amqp://partrita:**@58.79.113.145:5672/myvhost
[2018-06-27 12:59:29,188: INFO/MainProcess] mingle: searching for neighbors
[2018-06-27 12:59:30,325: INFO/MainProcess] mingle: all alone
[2018-06-27 12:59:30,422: INFO/MainProcess] celery@raspberrypi ready.

Thanks for your time :+1:

partrita commented 6 years ago

I gave up on RabbitMQ; it works with the db broker instead, which is not recommended. I'm guessing I screwed something up in user_settings.py. I will fix it later.
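If "works with db" means a broker that needs no external service, the posted user_settings.py already half-configures one: kombu's filesystem transport, whose `data_folder_*` options were sitting unused next to the amqp URL. A self-contained sketch of that setup, reusing the `ensure_path` helper and `.broker` layout from the thread (here `BASE_DIR` is a stand-in assumption, since the real one comes from the Django settings):

```python
# Filesystem broker sketch: no RabbitMQ needed, fine for a single-machine
# Raspberry Pi setup; not recommended for multi-machine deployments.
import errno
import os

def ensure_path(path):
    # create the directory if it does not already exist
    try:
        os.makedirs(path)
    except OSError as e:
        if e.errno != errno.EEXIST:
            raise
    return path

BASE_DIR = os.getcwd()  # assumption: stand-in for the project's BASE_DIR
broker_dir = ensure_path(os.path.join(BASE_DIR, '.broker'))

CELERY_BROKER_URL = 'filesystem://'
CELERY_BROKER_TRANSPORT_OPTIONS = {
    # in and out point at the same folder so messages queued by the web
    # process are picked up by the worker on the same machine
    'data_folder_in': ensure_path(os.path.join(broker_dir, 'out')),
    'data_folder_out': ensure_path(os.path.join(broker_dir, 'out')),
    'data_folder_processed': ensure_path(os.path.join(broker_dir, 'processed')),
}
```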