Can you paste your Dockerfile and docker-compose file, or whatever you are running with? It seems like you are missing the celery portion. This should be the general layout for a compose file that runs wooey + celery (this is written entirely without testing atm, just from memory of what it should look like):
```yaml
services:
  wooey:
    ...
    ports:
      - 8080:8080
    command: python manage.py runserver 0.0.0.0:8080
  celery:
    depends_on:
      - rabbit
    command: python manage.py celery worker -c 4 --beat -l info
  rabbit:
    image: rabbitmq
```
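A quick way to verify the worker side of a layout like this is to tail the celery service's logs; these are standard docker-compose commands, nothing Wooey-specific:

```sh
# Start the whole stack in the background
docker-compose up -d

# A healthy worker prints a "celery@<hostname> ready." banner;
# broker connection errors show up here too
docker-compose logs -f celery
```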
Thanks for your reply. I modified my Dockerfile and created a docker-compose file, but I still can't execute a script.

Dockerfile:
```dockerfile
FROM python

RUN pip install wooey
RUN pip install honcho
RUN apt-get update && apt-get install -y nano
RUN apt-get -y install mc

RUN mkdir /opt/ait_scripts

WORKDIR /opt
RUN wooify -p ait_python
WORKDIR /opt/ait_python

RUN cp /opt/ait_python/manage.py /opt/ait_python/manage.py.old

## Set the admin user
RUN echo "from django.contrib.auth.models import User;" >> /opt/ait_python/manage.py
RUN echo "User.objects.create_superuser('admin', 'stephan.pabinger@ait.ac.at', 'admin')" >> /opt/ait_python/manage.py
RUN python /opt/ait_python/manage.py migrate

## Do this administration stuff only once
RUN mv /opt/ait_python/manage.py.old /opt/ait_python/manage.py

RUN sed -i "s/ALLOWED_HOSTS = \[\]/ALLOWED_HOSTS = \['\*\']/g" /opt/ait_python/ait_python/settings/django_settings.py

EXPOSE 8000

#CMD [ "python", "/opt/ait_python/manage.py", "celery", "worker", "-c", "1", "--beat", "-l", "info" ]
#CMD [ "python", "/opt/ait_python/manage.py", "runserver", "0.0.0.0:8000"]
```
docker-compose.yml:
```yaml
version: '3'
services:
  wooey:
    build: .
    ports:
      - 10001:8080
    command: python manage.py runserver 0.0.0.0:8080
  celery:
    build:
      context: .
    depends_on:
      - rabbit
    command: python manage.py celery worker -c 4 --beat -l info
  rabbit:
    image: rabbitmq
```
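(For reference, two standard docker-compose commands help diagnose a stack like this one:)

```sh
# Both the celery and rabbit containers should show as "Up"
docker-compose ps

# Broker or import errors from the worker land here
docker-compose logs celery
```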
Thanks in advance, Stephan
Ok, there are a lot of reasons it doesn't work. But the basics: Wooey's configuration lives in the `user_settings.py` file. Create your own `user_settings.py` file and use docker-compose's volumes to override the user settings for Wooey. This should work (keep in mind things like: you shouldn't have passwords in plaintext, you need to pick a persistent place for postgres to store its data, etc.). This is just to show you a demo.
docker-compose.yml
```yaml
version: '2.1'

volumes:
  user_uploads:

services:
  common:
    build: .
    volumes:
      - ./user_settings.py:/opt/ait_python/ait_python/settings/user_settings.py
      - user_uploads:/opt/ait_python/ait_python/user_uploads
    environment:
      DATABASE_NAME: wooey
      DATABASE_USER: wooey
      DATABASE_URL: db
      DATABASE_PASSWORD: wooey

  wooey:
    extends:
      service: common
    ports:
      - 10001:8080
    depends_on:
      - rabbit
      - db
    command: python manage.py runserver 0.0.0.0:8080

  celery:
    extends:
      service: common
    environment:
      C_FORCE_ROOT: 'true'
    command: python manage.py celery worker -c 4 --beat -l info

  rabbit:
    image: rabbitmq

  db:
    image: postgres
    environment:
      POSTGRES_USER: wooey
      POSTGRES_PASSWORD: wooey
      POSTGRES_DB: wooey
```
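A side note on the `extends` wiring: if you want to confirm how the `common` volumes and environment get merged into `wooey` and `celery`, docker-compose can print the fully resolved file:

```sh
# Validates the compose file and prints it with all
# extends/volumes/environment inheritance expanded
docker-compose config
```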
Dockerfile
```dockerfile
FROM python

RUN pip install wooey psycopg2

RUN mkdir /opt/ait_scripts

WORKDIR /opt
RUN wooify -p ait_python
WORKDIR /opt/ait_python

EXPOSE 8080
```
user_settings.py
```python
import os
from os import environ

from .wooey_settings import *

# This file is where the user can override and customize their installation of wooey

# Wooey Apps - add additional apps here after the initial install (remember to follow everything by a comma)
INSTALLED_APPS += (
)

# Whether to allow anonymous job submissions, set False to disallow 'guest' job submissions
WOOEY_ALLOW_ANONYMOUS = True

## Celery related options
INSTALLED_APPS += (
    'djcelery',
    'kombu.transport.django',
)
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
BROKER_URL = 'amqp://guest@rabbit'
CELERY_TRACK_STARTED = True
WOOEY_CELERY = True
CELERY_SEND_EVENTS = True
CELERY_IMPORTS = ('wooey.tasks',)

# Things you most likely do not need to change

# the directory for uploads (physical directory)
MEDIA_ROOT = os.path.join(BASE_DIR, 'user_uploads')
# the url mapping
MEDIA_URL = '/uploads/'

# the directory to store our webpage assets (images, javascript, etc.)
STATIC_ROOT = os.path.join(BASE_DIR, 'static')
# the url mapping
STATIC_URL = '/static/'

## Here is a setup example for production servers

## A postgres database -- for multiple users a sqlite based database is asking for trouble
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # for production environments, these should be stored as environment variables
        # I also recommend the django-heroku-postgresify package for a super simple setup
        'NAME': os.environ.get('DATABASE_NAME', 'wooey'),
        'USER': os.environ.get('DATABASE_USER', 'wooey'),
        'PASSWORD': os.environ.get('DATABASE_PASSWORD', 'wooey'),
        'HOST': os.environ.get('DATABASE_URL', 'localhost'),
        'PORT': os.environ.get('DATABASE_PORT', '5432'),
    }
}

## A better celery backend -- using RabbitMQ (these defaults are from two free rabbitmq Heroku providers)
# CELERY_RESULT_BACKEND = 'amqp'
# BROKER_URL = os.environ.get('AMQP_URL') or \
#     os.environ.get('RABBITMQ_BIGWIG_TX_URL') or \
#     os.environ.get('CLOUDAMQP_URL', 'amqp://guest:guest@localhost:5672/')
# BROKER_POOL_LIMIT = 1
# CELERYD_CONCURRENCY = 1
# CELERY_TASK_SERIALIZER = 'json'
# ACKS_LATE = True

## for production environments, django-storages abstracts away much of the difficulty of various storage engines.
## Here is an example for hosting static and user generated content with S3
# from boto.s3.connection import VHostCallingFormat
#
# INSTALLED_APPS += (
#     'storages',
#     'collectfast',
# )

## We have user authentication -- we need to use https (django-sslify)
## NOTE: This is MIDDLEWARE and not MIDDLEWARE_CLASSES in Django 1.10+!
# if not DEBUG:
#     MIDDLEWARE_CLASSES = ['sslify.middleware.SSLifyMiddleware'] + list(MIDDLEWARE_CLASSES)
#     SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')

ALLOWED_HOSTS = (
    'localhost',
    '127.0.0.1',
)

# AWS_CALLING_FORMAT = VHostCallingFormat
#
# AWS_ACCESS_KEY_ID = environ.get('AWS_ACCESS_KEY_ID', '')
# AWS_SECRET_ACCESS_KEY = environ.get('AWS_SECRET_ACCESS_KEY', '')
# AWS_STORAGE_BUCKET_NAME = environ.get('AWS_STORAGE_BUCKET_NAME', '')
# AWS_AUTO_CREATE_BUCKET = True
# AWS_QUERYSTRING_AUTH = False
# AWS_S3_SECURE_URLS = True
# AWS_FILE_OVERWRITE = False
# AWS_PRELOAD_METADATA = True
# AWS_S3_CUSTOM_DOMAIN = environ.get('AWS_S3_CUSTOM_DOMAIN', '')
#
# GZIP_CONTENT_TYPES = (
#     'text/css',
#     'application/javascript',
#     'application/x-javascript',
#     'text/javascript',
# )
#
# AWS_EXPIREY = 60 * 60 * 7
# AWS_HEADERS = {
#     'Cache-Control': 'max-age=%d, s-maxage=%d, must-revalidate' % (AWS_EXPIREY,
#                                                                    AWS_EXPIREY)
# }
#
# STATIC_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
# MEDIA_URL = '/user-uploads/'
#
# STATICFILES_STORAGE = DEFAULT_FILE_STORAGE = 'wooey.wooeystorage.CachedS3BotoStorage'
# WOOEY_EPHEMERAL_FILES = True

AUTHENTICATION_BACKENDS = ('django.contrib.auth.backends.ModelBackend',)
```
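With the stack already up (`docker-compose up -d`), one way to confirm the worker actually reached RabbitMQ is celery's stock `inspect ping` subcommand, exposed here through djcelery's `manage.py celery`:

```sh
# Each live worker should reply with "pong" if BROKER_URL is correct
docker-compose run --rm celery python manage.py celery inspect ping
```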
Now you can perform all your management functions like `docker-compose run --rm wooey python manage.py createsuperuser`, and run wooey like `docker-compose up wooey`. Remember to run celery or add celery to the `depends_on` for wooey if you want that automatically spun up (a sketch of that follows below).
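A minimal sketch of that last option, assuming the demo compose file above:

```yaml
# Hypothetical tweak: make `docker-compose up wooey` start the worker too
wooey:
  extends:
    service: common
  depends_on:
    - rabbit
    - db
    - celery
  ports:
    - 10001:8080
  command: python manage.py runserver 0.0.0.0:8080
```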
Thank you very much for the instructions. I had to run

```sh
docker-compose run --rm wooey python manage.py migrate auth
docker-compose run --rm wooey python manage.py migrate
```

before running

```sh
docker-compose run --rm wooey python manage.py createsuperuser
```

To store the db files I created a volume.
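(For reference, persisting the database that way looks roughly like this; `pgdata` is an arbitrary volume name, and `/var/lib/postgresql/data` is where the official postgres image keeps its data:)

```yaml
volumes:
  user_uploads:
  pgdata:

services:
  db:
    image: postgres
    volumes:
      - pgdata:/var/lib/postgresql/data
```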
Best, Stephan
Hi,
I installed Wooey in a Docker container and can successfully open Wooey in a browser, add scripts via the admin interface, and submit jobs. However, the jobs are not executed; I only get a "Waiting" status in Wooey.
Do you know how to make Wooey work inside Docker? Can you point me to logs to see what is going wrong? Are there specific settings that I need to change?
Thanks, Stephan