Closed Kokoserver closed 8 months ago
Hello. I have a question regarding your broker configuration. Can you please show your `taskiq` module? And please post it using code blocks.
```python
def a(): ...
```
My assumption is that you don't have the required environment variables in your taskiq worker. Can you copy all environment variables of the `backend` service to the `taskiq_worker` service?
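One quick way to confirm this assumption is to fail fast when the worker starts. The helper below is hypothetical (not from this thread), and the variable names are assumptions taken from the settings module discussed later; it just reports which required variables are missing or empty in the container:

```python
import os

# Hypothetical check, not part of the project in this thread: verify the
# worker's environment before it starts, instead of failing inside a task.
REQUIRED_VARS = [
    "TASKIQ_WORKER_URL",
    "TASKIQ_WORKER_BACKEND_URL",
    "EDGY_DATABASE_URL",
]


def check_required_env(required=REQUIRED_VARS, env=os.environ):
    """Return the names of required variables that are missing or empty."""
    return [name for name in required if not env.get(name)]


missing = check_required_env()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```

Running this inside the `taskiq_worker` container (e.g. from the worker startup hook) immediately shows whether the container's environment matches the backend's.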
@s3rius here is my taskiq setup
```python
from taskiq_redis import RedisAsyncResultBackend, ListQueueBroker
import taskiq_fastapi
from taskiq.schedule_sources import LabelScheduleSource
from taskiq import TaskiqScheduler

from src.core.settings import config

redis_async_result = RedisAsyncResultBackend(
    redis_url=config.taskiq_worker_backend_url,
    result_px_time=1000000,
)

broker = ListQueueBroker(
    url=config.taskiq_worker_url,
    queue_name=config.project_name,
).with_result_backend(redis_async_result)

scheduler = TaskiqScheduler(
    broker=broker,
    sources=[LabelScheduleSource(broker)],
)

taskiq_fastapi.init(broker, "src.asgi:app")


async def app_worker_startup():
    if not broker.is_worker_process:
        await broker.startup()


async def app_worker_shutdown():
    if not broker.is_worker_process:
        await broker.shutdown()
```
@s3rius all my environment variables are tracked in a central place to make sure they are loaded completely. Here is my setup using starlette's `Config` class:
```python
import datetime
import json
from functools import lru_cache

import boto3
import pydantic as pyd
from starlette.config import Config

from src.utils import get_path


class Settings:
    config = Config(env_file=".env")

    debug: bool = config("DEBUG", cast=bool, default=False)
    project_name: str = config("PROJECT_NAME", cast=str)
    project_version: str = config("PROJECT_VERSION", cast=str)
    api_version: int = 1
    project_description: str = config("PROJECT_DESCRIPTION", cast=str)
    project_url: str = config("PROJECT_URL", cast=str)
    environment: str = config("ENVIRONMENT", cast=str)
    backend_cors_origins: str = config("BACKEND_CORS_ORIGINS", cast=str)
    admin_email: str = config("ADMIN_EMAIL", cast=str)
    admin_password: str = config("ADMIN_PASSWORD", cast=str)
    email_port: int = config("EMAIL_PORT", cast=int)
    email_host: str = config("EMAIL_HOST", cast=str)
    email_backend: str = config("EMAIL_BACKEND", cast=str)
    contact_email: str = config("CONTACT_EMAIL", cast=str)
    contact_name: str = config("CONTACT_NAME", cast=str)
    database_url: str = config("EDGY_DATABASE_URL", cast=str)
    userpool_key_id: str = config("AWS_POOL_ID", cast=str)
    pool_region: str = config("POOL_REGION_NAME", cast=str)
    app_client_id_key: str = config("AWS_POOL_CLIENT_ID", cast=str)
    taskiq_worker_url: str = config("TASKIQ_WORKER_URL", cast=str)
    taskiq_worker_backend_url: str = config("TASKIQ_WORKER_BACKEND_URL", cast=str)
    aws_secret_key: str = config("AWS_SECRET_KEY", cast=str)
    aws_access_key: str = config("AWS_ACCESS_KEY", cast=str)
    aws_secret_id: str = config("AWS_SECRET_ID", cast=str)
    aws_region_name: str = config("AWS_REGION_NAME", cast=str)
    aws_bucket: str = config("AWS_BUCKET")

    base_dir: pyd.DirectoryPath = get_path.get_base_dir()
    email_template_dir: pyd.DirectoryPath = get_path.get_template_dir()

    def get_access_expires_time(self):
        return datetime.timedelta(seconds=self.access_token_expire_time)

    def get_refresh_expires_time(self):
        return datetime.timedelta(seconds=self.refresh_token_expire_time)

    def get_database_url(self) -> str:
        if self.environment == "staging":
            return "sqlite:///./app.db"
        if self.environment == "testing":
            return f"{self.database_url}_test"
        return self.database_url


@lru_cache
def get_settings():
    return Settings()
```
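One thing worth noting about this pattern: `@lru_cache` on `get_settings` means the environment is read once per process, so every container (backend, worker, scheduler) must have its variables set before start-up. Below is a minimal sketch of that caching behaviour using plain `os.environ` instead of starlette's `Config`; the attribute names are assumptions mirroring the module above:

```python
import os
from functools import lru_cache


# Minimal sketch mirroring the thread's Settings pattern, with plain
# os.environ standing in for starlette's Config.
class Settings:
    def __init__(self):
        # Values are captured when the instance is built, not per access.
        self.environment = os.environ.get("ENVIRONMENT", "staging")
        self.database_url = os.environ.get("EDGY_DATABASE_URL", "")

    def get_database_url(self) -> str:
        if self.environment == "staging":
            return "sqlite:///./app.db"
        if self.environment == "testing":
            return f"{self.database_url}_test"
        return self.database_url


@lru_cache
def get_settings():
    # No arguments, so lru_cache makes this a process-wide singleton:
    # the environment is read once, at the first call, and never again.
    return Settings()
```

Because of this caching, changing a variable after the process has started (or in only one container) has no effect, which is why all services need identical environments from the beginning.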
In your docker compose your `taskiq_worker` and `scheduler` services don't have environment variables. Can you add them? They should be the same as in your application.
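For example, as a sketch (service names assumed from this thread; `env_file` is one convenient way to hand every container the same variables instead of repeating an `environment:` list per service):

```yaml
version: '3.8'
services:
  backend:
    build: .
    env_file: .env          # same .env for every service
    ports:
      - "8000:8000"
  taskiq_worker:
    build: .
    env_file: .env
    command: taskiq worker src.taskiq:broker -fsd
  scheduler:
    build: .
    env_file: .env
    command: taskiq scheduler src.taskiq:scheduler
```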
you mean adding `environment:` to each service?
Yes. And not only those, but all variables of your application, to make sure that the environment is the same in all containers.
Thanks very much, it worked after adding the environment variables to each service as you suggested. I really appreciate it @s3rius 🙌🙌
I have a FastAPI project I'm working on which is deployed to Lightsail. When I run it with:

```shell
uvicorn src.asgi:app --host 0.0.0.0 --port 8000  # to start the application
taskiq worker src.taskiq:broker -fsd             # to start the taskiq worker
```

they both work, but using Docker the tasks are not being picked up. When I inspect the broker (I am using Redis), the tasks are there in the queue.
My Dockerfile:
```dockerfile
FROM python:3.11.4-slim

WORKDIR /usr/src/app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt \
    && apt-get update \
    && apt-get install -y libmagic-dev=1:5.44-3 --no-install-recommends \
    && rm -rf /var/lib/apt/lists/*

COPY . .

EXPOSE 8000

ENV PROJECT_NAME=$PROJECT_NAME
ENV BACKEND_CORS_ORIGINS=$BACKEND_CORS_ORIGINS
ENV DEBUG=$DEBUG
ENV VERSION=$VERSION
ENV API_PREFIX=$API_PREFIX
ENV PROJECT_DESCRIPTION=$PROJECT_DESCRIPTION
ENV PROJECT_URL=$PROJECT_URL
ENV PROJECT_VERSION=$PROJECT_VERSION
ENV ENVIRONMENT=$ENVIRONMENT
ENV EMAIL_BACKEND=$EMAIL_BACKEND
ENV EMAIL_HOST=$EMAIL_HOST
ENV EMAIL_PORT=$EMAIL_PORT
ENV ADMIN_EMAIL=$ADMIN_EMAIL
ENV ADMIN_PASSWORD=$ADMIN_PASSWORD
ENV CONTACT_EMAIL=$CONTACT_EMAIL
ENV CONTACT_NAME=$CONTACT_NAME
ENV EDGY_DATABASE_URL=$EDGY_DATABASE_URL
ENV AWS_SECRET_KEY=$AWS_SECRET_KEY
ENV AWS_ACCESS_KEY=$AWS_ACCESS_KEY
ENV AWS_SECRET_ID=$AWS_SECRET_ID
ENV AWS_REGION_NAME=$AWS_REGION_NAME
ENV AWS_BUCKET=$AWS_BUCKET
ENV AWS_POOL_ID=$AWS_POOL_ID
ENV POOL_REGION_NAME=$POOL_REGION_NAME
ENV AWS_POOL_CLIENT_ID=$AWS_POOL_CLIENT_ID
ENV TASKIQ_WORKER_URL=$TASKIQ_WORKER_URL
ENV TASKIQ_WORKER_BACKEND_URL=$TASKIQ_WORKER_BACKEND_URL

# Command to run the application using Uvicorn
CMD ["uvicorn", "src.asgi:app", "--host", "0.0.0.0", "--port", "8000"]
```
My docker-compose:
```yaml
version: '3.8'
services:
  backend:
    build: .
    environment:
```
Please @thoas @asvetlov @emorozov @afonasev help, I have been on it for days now.