gaiacoop / django-huey

A Django integration for the huey task queue that supports multi-queue management

HueyException <task> not found in TaskRegistry #3

Closed: wgordon17 closed this issue 2 years ago

wgordon17 commented 2 years ago

When using django-huey, the consumer is able to correctly find the registered tasks; however, when running the Django server and trying to enqueue a task, I get a "HueyException: <task> not found in TaskRegistry" error. After way too much debugging, I finally tracked this down to the fact that the huey package initializes itself with RegistryA, and django_huey then initializes itself with a separate RegistryB. So when trying to enqueue a task, it looks for the task in RegistryA, which is empty, because all of the tasks are in RegistryB.

I use `import django_huey as huey` everywhere, and then use `@huey.task(queue="test")`, but I'm still getting this error. I'm not sure what I'm doing wrong 😞
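
For context, a minimal sketch of the setup described above. The queue name "test" and the decorator come from the comment; the settings layout is trimmed to the essentials and the example task is a placeholder:

    # settings.py -- a single queue named "test" (values are illustrative)
    DJANGO_HUEY = {
        "default": "test",
        "queues": {
            "test": {
                "huey_class": "huey.RedisHuey",
                "name": "test",
                "immediate": False,
            },
        },
    }

    # tasks.py -- register the task through django_huey, as described above
    import django_huey as huey

    @huey.task(queue="test")
    def add(a, b):
        return a + b

    # anywhere in the Django process (view, shell): calling the task enqueues it;
    # this enqueue step is where the "not found in TaskRegistry" error appears
    result = add(1, 2)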

pablop94 commented 2 years ago

Hi, could you please attach minimal code to reproduce the issue, and specify your Django and huey versions?

pablop94 commented 2 years ago

Hi @wgordon17, any update on the issue?

wgordon17 commented 2 years ago

Sorry @pablop94, I had a few other issues with huey (error management, etc.), so I've switched over to just using Celery. Thank you anyway!

frossigneux commented 2 years ago

I have the same issue with two queues configured to use the same Redis server, so the name parameter is identical (rather than, say, first_tasks and email_tasks). If I use the same name twice, it seems that the second consumer gets the tasks (and it may be the wrong consumer). If I use different names, the right consumer gets the task. How can I keep the same name and still have things work? Thanks

pablop94 commented 2 years ago

Hi, could you please attach your DJANGO_HUEY setting? Why do you want two queues with the same name?

frossigneux commented 2 years ago

    # assumed import at module level: from redis import BlockingConnectionPool
    @property
    def DJANGO_HUEY(self):
        prefix, suffix = Production.REDIS_URL["default"]["LOCATION"].split("//")
        url = f"{prefix}//:{Production.REDIS_PASSWORD}@{suffix}"
        pool = BlockingConnectionPool.from_url(
            url,
            max_connections=50,
            timeout=20,
            health_check_interval=30,
        )
        return {
            "default": "category_1",
            "queues": {
                "category_1": {
                    "huey_class": "huey.RedisHuey",  # Huey implementation to use.
                    "name": "default",  # Use db name for huey.
                    "results": True,  # Store return values of tasks.
                    "store_none": False,  # If a task returns None, do not save to results.
                    "immediate": False,  # If DEBUG=True, run synchronously
                    "utc": True,  # Use UTC for all times internally.
                    "blocking": True,  # Perform blocking pop rather than poll Redis.
                    "connection": {
                        "connection_pool": pool,
                    },
                    "consumer": {
                        "workers": 1,
                        "worker_type": "thread",
                        "initial_delay": 0.1,  # Smallest polling interval, same as -d.
                        "backoff": 1.15,  # Exponential backoff using this rate, -b.
                        "max_delay": 10.0,  # Max possible polling interval, -m.
                        "scheduler_interval": 1,  # Check schedule every second, -s.
                        "periodic": True,  # Enable crontab feature.
                        "check_worker_health": True,  # Enable worker health checks.
                        "health_check_interval": 1,  # Check worker health every second.
                    },
                },
                "category_2": {
                    "huey_class": "huey.RedisHuey",  # Huey implementation to use.
                    "name": "default",  # Use db name for huey.
                    "results": True,  # Store return values of tasks.
                    "store_none": False,  # If a task returns None, do not save to results.
                    "immediate": False,  # If DEBUG=True, run synchronously
                    "utc": True,  # Use UTC for all times internally.
                    "blocking": True,  # Perform blocking pop rather than poll Redis.
                    "connection": {
                        "connection_pool": pool,
                    },
                    "consumer": {
                        "workers": 2,
                        "worker_type": "thread",
                        "initial_delay": 0.1,  # Smallest polling interval, same as -d.
                        "backoff": 1.15,  # Exponential backoff using this rate, -b.
                        "max_delay": 10.0,  # Max possible polling interval, -m.
                        "scheduler_interval": 1,  # Check schedule every second, -s.
                        "periodic": True,  # Enable crontab feature.
                        "check_worker_health": True,  # Enable worker health checks.
                        "health_check_interval": 1,  # Check worker health every second.
                    },
                },
            },
        }

I want two queues, but only one Redis instance (I am limited in Redis instances and cannot create a second one). In the documentation I read: db (int) – Redis database to use (typically 0-15, default is 0). Maybe I can select a specific db per queue? I tried making two pools, one with a URL ending in /0 and the other in /1, but I get "DB index is out of range".
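
For illustration, the attempt described above might look roughly like this (a sketch; `url` is the variable built in the property shown earlier, and the pool names are made up):

    # two pools against the same server, selecting Redis logical databases 0 and 1;
    # the "DB index is out of range" error indicates the server exposes fewer
    # logical databases than the index being selected
    pool_category_1 = BlockingConnectionPool.from_url(f"{url}/0", max_connections=50)
    pool_category_2 = BlockingConnectionPool.from_url(f"{url}/1", max_connections=50)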

pablop94 commented 2 years ago

You can delete the name property, or change it to a different value for each queue; it will work.
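
For example, a trimmed sketch of the settings above with distinct name values, so both queues can share the same Redis server and database without their keys colliding (huey namespaces its Redis keys by queue name; the other options from the original config are omitted here for brevity):

    # sketch: two queues on one Redis server/database, distinguished by "name"
    DJANGO_HUEY = {
        "default": "category_1",
        "queues": {
            "category_1": {
                "huey_class": "huey.RedisHuey",
                "name": "category_1",  # distinct name per queue
                "connection": {"connection_pool": pool},
            },
            "category_2": {
                "huey_class": "huey.RedisHuey",
                "name": "category_2",  # distinct name per queue
                "connection": {"connection_pool": pool},
            },
        },
    }

Each queue then runs its own consumer process.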

If you want multiple tasks pointing to either of these queues (sometimes category_1, sometimes category_2), multiple queues are not what you need: you only need one queue, and if necessary you can add more workers to it.