tschellenbach / Stream-Framework

Stream Framework is a Python library which allows you to build news feeds, activity streams and notification systems using Cassandra and/or Redis. The authors of Stream-Framework also provide a cloud service for feed technology:
https://getstream.io/

Redis+Celery+feedly setup issue? #20

Closed andrenro closed 10 years ago

andrenro commented 10 years ago

Hi again! I'm working on a Django project (more info in issue #16). I've got my Celery worker up and running, my Redis database seems ok, and most of my setup seems fine. The only problem is that when I try to "fanout" to new feeds, the tasks never seem to reach the Celery worker (no tasks are added to the worker queue). I know this might be a broker/worker issue, but I haven't found any good indication of what the problem is :( Everything works fine when CELERY_ALWAYS_EAGER == True, so I don't think there is a problem with my code..

For feedly to work properly with Celery and Redis, the things I need are a running Celery worker, a correctly set up Redis db, and a task broker (Redis or RabbitMQ), right? I think I lack an understanding of how the broker and worker work together with feedly..?
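For intuition, the producer/broker/worker split asked about here can be sketched without Celery or Redis at all. This is a toy simulation (hypothetical names, not feedly or Celery code) that stands a `queue.Queue` in for the broker and a thread in for the worker process:

    import json
    import queue
    import threading

    broker = queue.Queue()  # stands in for Redis/RabbitMQ, the message broker

    def delay(task_name, *args):
        """Producer side: serialize a task message and push it onto the broker.
        This is roughly what calling some_task.delay(...) does in Celery."""
        broker.put(json.dumps({'task': task_name, 'args': args}))

    def worker(results):
        """Consumer side: what a `celery worker` process does -- pull messages
        off the broker and run the matching task function."""
        while True:
            msg = json.loads(broker.get())
            if msg['task'] == 'stop':
                break
            results.append('fanout for user %s' % msg['args'][0])

    results = []
    t = threading.Thread(target=worker, args=(results,))
    t.start()
    delay('fanout', 42)   # returns immediately; the work happens in the worker
    delay('stop')
    t.join()

With CELERY_ALWAYS_EAGER = True the producer skips the broker and runs the task inline in the same process, which is why everything can work even when no worker is consuming the queue.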

Any tips are much appreciated :)

Adding to feed:

    activity = Activity(wishlist.user, WishVerb, in_prod.id)
    feed = UserWishFeed(wishlist.user.id)
    feed.insert_activity(activity)
    feed.add(activity)
    feedly.add_wish(wishlist.user, activity)

A view for fetching the friends-feed:

    @csrf_exempt
    @api_view(['GET'])
    @login_required
    def friends_wish_feed(request, *args, **kwargs):
        user = GiftItEndUser.objects.get(pk=kwargs['pk'])
        if request.user.id != user.pk:
            content = {"message": "you do not have permission", "status": 0}
            json_data = json.dumps(content)
            return Response(json_data)

        feed = feedly.get_feeds(request.user.id)['normal']
        act_list = []
        activities = list(feed[:25])
        for act in activities:
            act_list.append({'user_id': act.actor_id, 'product_id': act.object_id})
        json_data = json.dumps(act_list)  # dump once, after the loop
        return Response(json_data)

Some celery settings:

    FEEDLY_NYDUS_CONFIG = {
        'CONNECTIONS': {
            'redis': {
                'engine': 'nydus.db.backends.redis.Redis',
                'router': 'nydus.db.routers.redis.PrefixPartitionRouter',
                'hosts': {
                    0: {'prefix': 'default', 'db': 0, 'host': '127.0.0.1', 'port': 6379},
                }
            },
        }
    }

    BROKER_URL = 'redis://localhost:6379/0'
    CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'

tbarbugli commented 10 years ago

Your settings look ok. Can you add the version of Celery you are running and the command you use to start the worker?


andrenro commented 10 years ago

Starting the worker (I want to start it in detached mode, as a daemon):

    celery worker -A giftit_webapp -l info --pidfile=/var/run/celery/%n.pid --logfile=/var/log/celery/%n.pid -D

celery --version: 3.1.5 (Cipater)

tbarbugli commented 10 years ago

That seems ok. My best guess is that the worker fails to discover the tasks or is not loading your settings; I suggest you have a look at these docs https://docs.celeryproject.org/en/latest/django/first-steps-with-django.html and make sure you are following all the instructions. Are you running RabbitMQ on that machine? If yes, I suggest you stop it and restart the worker; that will make things easier to set up. Another hint comes from the Celery worker's initial output, where you should see the list of discovered tasks; can you add that too?


andrenro commented 10 years ago

I have followed the first-steps-with-django tutorial, so that's why I'm a bit puzzled :) A weird thing is that when I start the Celery worker with "-D" for detached mode, it does not output anything. Is this maybe an indication of something bad? If I run the worker in a separate window I do get this output:

Start worker (non-detached):

    celery worker -A giftit_webapp -l info

[2014-01-08 11:40:56,537: WARNING/MainProcess] /usr/local/lib/python2.7/dist-packages/celery/apps/worker.py:159: CDeprecationWarning: Starting from version 3.2 Celery will refuse to accept pickle by default.

The pickle serializer is a security concern as it may give attackers the ability to execute any command. It's important to secure your broker from unauthorized access when using pickle, so we think that enabling pickle should require a deliberate action and not be the default choice.

If you depend on pickle then you should set a setting to disable this warning and to be sure that everything will continue working when you upgrade to Celery 3.2::

CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']

You must only enable the serializers that you will actually use.

warnings.warn(CDeprecationWarning(W_PICKLE_DEPRECATED))

     -------------- celery@theodor v3.1.5 (Cipater)
    ---- **** -----
    --- * ***  * -- Linux-3.2.0-24-generic-x86_64-with-Ubuntu-12.04-precise
    -- * - **** ---
    - ** ---------- [config]
    - ** ---------- .> broker:      redis://localhost:6379/0
    - ** ---------- .> app:         tasks:0x279e490
    - ** ---------- .> concurrency: 16 (prefork)
    - *** --- * --- .> events:      OFF (enable -E to monitor this worker)
    -- ******* ----
    --- ***** ----- [queues]
     -------------- .> celery           exchange=celery(direct) key=celery

    [tasks]
      . giftit_webapp.celery.debug_task
      . wishlists.tasks.add
      . wishlists.tasks.mul
      . wishlists.tasks.xsum

    [2014-01-08 11:40:56,575: INFO/MainProcess] Connected to redis://localhost:6379/0
    [2014-01-08 11:40:56,584: INFO/MainProcess] mingle: searching for neighbors
    [2014-01-08 11:40:57,593: INFO/MainProcess] mingle: all alone
    [2014-01-08 11:40:57,599: WARNING/MainProcess] celery@theodor ready.

tbarbugli commented 10 years ago

I suggest you not run the worker with -D. From the output I can see that the worker is connecting to the right broker (Redis on localhost) but it's not discovering the feedly tasks; perhaps you forgot to include the code for task autodiscovery based on Django's INSTALLED_APPS?


andrenro commented 10 years ago

hmm, I will need to run this in the background at some point, as it will be used in production..

this is my celery.py file:

    from __future__ import absolute_import

    import os

    from celery import Celery
    from django.conf import settings

    # set the default Django settings module for the 'celery' program.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'giftit_webapp.settings')

    app = Celery('giftit_webapp')

    # Using a string here means the worker will not have to
    # pickle the object when using Windows.
    app.config_from_object('django.conf:settings')
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

    @app.task(bind=True)
    def debug_task(self):
        print('Request: {0!r}'.format(self.request))

A weird thing, however, is that when I stop RabbitMQ I cannot use commands like "celery status" or other celery commands to inspect nodes etc. If I have RabbitMQ running, the celery commands work fine. I get this error: socket.error: [Errno 111] Connection refused

I think I stopped it with rabbitmqctl, but is rabbitmq maybe still running as a daemon?

    ps aux | grep rabbitmq
    root      6720  0.0  0.0   9348  660 pts/10 S+ 12:05 0:00 grep --color=auto rabbitmq
    rabbitmq 16065  0.0  0.0   7400  324 ?      S  2013  0:16 /usr/lib/erlang/erts-5.8.5/bin/epmd -daemon

tbarbugli commented 10 years ago

-D is not the only way to run Celery in production; I personally find supervisord a much easier way to manage Django and Celery processes in a production environment, but that's a whole different topic :) Is feedly listed in your INSTALLED_APPS?


andrenro commented 10 years ago

feedly is not listed in my INSTALLED_APPS.. just add 'feedly'?

andrenro commented 10 years ago

Do you prefer to run Celery via the /etc/init.d/celeryd daemon method? I tried that earlier with no luck; that's why I'm trying the current method now :)

tbarbugli commented 10 years ago

The Celery task discovery loops over INSTALLED_APPS; adding feedly will probably solve your issue.
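Concretely, that is a one-line settings change. The app list below is illustrative (only 'feedly' and 'wishlists' come from this thread); the point is that autodiscovery can only import tasks from apps it can see:

    # settings.py -- illustrative app list; 'feedly' must appear here so that
    # app.autodiscover_tasks(lambda: settings.INSTALLED_APPS) imports
    # feedly's task module and registers its fanout tasks with the worker.
    INSTALLED_APPS = (
        'django.contrib.auth',
        'django.contrib.contenttypes',
        'wishlists',
        'feedly',  # <-- without this, the worker never sees feedly's tasks
    )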

2014/1/8 Andreas Røed notifications@github.com

feedly is not in listed in my INSTALLED_APPS .. just add 'feedly'?

— Reply to this email directly or view it on GitHubhttps://github.com/tschellenbach/Feedly/issues/20#issuecomment-31823125 .

tbarbugli commented 10 years ago

I find supervisord (http://supervisord.org/) a much easier way to handle Celery (and in general every long-running process in a production environment).
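A minimal supervisord program entry for a setup like this might look as follows (the paths, user, and log file are placeholders, not taken from this thread; check the supervisord docs for your layout):

    ; /etc/supervisor/conf.d/celery.conf -- illustrative values only
    [program:celery]
    command=celery worker -A giftit_webapp -l info
    directory=/path/to/giftit_webapp
    user=someuser
    autostart=true
    autorestart=true
    stopwaitsecs=60
    stdout_logfile=/var/log/celery/worker.log
    redirect_stderr=true

supervisord then keeps the worker in the foreground from its own point of view (no -D needed) and restarts it if it dies.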


andrenro commented 10 years ago

Ah thanks!!! It actually worked after adding feedly!! Geez, what a blunder on my side.. :) Now I only need to run the worker in the background! Will try supervisord.

andrenro commented 10 years ago

I just ran it from /etc/init.d/celeryd start and it works great for now. Thanks a bunch for the help, you just took 10 lbs of frustration off my back :)

andrenro commented 10 years ago

Seems my problems did not go away entirely :/ When I have a running worker I get two strange errors, depending on the action: if I add something to a friends-feed I get an error on a Django model import statement. Also, when I want to add something to a friends-feed with feedly I need to comment out register(WishVerb), and when I want to GET the feed I need to uncomment it again.. Seems to be something with the registration of the verbs? The model import error is also very strange.

In wishlists/models.py:

    class WishVerb(Verb):
        id = 5
        infinitive = 'wish'
        past_tense = 'wished'

    register(WishVerb)

A non-detached Celery worker prints this first when I try to add something to the fanout feed:

    [2014-01-14 13:02:16,166: CRITICAL/MainProcess] Can't decode message body: ValueError("cant register verb <class 'wishlists.models.WishVerb'> with id 5 (clashing with verb <class 'wishlists.models.WishVerb'>)",)
    (type:'application/x-python-serialize' encoding:'binary' raw:"'\x80\x02}q\x01(U\x07expiresq\x02NU\x03utcq\x03\x88U\x04argsq\x04]q\x05U\x05chordq\x06NU\tcallbacksq\x07NU\x08errbacksq\x08NU\x07tasksetq\tNU\x02idq\nU$e0522ddd-4fbb-42bc-8466-dbba1257e8a6q\x0bU\x07retriesq\x0cK\x00U\x04taskq\rU)feedly.tasks.fanout_operation_hi_priorityq\x0eU\ttimelimitq\x0fNN\x86U\x03etaq\x10NU\x06kwargsq\x11}q\x12(U\x10operation_kwargsq\x13}q\x14(U\x04trimq\x15\x88U\nactivitiesq\x16]q\x17cfeedly.activity\nActivity\nq\x18)\x81q\x19}q\x1a(U\x06targetq\x1bNU\x05actorq\x1cccopy_reg\n_reconstructor\nq\x1dcemail_auth.models\nGiftItEndUser\nq\x1ecbuiltin\nobject\nq\x1fN\x87Rq... (1431b)"')
    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/dist-packages/kombu/messaging.py", line 583, in _receive_callback
        decoded = None if on_m else message.decode()
      File "/usr/local/lib/python2.7/dist-packages/kombu/message.py", line 123, in decode
        self.content_encoding, accept=self.accept)
      File "/usr/local/lib/python2.7/dist-packages/kombu/serialization.py", line 165, in loads
        return decode(data)
      File "/usr/local/lib/python2.7/dist-packages/kombu/serialization.py", line 49, in pickle_loads
        return load(BytesIO(s))
      File "/home/andrenro/dev_giftit_webapp/env/Scripts/giftit_webapp/email_auth/models.py", line 12, in <module>
        from wishlists.models import Wishlist
      File "/home/andrenro/dev_giftit_webapp/env/Scripts/giftit_webapp/wishlists/models.py", line 27, in <module>
        register(WishVerb)
      File "/usr/local/lib/python2.7/dist-packages/feedly/verbs/__init__.py", line 17, in register
        (verb, verb.id, registered_verb))
    ValueError: cant register verb <class 'wishlists.models.WishVerb'> with id 5 (clashing with verb <class 'wishlists.models.WishVerb'>)
    [2014-01-14 13:04:41,831: INFO/MainProcess] Received task: feedly.tasks.fanout_operation_hi_priority[d1bb9ca5-3509-45c4-8e73-d8d1e26eb7df]
    [2014-01-14 13:04:41,873: WARNING/Worker-15] /usr/local/lib/python2.7/dist-packages/django/contrib/localflavor/__init__.py:5: DeprecationWarning: django.contrib.localflavor is deprecated. Use the separate django-localflavor package instead. DeprecationWarning)

Then something like 14 workers print exit codes, and lastly:

    [2014-01-14 13:13:19,311: ERROR/Worker-7] Pool process <Worker(Worker-7, started daemon)> error: ImportError('cannot import name GiftItUser',)
    Traceback (most recent call last):
      File "/usr/local/lib/python2.7/dist-packages/billiard/pool.py", line 285, in run
        sys.exit(self.workloop(pid=pid))
      File "/usr/local/lib/python2.7/dist-packages/billiard/pool.py", line 338, in workloop
        req = wait_for_job()
      File "/usr/local/lib/python2.7/dist-packages/billiard/pool.py", line 429, in receive
        ready, req = _receive(1.0)
      File "/usr/local/lib/python2.7/dist-packages/billiard/pool.py", line 401, in _recv
        return True, loads(get_payload())
      File "/usr/local/lib/python2.7/dist-packages/billiard/common.py", line 70, in pickle_loads
        return load(BytesIO(s))
      File "/home/andrenro/dev_giftit_webapp/env/Scripts/giftit_webapp/email_auth/models.py", line 12, in <module>
        from wishlists.models import Wishlist
      File "/home/andrenro/dev_giftit_webapp/env/Scripts/giftit_webapp/wishlists/models.py", line 157, in <module>
        class WishlistSerializer(serializers.HyperlinkedModelSerializer):
      File "/home/andrenro/dev_giftit_webapp/env/Scripts/giftit_webapp/wishlists/models.py", line 169, in WishlistSerializer
        custom_wishes = CustomWishSerializer()
      File "/usr/local/lib/python2.7/dist-packages/rest_framework/serializers.py", line 159, in __init__
        self.fields = self.get_fields()
      File "/usr/local/lib/python2.7/dist-packages/rest_framework/serializers.py", line 196, in get_fields
        default_fields = self.get_default_fields()
      File "/usr/local/lib/python2.7/dist-packages/rest_framework/serializers.py", line 655, in get_default_fields
        reverse_rels = opts.get_all_related_objects()
      File "/usr/local/lib/python2.7/dist-packages/django/db/models/options.py", line 405, in get_all_related_objects
        include_proxy_eq=include_proxy_eq)]
      File "/usr/local/lib/python2.7/dist-packages/django/db/models/options.py", line 417, in get_all_related_objects_with_model
        self._fill_related_objects_cache()
      File "/usr/local/lib/python2.7/dist-packages/django/db/models/options.py", line 440, in _fill_related_objects_cache
        for klass in get_models(include_auto_created=True, only_installed=False):
      File "/usr/local/lib/python2.7/dist-packages/django/db/models/loading.py", line 197, in get_models
        self._populate()
      File "/usr/local/lib/python2.7/dist-packages/django/db/models/loading.py", line 75, in _populate
        self.load_app(app_name)
      File "/usr/local/lib/python2.7/dist-packages/django/db/models/loading.py", line 96, in load_app
        models = import_module('.models', app_name)
      File "/usr/local/lib/python2.7/dist-packages/django/utils/importlib.py", line 35, in import_module
        __import__(name)
      File "/home/andrenro/dev_giftit_webapp/env/Scripts/giftit_webapp/merchants/models.py", line 101, in <module>
        from email_auth.models import GiftItUser
    ImportError: cannot import name GiftItUser
    [2014-01-14 13:13:19,317: ERROR/MainProcess] Process 'Worker-7' pid:18597 exited with exitcode 1
    [2014-01-14 13:13:19,332: INFO/Worker-17] ============================== starting fanout ==============================
    [2014-01-14 13:13:19,332: INFO/Worker-17] starting batch interface for feed <class 'wishlists.models.WishFeed'>, fanning out to 3 users
    [2014-01-14 13:13:19,334: INFO/Worker-17] finished fanout for feed <class 'wishlists.models.WishFeed'>
    [2014-01-14 13:13:19,336: INFO/MainProcess] Task feedly.tasks.fanout_operation_hi_priority[dc52cf53-2572-47c9-a20f-4503bcc60d13] succeeded in 0.00370556116104s: "3 user_ids, <class...

    [2014-01-14 13:19:05,460: INFO/MainProcess] Received task: feedly.tasks.fanout_operation_hi_priority[3c021d6a-550d-46ad-ab14-1fa5d90e2177]
    [2014-01-14 13:19:05,465: INFO/Worker-19] ============================== starting fanout ==============================
    [2014-01-14 13:19:05,466: INFO/Worker-19] starting batch interface for feed <class 'wishlists.models.WishFeed'>, fanning out to 3 users
    [2014-01-14 13:19:05,468: INFO/Worker-19] finished fanout for feed <class 'wishlists.models.WishFeed'>
    [2014-01-14 13:19:05,470: INFO/MainProcess] Task feedly.tasks.fanout_operation_hi_priority[3c021d6a-550d-46ad-ab14-1fa5d90e2177] succeeded in 0.00487464666367s: "3 user_ids, <class...

It actually did the fanout, and it has worked at some points, so it seems to be a somewhat intermittent problem? Could feedly/celery be trying to import my models several times, and why does the registration of feedly verbs not always work? This also seems to be a problem when I try to run the workers detached.. I know it's a lot of info, but maybe there is a simple solution I'm overlooking? :) Thanks again.. Great project!

tbarbugli commented 10 years ago

The error related to the verb seems to be caused by multiple registrations of the same verb id with different verb objects. I suggest you debug this in feedly.verbs.register to see which verbs get registered. What command do you use to start the Celery workers?
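To see why the clash can involve what looks like "the same" class, here is a toy registry in the spirit of feedly's (an illustrative sketch, not feedly's actual code): if a module that calls register() at import time gets imported twice, register() receives two distinct class objects carrying the same id.

    # Toy verb registry illustrating the clash; feedly's real registry
    # lives in feedly.verbs and raises a similar error on duplicate ids.
    VERBS = {}

    def register(verb):
        registered = VERBS.get(verb.id)
        if registered is not None and registered is not verb:
            raise ValueError('cant register verb %r with id %s (clashing with %r)'
                             % (verb, verb.id, registered))
        VERBS[verb.id] = verb

    class WishVerb(object):
        id = 5

    register(WishVerb)
    register(WishVerb)                # same class object again: harmless

    class WishVerbReimported(object): # simulates a second import of the module
        id = 5

    try:
        register(WishVerbReimported)  # different object, same id: clash
        clashed = False
    except ValueError:
        clashed = True

This matches the symptom in the traceback, where the error message names the same class path on both sides of the clash.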


andrenro commented 10 years ago

I've tried different things (I'm currently trying to set up supervisord like you suggested earlier), but what has worked best so far is running a non-detached worker as a non-root user: celery worker -A giftit_webapp -l info. It does connect to Redis and it detects the feedly tasks from settings, but it still complains about the model import error. It seems every worker tries and fails to import it, yet the fanout still works in some cases..

tbarbugli commented 10 years ago

I doubt this error is related to how you start Celery, detached or not. Looking at the stack trace, it seems there is a problem with the import order of your models. Maybe you can get around this by moving the import (from wishlists.models import Wishlist) inside the function that needs Wishlist.
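The deferred-import trick can be demonstrated in isolation. This sketch builds two throwaway modules (the names mod_a/mod_b are made up) that mirror the email_auth.models <-> wishlists.models cycle in the traceback:

    import os
    import sys
    import tempfile
    import textwrap

    # Two tiny modules that depend on each other, like the cycle in the
    # traceback (email_auth.models -> wishlists.models -> email_auth.models).
    d = tempfile.mkdtemp()
    with open(os.path.join(d, 'mod_a.py'), 'w') as f:
        f.write(textwrap.dedent('''
            # A top-level "from mod_b import VALUE" here could fail while
            # mod_b is still half-initialised; deferring the import into
            # the function that needs it breaks the cycle.
            def get_b_value():
                from mod_b import VALUE
                return VALUE
        '''))
    with open(os.path.join(d, 'mod_b.py'), 'w') as f:
        f.write(textwrap.dedent('''
            import mod_a          # the other half of the circular dependency
            VALUE = 42
        '''))
    sys.path.insert(0, d)
    import mod_b
    result = mod_b.mod_a.get_b_value()  # works despite the cycle

By the time get_b_value() runs, mod_b is fully initialised in sys.modules, so the lazy import succeeds where a top-level one might have hit a half-built module.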


tbarbugli commented 10 years ago

Hi, I am going to update the docs and close this ticket for now. Feel free to reopen it if you still have issues setting up Celery.