wimleers / fileconveyor

File Conveyor is a daemon written in Python to detect, process and sync files. In particular, it's designed to sync files to CDNs. Amazon S3 and Rackspace Cloud Files, as well as any Origin Pull or (S)FTP Push CDN, are supported. Originally written for my bachelor thesis at Hasselt University in Belgium.
https://wimleers.com/fileconveyor
The Unlicense

When a newly created file is deleted before it was synced, File Conveyor crashes on a logging statement #112

Closed: mrcwinn closed this issue 11 years ago

mrcwinn commented 12 years ago

After updating boto and making a few other changes, FileConveyor finally started to run. The bad news: while it does successfully upload images from my Drupal install, it only gets through about two files at a time before crashing. Any ideas?

    2012-04-05 19:34:44,367 - Arbitrator - ERROR - Unhandled exception of type 'KeyError' detected, arguments: '(None,)'.
    Traceback (most recent call last):
      File "arbitrator.py", line 283, in run
        self.process_discover_queue()
      File "arbitrator.py", line 368, in process_discover_queue
        self.logger.info("Pipeline queue: merged events for '%s': %s + %s cancel each other out, thus removed this file." % (input_file, FSMonitor.EVENTNAMES[old_event], FSMonitor.EVENTNAMES[event], FSMonitor.EVENTNAMES[merged_event]))
    KeyError: None
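The last frame shows what goes wrong mechanically: merged_event is None, and FSMonitor.EVENTNAMES is indexed with it. A stand-alone reproduction of just that failure, with an illustrative dict in place of the real lookup table:

    # Minimal illustration of the failure mode; this dict stands in for
    # FSMonitor.EVENTNAMES, whose real keys are FSMonitor's event
    # constants, not these literal integers.
    EVENTNAMES = {1: "CREATED", 2: "MODIFIED", 3: "DELETED"}

    old_event, event = 1, 3   # a file is CREATED, then DELETED before syncing
    merged_event = None       # the two events cancel each other out

    # Same shape as the crashing statement: indexing the lookup table
    # with None raises KeyError: None before the message is formatted.
    message = "%s + %s = %s" % (EVENTNAMES[old_event],
                                EVENTNAMES[event],
                                EVENTNAMES[merged_event])  # KeyError: None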

mrfelton commented 12 years ago

I have the same problem.

mrfelton commented 12 years ago

I hackishly resolved it by commenting out the offending line in arbitrator.py. All it does is log something, and a missing line in the log doesn't bother me enough to spend more time investigating and fixing the root cause.

I should note that I'm getting this problem with the new cumulus transporter (an update of mosso, found in my fork at https://github.com/systemseed/fileconveyor).

ghost commented 12 years ago

Which line should I comment out? (I get the error with your forked version too.)

wimleers commented 11 years ago

I'm not sure what's happening here. This should be impossible. Note that this was implemented in #68.
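For reference, the merging from #68 behaves roughly like this (a simplified sketch with stand-in constants, not the project's exact code): a CREATED event followed by a DELETED event for a not-yet-synced file cancels out to no event at all, which is exactly the None that ends up in the EVENTNAMES lookup:

    # Simplified sketch of the event merging from #68 (assumed behavior,
    # with stand-in constants for FSMonitor.CREATED/MODIFIED/DELETED).
    CREATED, MODIFIED, DELETED = 1, 2, 3

    def merge_events(old_event, new_event):
        # A file that was created and then deleted before it was ever
        # synced needs no work at all: the events cancel, so there is
        # no merged event.
        if old_event == CREATED and new_event == DELETED:
            return None
        # Creating and then modifying a file is still just a pending
        # creation from the pipeline's point of view.
        if old_event == CREATED and new_event == MODIFIED:
            return CREATED
        # Simplified default: the newer event wins.
        return new_event

    assert merge_events(CREATED, DELETED) is None  # the case behind KeyError: None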

To debug this problem, could you please add the following code:

                    self.logger.info("all events: %d, %d, %d" % (old_event, event, merged_event))
                    self.logger.info("old event: %s" % (FSMonitor.EVENTNAMES[old_event]))
                    self.logger.info("new event: %s" % (FSMonitor.EVENTNAMES[event]))
                    self.logger.info("merged event: %s" % (FSMonitor.EVENTNAMES[merged_event]))

before this line:

                    self.logger.info("Pipeline queue: merged events for '%s': %s + %s = %s." % (input_file, FSMonitor.EVENTNAMES[old_event], FSMonitor.EVENTNAMES[event], FSMonitor.EVENTNAMES[merged_event]))

That should help us narrow it down. You could also enable debug logging by setting the log level to logging.DEBUG in your settings.py.
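Concretely, something like this in settings.py (assuming your checkout uses the CONSOLE_LOGGER_LEVEL and FILE_LOGGER_LEVEL settings; adjust the names if yours differs):

    # In settings.py: raise both log levels to DEBUG so the merged-event
    # messages are captured (setting names assumed; adjust to your copy).
    import logging

    CONSOLE_LOGGER_LEVEL = logging.DEBUG
    FILE_LOGGER_LEVEL = logging.DEBUG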

wimleers commented 11 years ago

Okay, so it really was very, very silly. Fixing commit coming up.
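For anyone hitting this before the fix lands, the gist is to stop indexing EVENTNAMES with a None merged_event in the cancel branch; a sketch of the idea (the actual commit may differ):

    # Sketch of the fix (the actual commit may differ): handle the
    # "events cancel out" case before touching EVENTNAMES[merged_event],
    # because merged_event is None there and EVENTNAMES[None] raises.
    if merged_event is None:
        self.logger.info("Pipeline queue: merged events for '%s': %s + %s cancel each other out, thus removed this file." % (input_file, FSMonitor.EVENTNAMES[old_event], FSMonitor.EVENTNAMES[event]))
    else:
        self.logger.info("Pipeline queue: merged events for '%s': %s + %s = %s." % (input_file, FSMonitor.EVENTNAMES[old_event], FSMonitor.EVENTNAMES[event], FSMonitor.EVENTNAMES[merged_event]))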