wimleers / fileconveyor

File Conveyor is a daemon written in Python to detect, process and sync files. In particular, it's designed to sync files to CDNs. Amazon S3 and Rackspace Cloud Files, as well as any Origin Pull or (S)FTP Push CDN, are supported. Originally written for my bachelor thesis at Hasselt University in Belgium.
https://wimleers.com/fileconveyor
The Unlicense

Prevent multiple simultaneous instances? #151

Open · wimleers opened this issue 10 years ago

wimleers commented 10 years ago

It seems #96 was caused by this:

For the casual layman getting this error, I discovered I was inadvertently running multiple instances of arbitrator.py on my system. I think I did this by running 'sudo python arbitrator.py &' more than once during the setup process :P To find the processes I used 'ps -A | grep python' and then killed them with 'sudo kill -9 PID'. For good measure I also deleted the fsmonitor.db, persistent_data.db, and synced_files.db files and then restarted fileconveyor. I don't have a large file system so this wasn't too bad, but I can imagine this could be a big pain in the ass for larger websites.

(Reported by @adunk on #96.)

File Conveyor shouldn't be this fragile in the face of such a small user mistake.

We could prevent multiple instances of File Conveyor from running simultaneously in some way. Maybe by acquiring a lock on a dummy table in one of the SQLite DBs? Any other ideas?
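
One option that avoids touching the SQLite schemas: at startup, take a non-blocking exclusive flock on a lock file and hold it for the daemon's lifetime; a second arbitrator.py would fail to acquire the lock and exit immediately with a clear message. A minimal sketch only, not File Conveyor's actual startup code; the lock-file path and function name are made up for illustration:

```python
# Sketch: refuse to start if another arbitrator.py already holds the lock.
# LOCK_FILE is a hypothetical path; File Conveyor does not define it today.
import fcntl
import os
import sys

LOCK_FILE = "arbitrator.lock"

def acquire_single_instance_lock(path=LOCK_FILE):
    """Return an open, locked file handle; exit if another instance holds the lock."""
    handle = open(path, "w")
    try:
        # LOCK_EX: exclusive lock; LOCK_NB: fail immediately instead of blocking.
        fcntl.flock(handle, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except (IOError, OSError):
        sys.exit("Another File Conveyor instance (arbitrator.py) is already running.")
    handle.write(str(os.getpid()))
    handle.flush()
    return handle  # keep this handle open; the lock is released when the process exits

if __name__ == "__main__":
    lock_handle = acquire_single_instance_lock()
    # ... start the arbitrator as usual ...
```

The SQLite idea would work similarly: open a dedicated lock database (not one of the data DBs, so the held lock doesn't block normal writes) and issue BEGIN EXCLUSIVE with a zero timeout; if that raises sqlite3.OperationalError, another instance is running. Either way the check is advisory and per-machine only, which should be enough for this use case.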