File Conveyor is a daemon written in Python to detect, process, and sync files. In particular, it's designed to sync files to CDNs. Amazon S3 and Rackspace Cloud Files are supported, as well as any Origin Pull or (S)FTP Push CDN. It was originally written for my bachelor thesis at Hasselt University in Belgium.
I'm migrating my sites from a Windows server to a Linux server. With that, I can now use fileconveyor and a CDN, but I have some questions about running multiple sites, each with its own S3 bucket.
Do I have to run multiple instances of fileconveyor to do this, or is there a way to format the config file to assign different sources to different servers?
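For concreteness, here is a sketch of what a single config mapping two sites to two S3 buckets might look like, based on the structure of fileconveyor's sample config.xml: multiple `<source>` entries, one `<server>` per bucket, and one `<rule>` per source whose `<destination>` names the matching server. The site paths, bucket names, credentials, and the exact S3 setting element names (access_key_id, secret_access_key, bucket_name) are assumptions to verify against the sample config shipped with the project.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<config>
  <!-- One <source> per site; each scanPath points at that site's files. -->
  <sources ignoredDirs="CVS:.svn">
    <source name="site1" scanPath="/var/www/site1" documentRoot="/var/www/site1" basePath="/" />
    <source name="site2" scanPath="/var/www/site2" documentRoot="/var/www/site2" basePath="/" />
  </sources>

  <!-- One <server> per S3 bucket (transporter="s3"); setting names are assumed. -->
  <servers>
    <server name="s3 bucket site1" transporter="s3">
      <access_key_id>YOUR_ACCESS_KEY</access_key_id>
      <secret_access_key>YOUR_SECRET_KEY</secret_access_key>
      <bucket_name>site1-bucket</bucket_name>
    </server>
    <server name="s3 bucket site2" transporter="s3">
      <access_key_id>YOUR_ACCESS_KEY</access_key_id>
      <secret_access_key>YOUR_SECRET_KEY</secret_access_key>
      <bucket_name>site2-bucket</bucket_name>
    </server>
  </servers>

  <!-- One <rule> per source; its <destination> names the matching server. -->
  <rules>
    <rule for="site1" label="site1 static files">
      <filter>
        <extensions>ico:js:css:gif:png:jpg:jpeg:svg:swf</extensions>
      </filter>
      <destinations>
        <destination server="s3 bucket site1" path="static" />
      </destinations>
    </rule>
    <rule for="site2" label="site2 static files">
      <filter>
        <extensions>ico:js:css:gif:png:jpg:jpeg:svg:swf</extensions>
      </filter>
      <destinations>
        <destination server="s3 bucket site2" path="static" />
      </destinations>
    </rule>
  </rules>
</config>
```

If the config behaves as sketched, one daemon instance would be enough, since each rule only pushes its own source to the bucket named in its destination.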