hughsk / s3-sync

A streaming interface for uploading multiple files to S3.

Keep a remote copy of the cache db saved on S3 #2

Closed christophercliff closed 11 years ago

christophercliff commented 11 years ago

This is not quite working yet -- there's a problem in the getCache callback where the stream events aren't being triggered. Suggested usage:

var s3sync = require('s3-sync')
var readdirp = require('readdirp')

// Credentials and target bucket for the uploads.
var syncer = s3sync({
    key: YOUR_KEY,
    secret: YOUR_SECRET,
    bucket: YOUR_BUCKET
})

// Recursively stream every file under ./build.
var files = readdirp({
    root: __dirname + '/build'
})

// Pull down the remote cache db before uploading, then
// push it back up once the sync has finished.
syncer.getCache(function(err){
    if (err) throw err

    files.pipe(syncer)
        .on('data', function(file){ console.log(file.fullPath + ' -> ' + file.url) })
        .once('end', function(){
            syncer.putCache(function(){
                console.log('Done!')
            })
        })
})

Also added level as a dependency--I think that would be the typical use case, no?

Fixes issue #1

hughsk commented 11 years ago

Nice, thanks! I'll have a look at this now - the convention with level is to supply the database yourself, in case someone wants to extend it beforehand or use an alternative module with an equivalent API. I'll include it as a peer dependency :)
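To make that concrete, here's a minimal sketch of what supplying your own database could look like, assuming the constructor ends up taking a level-compatible db as its first argument (that signature is an assumption here, not something settled by this PR):

// Sketch only: assumes s3sync accepts a levelup-style db
// as its first argument.
var level = require('level')
var s3sync = require('s3-sync')

// Anything exposing the levelup API would do here: a sublevel,
// an in-memory store, and so on.
var db = level(__dirname + '/cache')

var syncer = s3sync(db, {
    key: YOUR_KEY,
    secret: YOUR_SECRET,
    bucket: YOUR_BUCKET
})

The peer dependency itself would then be declared in package.json under peerDependencies rather than dependencies (the version range below is just a placeholder):

"peerDependencies": {
    "level": "*"
}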

Do you have any more specifics on the streaming issue? Or is it just not firing at all?

hughsk commented 11 years ago

Merged! I've made a couple of small changes, and added an example to the repo - I think the only issue before was that the file stream needs to get piped to syncer in the same tick it's created.
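For reference, a minimal sketch of that fix as applied to the suggested usage above (the wiring is illustrative, not the exact code that was merged):

// Classic (streams1) readables can start emitting 'data'
// before a pipe() attached on a later tick, silently dropping
// entries. Creating the readdirp stream inside the callback
// keeps creation and piping in the same tick.
syncer.getCache(function(err){
    if (err) throw err

    var files = readdirp({
        root: __dirname + '/build'
    })

    files.pipe(syncer)
})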

christophercliff commented 11 years ago

:thumbsup: Great! Fun project, learned a lot. Now to implement in http://caisson.co/

hughsk commented 11 years ago

Sweet, glad to hear it :) And nice, saw this the other day - look forward to seeing the results!