lumphe / ftp-ts

FTP-ts is an FTP client module for Node that provides an asynchronous interface for communicating with an FTP server.
MIT License

monitor put/get progress? #3

Closed warpdesign closed 6 years ago

warpdesign commented 6 years ago

Is there a way to monitor put/get transfer progress?

I tried adding a handler for on('data') on the read/write stream, but it's never called.

TimLuq commented 6 years ago

Thanks for the issue, @warpdesign.

The stream returned from get() should be an instance of import('stream').Readable, and as such you should be able to attach a data event listener.

Are you using encryption and/or compression for the transfer? When looking into this, the only thing I saw on a first pass was the possibility of the stream pausing indefinitely when compression is used (for which no test cases exist either). Could you provide a minimal example where it doesn't behave as expected?

warpdesign commented 6 years ago

For example, this code never triggers a data event, and writeStream.bytesWritten is always 0:

```js
const Client = require('ftp-ts').FTP;
const createWriteStream = require('fs').createWriteStream;

// connect to myhost:21 as myuser
Client.connect({ host: "myhost", user: 'myuser', password: 'password', port: 21 }).then(async (c) => {
  const stream = await c.get('BigFile');
  const writeStream = createWriteStream('BigFile');
  stream.pipe(writeStream);
  stream.on('data', () => { console.log('data') });
  stream.once('close', function () { c.end(); });
  setInterval(() => {
    console.log('bytesWritten', writeStream.bytesWritten, stream.bytesWritten);
  }, 1000)
});
```

If I interrupt the download, I also notice that the local BigFile's length is 0.

The following code, using the ftp package that ftp-ts is based on, works and triggers the data event as it should:

```js
var Client = require('ftp');
var fs = require('fs');

var c = new Client();
c.on('ready', function () {
  c.get('BigFile', function (err, stream) {
    if (err) throw err;
    const writeStream = fs.createWriteStream('BigFile');
    stream.once('close', function () { c.end(); });
    stream.pipe(writeStream);
    stream.on('data', (chunk) => console.log('data', chunk.length));
    setInterval(() => {
      console.log('bytesWritten', writeStream.bytesWritten);
    }, 1000);
  });
});

c.connect({
  host: 'myftp',
  user: 'username',
  password: 'password'
});
```

Also, writeStream.bytesWritten increases over time as expected.

TimLuq commented 6 years ago

We've fixed the problem and released a new version, 1.0.8.