alexcroox opened this issue 9 years ago
OK, after creating a new connection for every single file I seem to be having more success! Is this the correct behaviour? It seems like it should be slower and less efficient, but it seems OK in my testing!
OK, the connection-per-file approach won't work: a tester of mine got a "too many connections from this IP" error after a while. Back to the drawing board....
I was just bitten by this as well. It looks like the problem is in `connection.js` with `self._pasvSocket = socket;`. It replaces `_pasvSocket` each time. So, for example, GET 1 sets `_pasvSocket`, GET 2 overrides `_pasvSocket`, then GET 1 completes, which clears `_pasvSocket`, so GET 2 fails with `Unable to make data connection`. In my testing it was always the third GET that caused the problem, but I would guess it depends on the timing of each GET.
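A minimal sketch of that interleaving (the file names and callbacks are placeholders, assuming a single connected client `c`):

```js
// Two overlapping gets on one shared control connection:
c.get("a.txt", (err, stream) => { /* GET 1: sets _pasvSocket */ });
c.get("b.txt", (err, stream) => { /* GET 2: overwrites _pasvSocket before GET 1 finishes */ });
// When GET 1's data connection closes, the shared _pasvSocket is cleared,
// so GET 2 can fail with "Unable to make data connection".
```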
@phillipgreenii Did you find a solution for this?
@vintuwei I did not. I ended up enforcing one file at a time via `async.series`. You can see my code at https://github.com/phillipgreenii/node-ftp-stream/blob/master/src/index.js
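For reference, a minimal sketch of that serialization approach (assuming a connected node-ftp client `ftp` and a `files` array of remote paths; this is not the linked code):

```js
const async = require("async");
const fs = require("fs");

// Build one task per file; async.series runs them strictly one at a time.
const tasks = files.map((file) => (done) => {
  ftp.get(file, (err, stream) => {
    if (err) return done(err);
    stream.once("close", () => done()); // next download starts only after this one closes
    stream.pipe(fs.createWriteStream(file)); // write to a same-named local file
  });
});

async.series(tasks, (err) => {
  if (err) console.error(err);
  ftp.end();
});
```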
Thank you @phillipgreenii
Thank you as well @phillipgreenii
I also found that using `ftp.destroy()` instead of `ftp.end()` can resolve the issue, although that is probably not the right way to do it...
@phillipgreenii @vintuwei I'm using https://github.com/coopernurse/node-pool to pool FTP instances so I can transfer multiple files in parallel. It seems to work okay, although the transfers intermittently hang indefinitely. Really frustrating...
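For what it's worth, a minimal sketch of that pooling idea (assuming the generic-pool v3 promise API; the host, credentials, and the `download` helper are placeholders):

```js
const genericPool = require("generic-pool");
const fs = require("fs");
const Client = require("ftp");

// One FTP client per pooled resource, so each parallel transfer
// gets its own control (and therefore data) connection.
const factory = {
  create: () =>
    new Promise((resolve, reject) => {
      const c = new Client();
      c.once("ready", () => resolve(c));
      c.once("error", reject);
      c.connect({ host: "ftp.example.com", user: "user", password: "pass" }); // placeholders
    }),
  destroy: (c) => Promise.resolve(c.end()),
};

const pool = genericPool.createPool(factory, { max: 3 }); // cap parallelism

function download(remotePath, localPath) {
  return pool.acquire().then((c) =>
    new Promise((resolve, reject) => {
      c.get(remotePath, (err, stream) => {
        if (err) { pool.release(c); return reject(err); }
        stream.once("close", () => { pool.release(c); resolve(); });
        stream.pipe(fs.createWriteStream(localPath));
      });
    })
  );
}
```

Releasing the client only on the stream's `close` event keeps a connection out of the pool until its data transfer has fully finished, so a reused client never has a data connection still in flight.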
+1
+1, this is still a thing
+1
I'm having a similar issue, though it seems that the client is able to recover from the error and still process all the files in the directory I have.
I couldn't find a pattern in how many errors occur or how often; it seems somehow related to the quality of the internet connection (i.e. at the office on the local network I see fewer errors than when connected over Wi-Fi from home via a VPN).
I'm having the same problem when downloading multiple files at the same time, but downloading the files one at a time solves the problem.
@YiyuanYin Can you please share your code? I am facing the same issue.
Ran into this issue today as well; I ended up wrapping a `Promise` around my `c.get()` and it worked like a charm! Code:
c.on("ready", async() => {
await new Promise((resolve) => {
c.get(filename, (err, stream) => {
...
resolve()
})
});
await new Promise((resolve) => {
c.get(filename2, (err, stream) => {
...
resolve()
})
})
...
c.end()
}
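A variation of the same idea as a reusable helper (a sketch; `getToFile` is a made-up name, not part of the node-ftp API). Resolving on the stream's `close` event, rather than immediately in the `get()` callback, guarantees the previous data connection has fully closed before the next `get()` starts:

```js
const fs = require("fs");

function getToFile(c, remotePath, localPath) {
  return new Promise((resolve, reject) => {
    c.get(remotePath, (err, stream) => {
      if (err) return reject(err);
      stream.once("close", resolve); // done only when the data connection closes
      stream.pipe(fs.createWriteStream(localPath));
    });
  });
}

// usage: await getToFile(c, "remote.txt", "local.txt");
```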
I have a timer that runs every 100ms. It goes through a list of known remote file paths on the FTP server and attempts to download them to the local machine.
The timer has locks to ensure that only 3 requests are active at any one time (every time a download finishes it releases another slot to the pool).
This works well for about 20 seconds before `get()` errors out with `Unable to make data connection`.
Here is a reduced example of what is happening
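(A sketch of the pattern described above; the shared client `c`, the `files` array, and the counter names are placeholders rather than the original code.)

```js
const fs = require("fs");

let active = 0;        // downloads currently in flight
const MAX_ACTIVE = 3;  // concurrency cap
const queue = files.slice();

const timer = setInterval(() => {
  while (active < MAX_ACTIVE && queue.length > 0) {
    const file = queue.shift();
    active++;
    c.get(file, (err, stream) => {
      if (err) { active--; return console.error(file, err.message); }
      stream.once("close", () => { active--; });  // release the slot
      stream.pipe(fs.createWriteStream(file));    // same-named local file
    });
  }
  if (active === 0 && queue.length === 0) clearInterval(timer);
}, 100);
```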
Should I be creating a new connection for each download rather than trying to reuse the same one? Or is my FTP server throttling me because I'm not making a new connection for each file?