amitkirdatt opened this issue 2 years ago
I can't reproduce the problem against an OpenSSH server. I've tried different node versions, I've tried forcing the SFTP connection to not assume OpenSSH so that it uses the different, lower max read size, and I've tried different file sizes. Every time the SHA1 hash matches on both ends.
What is the ident of this server? Is it not OpenSSH, despite advertising OpenSSH SFTP extensions?
I'd also be curious to know if the OpenSSH SFTP client exhibits the same corruption issue and if not, what the OpenSSH SFTP client debug output looks like for that transfer.
Can you try the master branch of ssh2 directly? There are some changes there that haven't been released yet and I'm wondering if the issue has been fixed or not.
@mscdex I was able to replicate this issue on the latest version; a 1 GB file was consistently corrupted. I agree with @amitkirdatt that it is a chunk size negotiation issue, but I am unsure how else to help.
@Michael-Gibbons well, without any way for me to (reliably) reproduce the problem on my end, it's going to have to be up to others to debug the issue unfortunately.
If it's any help I can confirm the problem does not exist on ssh2 v0.8.9 but it does exist on v1.13.0. Against the same remote server, of course.
Also, I think this is the remote server software:
debug1: Remote protocol version 2.0, remote software version GoAnywhere7.1.2
Had the exact same problem with both libraries. Thanks for sharing your solution!
Hey guys, I had the same issue. However, having read the documentation again, I found that .fastGet() uses chunked reads, which do not work for huge files because concatenating the chunks can corrupt the file, as we see here. In this case, .createReadStream() should be used instead. It works well for files of 1 MB+ in size.
Here is the implementation:
// Note: `fs` must be required; `log_error` is an application-specific logging
// helper from our codebase.
const fs = require('fs');

const _sftp_get_file = (sftp_conn, {remote_file_path, local_file_path}) => new Promise(resolve => {
    const sftp_read_stream = sftp_conn.createReadStream(remote_file_path, {autoClose: true});
    const write_stream = fs.createWriteStream(local_file_path);
    let downloaded_successfully = false;

    const _resolve = ({result = false, is_external_error = false, message = ''}) => resolve({
        result,
        is_external_error,
        message,
    });

    console.log(`Download the file ${remote_file_path} => ${local_file_path} from SFTP...`);

    sftp_read_stream.on('error', (err) => {
        log_error(`Failed ReadStream: ${err.message}`, {err, remote_file_path});
        _resolve({
            result: false,
            is_external_error: true,
            message: `Failed to download a file from SFTP: ${err.message}`,
        });
    });

    write_stream.on('error', (err) => {
        log_error(`Failed WriteStream: ${err.message}`, {err, local_file_path});
        _resolve({result: false});
    });

    write_stream.on('finish', () => {
        console.log(`[!] File has been saved successfully`);
        downloaded_successfully = true;
    });

    sftp_read_stream.on('close', () => {
        // If the stream closed without the file being fully written, an error
        // handler above has already resolved the promise.
        if (downloaded_successfully) {
            _resolve({result: true});
        }
    });

    sftp_read_stream.pipe(write_stream);
});
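For completeness, a rough usage sketch of this helper against the plain ssh2 Client API; the host, credentials, and file paths below are placeholders, not anything from my actual setup:

// Hypothetical usage of the helper above with the ssh2 Client API.
// Host, credentials, and paths are placeholders.
const { Client } = require('ssh2');

const conn = new Client();
conn.on('ready', () => {
    conn.sftp(async (err, sftp) => {
        if (err) {
            console.error(`Failed to start SFTP session: ${err.message}`);
            return conn.end();
        }
        const {result, message} = await _sftp_get_file(sftp, {
            remote_file_path: '/remote/big-file.csv',
            local_file_path: '/tmp/big-file.csv',
        });
        console.log(result ? 'Downloaded OK' : `Download failed: ${message}`);
        conn.end();
    });
}).connect({
    host: 'sftp.example.com',
    port: 22,
    username: 'user',
    password: 'secret',
});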
Just ran into the same issue. Turns out it was the same server in my case:
debug1: Remote protocol version 2.0, remote software version GoAnywhere7.6.1
Thanks a lot for sharing these observations! In my case, I also had to set chunkSize to 31952.
Node version: v14.19.1
OS: macOS 12.1
ssh2-sftp-client: 8.0.0

Issue: We were seeing files being corrupted when using fastGet.
Resolution: We needed to adjust the chunkSize specific to the SFTP server we are connecting to.

We started out with ssh2-sftp-client, but to narrow the problem down we tried ssh2 as well. Originally posted here, but also posting in case it is useful to others. We ended up solving the problem; just sharing our experience:
Code using ssh2-sftp-client
The text file we are downloading is about 86 MB. We noticed that the file was consistently corrupted (data was missing).
To narrow down the issue, we decided to whip up a quick test using ssh2. We noticed the same corruption.
We then fired up the native sftp client in debug mode and noticed that the server's rmax value was 32768 (the same as the default chunkSize). We then added debug logging to ssh2. In the debug output, it looked like the DATA was being broken up into 2 parts: one part was 31952 bytes and the second was 816 bytes, which add up to 32768. We figured this was causing the issue.

We then specified a chunkSize of 31952 for fastGet. We no longer saw the DATA being broken up into two parts like before (31952 and 816), and the files are now downloading correctly.
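For reference, that debug logging is enabled by passing a debug callback in the ssh2 connection config. A minimal sketch, with placeholder connection details rather than our real settings:

// Minimal sketch of enabling ssh2's built-in debug output, which is where the
// packet-level details (e.g. DATA sizes) show up. Connection details are placeholders.
const { Client } = require('ssh2');

const conn = new Client();
conn.on('ready', () => {
    console.log('Connection ready');
    conn.end();
}).connect({
    host: 'sftp.example.com',
    username: 'user',
    password: 'secret',
    debug: (msg) => console.log(msg),  // ssh2 calls this with protocol debug strings
});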
I am sharing our experience in the hopes that somebody finds it useful.
Updated ssh2-sftp-client code that worked for us:
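Roughly, the change was just passing chunkSize: 31952 to fastGet. The sketch below shows the shape of it, with placeholder connection details and paths rather than our real code:

// Minimal sketch: ssh2-sftp-client download with chunkSize pinned to 31952
// (the payload size the server actually returns per read). Connection details are placeholders.
const SftpClient = require('ssh2-sftp-client');

async function download() {
    const sftp = new SftpClient();
    await sftp.connect({
        host: 'sftp.example.com',   // placeholder
        port: 22,
        username: 'user',           // placeholder
        password: 'secret',         // placeholder
    });
    try {
        await sftp.fastGet('/remote/data.txt', '/tmp/data.txt', {
            chunkSize: 31952,       // avoids the 31952 + 816 split we saw in the debug output
        });
    } finally {
        await sftp.end();
    }
}

download().catch(console.error);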
ssh2 code:
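Again as a sketch with placeholder connection details rather than our exact test, the equivalent directly against ssh2:

// Minimal sketch: the same download directly over ssh2, with chunkSize set to
// 31952 on fastGet. Connection details are placeholders.
const { Client } = require('ssh2');

const conn = new Client();
conn.on('ready', () => {
    conn.sftp((err, sftp) => {
        if (err) throw err;
        sftp.fastGet('/remote/data.txt', '/tmp/data.txt', { chunkSize: 31952 }, (err) => {
            if (err) throw err;
            console.log('Download finished without corruption');
            conn.end();
        });
    });
}).connect({
    host: 'sftp.example.com',   // placeholder
    port: 22,
    username: 'user',           // placeholder
    password: 'secret',         // placeholder
});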