**Closed** — romankor closed this issue 6 years ago
There is a problem synchronizing large files: the whole file is downloaded into memory before being flushed to disk, so eventually you will run out of memory.
I implemented some kind of a streaming solution:
```python
import os
from functools import partial


class StreamingFileTarget(FsTarget):
    def write_file(self, name, source, blocksize=DEFAULT_BLOCKSIZE, callback=None):
        def write_stream_data(fp_dst, callback, data):
            # Flush each received chunk straight to disk, then report progress.
            fp_dst.write(data)
            if callback:
                callback(data)

        self.check_write(name)
        with open(os.path.join(self.cur_dir, name), "wb") as fp_dst:
            write_stream_partial = partial(write_stream_data, fp_dst, callback)
            source.read_stream(name, write_stream_partial)


class StreamingFtpTarget(FtpTarget):
    def read_stream(self, name, callback):
        # retrbinary invokes the callback once per block, so memory use
        # stays bounded by the block size instead of the file size.
        self.ftp.retrbinary("RETR %s" % name, callback)
```
But `_copy_file` needs to be altered accordingly:

```python
dest.write_file(file_entry.name, src, callback=__block_written)
```
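The chunk-at-a-time write pattern used above can be sketched standalone. Here `simulate_transfer` and the list of chunks are illustrative stand-ins for the FTP data connection and are not part of pyftpsync; the point is that each block is written to the destination as it arrives, so only one block is ever held in memory:

```python
import io
from functools import partial


def write_stream_data(fp_dst, callback, data):
    # Write one received chunk to the destination and report progress.
    fp_dst.write(data)
    if callback:
        callback(data)


def simulate_transfer(chunks, fp_dst, callback=None):
    # Stand-in for ftp.retrbinary: deliver chunks one by one to the
    # partially-applied writer, mirroring the streaming write_file above.
    writer = partial(write_stream_data, fp_dst, callback)
    for chunk in chunks:
        writer(chunk)
```

With a progress callback attached, each block is reported as it is flushed:

```python
buf = io.BytesIO()
seen = []
simulate_transfer([b"ab", b"cd", b"ef"], buf, seen.append)
# buf now holds b"abcdef"; seen holds the three individual chunks
```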