Open xkero opened 9 months ago
Actually, it turns out the crash was due to an unrelated filesystem corruption issue I was having. After fixing that, I've found instead that bsdiff4 will just consume memory until the system runs out and either hangs or kills the process, so chunking the file (or streaming it, if possible) seems to be required for files that exceed available memory.
Running version 1.2.4, installed via the AUR package on Arch Linux: https://aur.archlinux.org/packages/python-bsdiff4
When trying to run `bsdiff4.file_diff` on a large file (19 GB), it sits for a while, but crashes with:

Seems to work fine on smaller files. Do I have to manually chunk large files myself?