Closed: irvnriir closed this issue 3 years ago
Mmmh, comparing a 14 GB file is heavy. I have to figure out how big files should be treated. In the meantime you can use `"l13Diff.ignoreContents": true` if the error appears.
Hi, the new version for large files is now available. Binary files are now compared in a different way; theoretically it works for files up to something like 8 PB. I don't know for sure, because in my test environment I compared only files in the GB range. If a text file exceeds the maximum buffer length and `ignoreEndOfLine` or `ignoreTrimWhitespace` is true, the files will be treated as binary files, because I have to rewrite the buffers. On my computer the maximum buffer length is 2 GB and on yours it seems to be 4 GB, so it apparently depends on the amount of memory available to each machine. VS Code also allows you to change the amount of memory for itself. I have also added the new property `l13Diff.maxFileSize`. If a file's size exceeds this value, the file will be ignored for a comparison. If the value is 0, no limit is used.
It could be useful to know which way is used for comparing the binaries. Your help is awesome, huge thanks for your work.
The binary comparison algorithm now compares in chunks and requires just 64 MB of RAM. That is what is meant by "different way". The previous version loaded the whole file into RAM, which was the reason for the error.
If `ignoreEndOfLine` or `ignoreTrimWhitespace` is true and the files are not equal, the buffers will be rewritten and the flags set to true. If a file is treated as equal after the normalization, you will see that info in the list view for unchanged files only. A previous version showed this info for every comparison (unchanged and modified), and the list view was more like a mess. That's the reason why I removed it.
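The normalization described here might look roughly like this (an assumed sketch, not the extension's code, shown on strings rather than buffers for brevity): line endings are unified and/or each line is trimmed, and the normalized texts are compared again:

```javascript
// Sketch of EOL / whitespace normalization before a second comparison.
function normalize(text, opts = {}) {
  let result = text;
  // Rewrite CRLF and lone CR to LF so line endings no longer matter
  if (opts.ignoreEndOfLine) result = result.replace(/\r\n?/g, '\n');
  // Strip leading and trailing whitespace from every line
  if (opts.ignoreTrimWhitespace) {
    result = result.split('\n').map((l) => l.replace(/^\s+|\s+$/g, '')).join('\n');
  }
  return result;
}

function equalAfterNormalize(a, b, opts) {
  return normalize(a, opts) === normalize(b, opts);
}
```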
Hi, I saw your comments in the email.
If `l13Diff.ignoreContents` is true, it overrides all other settings. Please be sure that the value is false. But yes, it can be nearly instant.
It's false.
Then it makes a binary comparison of the whole files.
In testing on 2 GB files, it was nearly instant to find a change at the end of a file, took a few seconds for one in the center, and ~2 minutes to compare equal files. With these speeds, a file-size threshold for `l13Diff.ignoreContents` could be useful, but it is avoidable. Thanks again, and sorry for my tendency to send messages before I should X) .
No prob. I wrote 5 or 6 prototypes to figure out the best and fastest way. The first version worked but was really slow: a comparison of two 4 GB files took up to 30-40 seconds. After a couple of tests I figured out that a chunk size of 32 MB is blazing fast. I also optimized the loading of these chunks. In the end, a full comparison of the same two files took only 12-16 seconds. It now depends on the speed of your SSD.
I'm currently using an old HDD, that's why it takes minutes, I guess :)
RangeError [ERR_FS_FILE_TOO_LARGE]: File size (14048082216) is greater than possible Buffer: 4294967295 bytes