RamiroCruzo / freearc

Automatically exported from code.google.com/p/freearc

Huge temp files created during ultra compression #387

Closed GoogleCodeExporter closed 8 years ago

GoogleCodeExporter commented 8 years ago
I am compressing an 18 GB dataset using the ultra setting in the GUI. At 10% 
progress, the temporary files had already grown large enough to fill the 
remaining space on the C: drive. Additionally, after clearing more space, the 
reported compression ratio stayed at 0% until about 15% progress, when it 
suddenly jumped to 30%.

This is very odd behaviour - if the archiver has only processed 10% of the 
source, where did the huge amount of temp file data (15 GB) come from? Why was 
it reporting a 0% ratio on files (game texture files) that should be entropic 
enough to compress to a 40%-60% ratio at most? And why was the file being 
processed stuck on a ~150 KB .cab file for over 30 minutes, from 12% onwards?

Original issue reported on code.google.com by Dimo.A.P...@gmail.com on 17 Jun 2014 at 6:49

GoogleCodeExporter commented 8 years ago
That's the idea of the ultra compression mode - it runs two compression 
algorithms, each of which uses all the RAM available to a 32-bit program. So 
it needs to save the intermediate data (the output of the first algorithm, 
which serves as input to the second one) on disk, and the temporary space 
used may be up to the total size of the input data.
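The mechanics can be sketched roughly like this (a hypothetical two-stage pipeline in Python, not FreeArc's actual code, using bz2 and lzma as stand-ins for the two algorithms): the first stage's complete output is staged on disk before the second stage consumes it, so peak temp usage can approach the size of the input.

```python
# Hypothetical illustration of a two-stage compression pipeline that stages
# intermediate data on disk (not FreeArc's real implementation).
import bz2
import lzma
import os
import tempfile


def two_stage_compress(data: bytes) -> tuple[bytes, int]:
    """Compress with two algorithms in sequence, staging the intermediate
    result in a temp file. Returns (final_output, temp_bytes_used)."""
    # Stage 1: write the first algorithm's entire output to a temp file.
    # On hard-to-compress data this file can be nearly as large as the input.
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        tmp.write(bz2.compress(data))
        tmp_path = tmp.name
    try:
        temp_used = os.path.getsize(tmp_path)
        # Stage 2: the second algorithm reads the intermediate file as input.
        with open(tmp_path, "rb") as f:
            final = lzma.compress(f.read())
        return final, temp_used
    finally:
        os.unlink(tmp_path)  # temp space is only reclaimed afterwards
```

This also explains the ratio display: no final output exists until the second stage starts emitting data, so the reported ratio can sit at 0% and then jump.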

You can change the Temporary directory (Settings/Main) to a disk where you 
have plenty of space.

Or you can use the "Best asymmetric mode", which essentially uses only one 
memory-hungry algorithm and as a result doesn't need the temp space. The 
"Low-memory decompression" checkbox in the Compression advanced settings 
dialog makes the same change to a user-configured compression mode.

Original comment by bulat.zi...@gmail.com on 17 Jun 2014 at 7:19