schwer-q / xar

Automatically exported from code.google.com/p/xar

Xar chokes with a large number of files #80

Open · GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Gather about 70,000 XML files, roughly 3 KB each (a sketch for generating a comparable corpus follows these steps)
2. Attempt to create an XAR archive using the command "xar -cf xml.xar xml/"
3. Watch xar chug for about an hour, eat up gobs of RAM, and not finish.
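
For reference, here is one way to generate a comparable test corpus; the file names and contents are synthetic placeholders, not the actual data:

    # Create ~70,000 XML files of roughly 3 KB each (synthetic content).
    mkdir -p xml
    for i in $(seq 1 70000); do
        { printf '<?xml version="1.0"?>\n<doc>'
          head -c 3000 /dev/zero | tr '\0' 'x'
          printf '</doc>\n'
        } > "xml/file$i.xml"
    done

    # On the affected version, this runs for a long time while xml.xar
    # stays at 0 bytes on disk.
    xar -cf xml.xar xml/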

What is the expected output? What do you see instead?

I would like xar to yield an archive in a reasonable amount of time. Unfortunately, it took so long that I killed the process before it completed.

What version of the product are you using? On what operating system?

xar 1.5.2 on Gentoo

Please provide any additional information below.

While the process was running, the file on disk remained at 0 bytes. Does xar try to create the entire archive in memory and then dump it to disk at the end? With a large number of files, that approach appears to break.
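
One rough way to check this while the archive is being built, assuming a Linux shell with ps available, is to sample the process's resident set size alongside the size of the output file:

    # Start the archive in the background, then sample its memory use
    # and the output file size every few seconds.
    xar -cf xml.xar xml/ &
    XAR_PID=$!
    while kill -0 "$XAR_PID" 2>/dev/null; do
        ps -o rss= -p "$XAR_PID"   # resident memory, in KB
        ls -l xml.xar              # on-disk size of the archive
        sleep 5
    done

If resident memory keeps climbing while xml.xar stays at 0 bytes, that is consistent with the archive being assembled in memory.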

I was attempting to compare the file size and performance difference between xar and tar. Tar completed the task within a few minutes and reduced the size of the XML directory from 445 MB to 46 MB. I have no numbers for xar, as it didn't finish.
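
The exact tar invocation isn't stated above; a typical gzip-compressed run for this kind of comparison would look something like the following (hypothetical, not necessarily the command actually used):

    time tar -czf xml.tar.gz xml/
    du -sh xml xml.tar.gz    # compare directory size vs. archive size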

Original issue reported on code.google.com by roberto....@gmail.com on 16 Mar 2011 at 12:15

GoogleCodeExporter commented 9 years ago
It's my understanding that xar is supposed to do only limited work in RAM (e.g. data is processed in 4K chunks) and to write intermediate data into /tmp files before the final archive is constructed.
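
On a Linux system, one way to verify that behaviour, assuming strace is available, is to trace file creation while archiving a small tree and look for paths under /tmp:

    # Trace file-open syscalls during a small archive run (a sketch;
    # available syscall names vary by platform and strace version).
    strace -f -e trace=open,openat,creat -o xar.trace xar -cf small.xar small/
    grep /tmp xar.trace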

I've been using the source distributed with Mac OS X (<http://www.opensource.apple.com/source/xar/>), which calls itself v1.6.

Original comment by 1billgar...@gmail.com on 25 Apr 2011 at 9:22