schwer-q / xar

Automatically exported from code.google.com/p/xar

Feature request: extracting files one chunk at a time, instead of all at once #2

Closed: GoogleCodeExporter closed this issue 9 years ago

GoogleCodeExporter commented 9 years ago
The latest SVN snapshot of xar includes a xar_extract_tobuffer() function, which is very nice. However, if a very large file, such as one 1 GB in size, is stored inside a xar archive, it could be impractical to extract the entire file to a buffer all at once. For this reason, it would be nice if there were a way to read a fixed number of bytes at a time into a buffer, looping until the entire file has been read. This would make it much easier to pipe the data in a file to another task, for example.

Original issue reported on code.google.com by Charle...@gmail.com on 21 Jan 2007 at 12:14
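
To make the request concrete, the loop being asked for would look something like the sketch below. xar_extract_chunk() is purely hypothetical; no such function exists in xar at this point, and it is shown only to illustrate the desired interface.

```c
/* HYPOTHETICAL: xar_extract_chunk() does not exist in xar; this only
 * illustrates the chunked-read interface the request asks for. */
#include <stdio.h>
#include <sys/types.h>
#include <xar/xar.h>

#define CHUNK 16384

static void pipe_file(xar_t x, xar_file_t f)
{
    char buf[CHUNK];
    ssize_t n;

    /* Read a fixed number of bytes at a time so a 1 GB entry never
     * has to fit in memory all at once. */
    while ((n = xar_extract_chunk(x, f, buf, sizeof(buf))) > 0)
        fwrite(buf, 1, (size_t)n, stdout);
}
```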

GoogleCodeExporter commented 9 years ago
Yeah, something similar to how the bzip and zlib callbacks work would be good. I'll need to look back at the xar io code to see how practical this would be.

Original comment by bbraun on 28 Jan 2007 at 3:50
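
For context, the zlib pattern bbraun mentions has the caller own a small, fixed-size output buffer and call inflate() in a loop, draining one chunk of decompressed data per pass. A minimal sketch of that pattern, decompressing a complete in-memory zlib stream to stdout:

```c
#include <stdio.h>
#include <string.h>
#include <zlib.h>

#define CHUNK 16384

/* Inflate `in` (inlen bytes of zlib data) to stdout, CHUNK bytes at a time. */
static int inflate_chunks(unsigned char *in, size_t inlen)
{
    unsigned char out[CHUNK];
    z_stream strm;
    int ret;

    memset(&strm, 0, sizeof(strm));
    if (inflateInit(&strm) != Z_OK)
        return -1;

    strm.next_in = in;
    strm.avail_in = (uInt)inlen;
    do {
        /* Offer a fresh CHUNK-sized output window each pass. */
        strm.next_out = out;
        strm.avail_out = CHUNK;
        ret = inflate(&strm, Z_NO_FLUSH);
        if (ret != Z_OK && ret != Z_STREAM_END) {
            inflateEnd(&strm);
            return -1;
        }
        /* Consume whatever was produced, e.g. pipe it onward. */
        fwrite(out, 1, CHUNK - strm.avail_out, stdout);
    } while (ret != Z_STREAM_END);

    inflateEnd(&strm);
    return 0;
}
```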

GoogleCodeExporter commented 9 years ago
This should be addressed by commit 185. An example of its use is in tools/strextract.c.

Original comment by bbraun on 2 Oct 2007 at 7:10
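
For readers who cannot check out the repository: tools/strextract.c exercises a zlib-style streaming interface, and the loop there presumably looks roughly like the sketch below. The names (xar_stream, xar_extract_tostream_init / xar_extract_tostream / xar_extract_tostream_end, XAR_STREAM_OK / XAR_STREAM_END / XAR_STREAM_ERR) are taken from xar.h as of this change, but treat strextract.c itself as the authoritative example.

```c
#include <stdio.h>
#include <stdint.h>
#include <xar/xar.h>

#define CHUNK 16384

/* Stream one archived file to stdout, CHUNK bytes at a time. */
static int extract_chunks(xar_t x, xar_file_t f)
{
    char buf[CHUNK];
    xar_stream xs;
    int32_t ret;

    if (xar_extract_tostream_init(x, f, &xs) != XAR_STREAM_OK)
        return -1;

    do {
        /* Hand the stream a fresh output window each pass. */
        xs.next_out = buf;
        xs.avail_out = sizeof(buf);
        ret = xar_extract_tostream(&xs);
        if (ret == XAR_STREAM_ERR) {
            xar_extract_tostream_end(&xs);
            return -1;
        }
        /* Forward the chunk, e.g. pipe it to another task. */
        fwrite(buf, 1, sizeof(buf) - xs.avail_out, stdout);
    } while (ret != XAR_STREAM_END);

    return (xar_extract_tostream_end(&xs) == XAR_STREAM_OK) ? 0 : -1;
}
```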