jianlinwei / theunarchiver

Automatically exported from code.google.com/p/theunarchiver

Ability to run more than one extraction at once. #176

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
Sometimes I am extracting multiple archives from multiple sources, such as gigabit network storage or RAID devices, which can easily sustain the I/O throughput needed for more than one simultaneous extraction. It would be nice to have two things:

1) The ability to specify a default number of "active" extractions.
2) The ability to click a queued extraction to "force" start it. Much like the `X` icon on the right that cancels an extraction, there could be a play button that starts a queued one (a sketch follows below).
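
A minimal sketch of how those two controls could fit together, assuming a hypothetical `ExtractionScheduler` class written in Swift purely for illustration (none of these names come from The Unarchiver's code):

```swift
import Foundation

// Hypothetical sketch, not The Unarchiver's implementation: a task list with a
// configurable cap on concurrently running extractions (request 1) and a
// "force start" that bypasses the cap for one queued task (request 2).
final class ExtractionScheduler {
    struct Job { let archivePath: String; let work: () -> Void }

    var maxActiveExtractions: Int              // user-configurable limit
    private var waiting: [Job] = []
    private var runningCount = 0
    private let lock = NSLock()

    init(maxActiveExtractions: Int = 1) {
        self.maxActiveExtractions = maxActiveExtractions
    }

    func enqueue(_ job: Job) {
        lock.lock(); waiting.append(job); lock.unlock()
        startNextIfPossible()
    }

    // "Force start": run a specific queued job immediately, ignoring the cap.
    func forceStart(archivePath: String) {
        lock.lock()
        guard let i = waiting.firstIndex(where: { $0.archivePath == archivePath }) else {
            lock.unlock(); return
        }
        let job = waiting.remove(at: i)
        runningCount += 1
        lock.unlock()
        run(job)
    }

    private func startNextIfPossible() {
        lock.lock()
        guard runningCount < maxActiveExtractions, !waiting.isEmpty else {
            lock.unlock(); return
        }
        let job = waiting.removeFirst()
        runningCount += 1
        lock.unlock()
        run(job)
    }

    private func run(_ job: Job) {
        DispatchQueue.global(qos: .utility).async {
            job.work()                         // the actual extraction
            self.lock.lock(); self.runningCount -= 1; self.lock.unlock()
            self.startNextIfPossible()
        }
    }
}
```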

Original issue reported on code.google.com by joe.roback@gmail.com on 19 Jul 2009 at 3:20

GoogleCodeExporter commented 9 years ago
The code has always been designed with this in mind, but in the end I decided that this feature would add unnecessary complexity for very little actual gain. The cases where this is useful are very few, and only give a very small increase in speed. Misusing the feature loses a whole lot of speed, and just results in confusion for users.

Original comment by paracel...@gmail.com on 19 Jul 2009 at 3:45

GoogleCodeExporter commented 9 years ago
I guess. It's certainly your decision as the developer, and I do understand it can be an easily abused option. But almost no extraction code is multithreaded, which leaves running multiple extractions as the only way to make full use of the hardware and cut total extraction time.

For instance, the unrar code (including RAR Labs' own C++ code, even compiled with the Intel compiler) extracts at most 15-20 MB/s on an 8-core Xeon with 8 GB of memory, while even laptop disks can sustain 80 MB/s on a sequential write. I can unrar four archives at once (4 × 20 MB/s ≈ 80 MB/s, roughly the disk's sequential ceiling) before I start seeing performance degrade. My most common setup looks like this:

a Linux box (8-core Xeon with a single 7,200 rpm SATA disk) over a gigabit network to a 17" MacBook Pro with a 7,200 rpm hard disk. I can easily sustain 50 MB/s with 3 archives using the command-line unrar tool.

Again, I'm not trying to force you or heavily suggest you do anything you don't want to do. I've always liked the idea of a replacement for the built-in Mac OS X extraction tool with more power-user options, and I think The Unarchiver is really it, so I just want to provide helpful, useful feedback. :-)

Original comment by joe.roback@gmail.com on 19 Jul 2009 at 4:02

GoogleCodeExporter commented 9 years ago
Generally, if you run extractions directly on local disks, the problem with running multiple ones is that seeking overhead will kill your throughput. Of course, if the extraction process is particularly slow, you can still afford that, but in the end it all turns into a big mess trying to figure out when an archive can safely be started in parallel.
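
One conservative heuristic for that "when is it safe" question, sketched here purely as an assumption (hypothetical Swift, not anything The Unarchiver actually does): only allow two jobs to run in parallel when their sources and destinations live on different volumes, so a single spinning disk never has to interleave seeks for both. A read from a gigabit network share combined with a write to a local disk, as in the original report, would pass this check.

```swift
import Foundation

// Hypothetical helper: the filesystem (device) number identifies the volume a path lives on.
func deviceID(of path: String) -> Int? {
    let attrs = try? FileManager.default.attributesOfItem(atPath: path)
    return (attrs?[.systemNumber] as? NSNumber)?.intValue
}

// Two extraction jobs may run in parallel only if they touch disjoint sets of volumes.
func safeToRunInParallel(_ a: (source: String, destination: String),
                         _ b: (source: String, destination: String)) -> Bool {
    guard
        let aSrc = deviceID(of: a.source), let aDst = deviceID(of: a.destination),
        let bSrc = deviceID(of: b.source), let bDst = deviceID(of: b.destination)
    else { return false }                      // unknown volume: stay sequential
    return Set([aSrc, aDst]).isDisjoint(with: [bSrc, bDst])
}
```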

Like I said, it's something I've kept in mind but never figured out how to do right. I think I'll leave the issue open for future versions, though, in case a good behaviour can be worked out at some point.

Original comment by paracel...@gmail.com on 19 Jul 2009 at 12:51

GoogleCodeExporter commented 9 years ago

Original comment by paracel...@gmail.com on 20 Jul 2009 at 4:25

GoogleCodeExporter commented 9 years ago
You know, if this were implemented, users could receive a warning when they enable more than one or two simultaneous extractions, stating that they should know what they are doing and that it may hurt extraction performance.
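
As a rough illustration of that warning (again hypothetical Swift with made-up strings, not The Unarchiver's UI): confirm the change whenever the limit is raised above two.

```swift
import AppKit

// Hypothetical sketch: ask for confirmation before accepting a concurrency limit above 2.
func confirmExtractionLimit(_ newLimit: Int) -> Bool {
    guard newLimit > 2 else { return true }
    let alert = NSAlert()
    alert.messageText = "Run \(newLimit) extractions at once?"
    alert.informativeText = "Extracting several archives to the same disk can hurt performance because of seek overhead. Only raise this limit if your sources and destinations can sustain the extra I/O."
    alert.addButton(withTitle: "Use \(newLimit)")
    alert.addButton(withTitle: "Cancel")
    return alert.runModal() == .alertFirstButtonReturn
}
```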

This is the kind of power option you will never see in the Mac's built-in extractor. I am probably one of very few people extracting more than one archive at a time from separate sources, so I understand the low/"maybe" priority. I'll have to stick to command-line extraction for now.

Original comment by joe.roback@gmail.com on 22 Jul 2009 at 12:08

GoogleCodeExporter commented 9 years ago

Original comment by paracel...@gmail.com on 16 Sep 2012 at 7:18