First, I'd need some reliable way to reproduce this. Can you try creating some
dummy RAR files with lots of parts and see if you can figure out a way to
trigger this?
Original comment by paracel...@gmail.com
on 16 Jul 2013 at 11:39
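(Editor's note: a minimal sketch of one way to generate such fixtures, assuming the `rar` command-line tool is available, for example from rarlab.com; the 1 MB volume size and the set/part counts are illustrative choices, not anything from the thread.)

```python
import os
import subprocess
import tempfile

def make_dummy_multipart(name: str, parts: int, workdir: str) -> None:
    """Create a multi-part RAR set with roughly `parts` 1 MB volumes."""
    payload = os.path.join(workdir, f"{name}.bin")
    with open(payload, "wb") as f:
        f.write(os.urandom(parts * 1024 * 1024))  # random data defeats compression
    # -v1m splits into ~1 MB volumes, -m0 stores without compression,
    # so the set is produced quickly with a predictable part count.
    subprocess.run(
        ["rar", "a", "-v1m", "-m0", os.path.join(workdir, f"{name}.rar"), payload],
        check=True,
    )

if __name__ == "__main__":
    tmp = tempfile.mkdtemp()
    for i in range(13):  # 13 sets of 40+ parts, matching the test described later
        make_dummy_multipart(f"set{i}", 41, tmp)
    print(f"fixtures written to {tmp}")
```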
Hold on.. just upgraded to the latest version - don't ask - am trying to
reproduce to confirm issue. Sorry.
Original comment by paulmoons
on 16 Jul 2013 at 11:45
No, same issue at the same point: on the tenth file, the error dialogue appears as stated above. I am running App Store version 3.8 (was running 3.51 previously), both on Mountain Lion 10.8.4, on a Mac Mini (Core i5, 8 GB RAM).
I'll have to look into creating dummy RARs, though. While I am not an expert, it feels like maybe The Unarchiver is running out of RAM and not flushing the data from previous extractions during a multi-file extraction run? Just a guess.
Original comment by paulmoons
on 16 Jul 2013 at 11:59
Correction: The new version behaved slightly differently in that all the
dialogues were of the "Could not extract... error on decrunching" variety.
Original comment by paulmoons
on 16 Jul 2013 at 12:01
Probably not RAM; that would more likely cause swapping and crashes. Possibly it is running out of file descriptors; I think there might only be 256 or so of those per process.
Original comment by paracel...@gmail.com
on 16 Jul 2013 at 12:04
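(Editor's note: the per-process limit can be checked and demonstrated directly; a quick Python sketch. On macOS the default soft limit for open file descriptors is indeed 256.)

```python
import os
import resource
import tempfile

# Report the current RLIMIT_NOFILE soft and hard limits for this process.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

# Demonstrate exhaustion: keep opening files until the OS refuses (EMFILE).
# The failure point lands a few below the soft limit because stdin/stdout/
# stderr already occupy descriptors.
handles = []
tmpdir = tempfile.mkdtemp()
try:
    while True:
        handles.append(open(os.path.join(tmpdir, f"f{len(handles)}"), "w"))
except OSError as e:
    print(f"open() failed after {len(handles)} files: {e}")
finally:
    for h in handles:
        h.close()
```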
Just as an aside, thanks for supporting this awesome app :)
Original comment by paulmoons
on 16 Jul 2013 at 12:11
Okay. I ran another test today, this time on 13 multi-part (40+ parts each) RAR archives. Each multi-part archive was around 4 GB (packed). After completing 6 extractions successfully, The Unarchiver began spewing the familiar "Could not extract the file... Error on decrunching" dialogue box.
What is also worth noting (and I failed to mention this earlier) is that if I select "Continue" from this dialogue box, The Unarchiver actually creates a broken file that Finder reports to be only 100 MB in size. Interestingly, all subsequent extractions in the run (if continued) result in broken files of the same size.
If I press "Stop", no file is created, but The Unarchiver cycles on to the next file, where it returns the same error dialogue box.
Looking at this a different way: the RAR part files in the successfully extracted archives total around 237, and The Unarchiver breaks during the next multi-part archive. I know you said streams and not files, but this has me thinking that maybe 256 part files is the limit being hit?
Furthermore, when I attempted to restart the batch with the remaining multi-part archive files, The Unarchiver broke after successfully extracting 5 of the 7 remaining archives. Now here is where it gets interesting: the parts of the successfully completed files total 215, and the next archive in the batch (where The Unarchiver breaks down) is made up of 41 files, which would take the total number of files The Unarchiver has to deal with during the batch to 256, right at the suspected limit!
If I am right, this should be reproducible by attempting to extract any batch of multi-part RAR files consisting of more than 256 individual RAR parts.
Original comment by paulmoons
on 16 Jul 2013 at 10:44
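(Editor's note: if the arithmetic above is right, the theory should be directly observable; a rough Python sketch for watching the descriptor count of the running app during a batch. Note that `lsof` also lists memory-mapped files and the working directory, so treat the count as an upper bound on actual file descriptors.)

```python
import subprocess
import sys
import time

def count_open_fds(pid: int) -> int:
    """Count the open files lsof reports for a process (minus the header row)."""
    out = subprocess.run(["lsof", "-p", str(pid)],
                         capture_output=True, text=True, check=True)
    return max(0, len(out.stdout.splitlines()) - 1)

if __name__ == "__main__":
    # Pass the PID on the command line, e.g. obtained via pgrep on the app name.
    pid = int(sys.argv[1])
    while True:
        print(count_open_fds(pid))
        time.sleep(1)
```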
So that's it then? LOL.
Original comment by paulmoons
on 15 Aug 2013 at 6:30
I am not working on updates right now. I will look into it once I am.
Original comment by paracel...@gmail.com
on 15 Aug 2013 at 9:07
It's all good. I just thought the issue might have been declared a non-issue,
is all. :)
Original comment by paulmoons
on 15 Aug 2013 at 9:25
I'm having the same problem. I'm using The Unarchiver version 3.9.1 on a 2012 iMac with OS X 10.9 Mavericks. I purchased The Unarchiver on the App Store to support the project. I was also seeing the same problem with previous versions of OS X and previous versions of the app, and on another computer.
It's exactly the same issue as the OP described: if I queue up more than ~10 or so multi-part RARs to extract, then the decrunching/failure messages begin and all subsequent RARs in the queue fail or extract as partial/corrupted files.
The workaround is to extract only around 10 RARs at a time, which is fine in most scenarios, but every few months I run into a project where I need to extract a few hundred RARs -- at which point The Unarchiver's inability to handle bulk extraction is annoying.
I should note, however, that I've run into similar issues with other un-RAR apps on Macs -- so the problem doesn't seem to be isolated to The Unarchiver. Would be cool if The Unarchiver could fix it, though.
Thank you for looking into it!
- Lukecro
Original comment by luke...@gmail.com
on 22 Dec 2013 at 2:27
Exactly the same issue, using The Unarchiver 3.9.1.
MBP 2.6 GHz, 16 GB RAM, 1 TB SSD.
Original comment by reymondl...@gmail.com
on 22 Aug 2014 at 4:59
I have resorted to not queuing up more than one or two archive sets at a time. After the first one or two are extracted, I add more. As long as I don't add too many, everything works fine. Of course this isn't a fix, but after a year of waiting for some movement on this issue, I'm not sure what else to do.
Original comment by paulmoons
on 23 Aug 2014 at 2:30
Unless Apple decides to raise the allowed maximum number of open files, this would take some seriously tricky architectural changes to fix, so it's unlikely to ever happen for such an edge case.
Original comment by paracel...@gmail.com
on 23 Aug 2014 at 11:13
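(Editor's note: for what it's worth, the soft limit is per-process and a process can raise it at runtime up to the hard limit; a minimal sketch. The 10240 cap reflects macOS's OPEN_MAX kernel ceiling and is an assumption about the platform, not anything stated in the thread.)

```python
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"before: soft={soft}, hard={hard}")  # macOS default soft limit is 256

# Raise the soft limit as far as the platform allows. On macOS the kernel
# caps it at OPEN_MAX (10240) even when the hard limit reports "unlimited".
target = 10240 if hard == resource.RLIM_INFINITY else min(hard, 10240)
resource.setrlimit(resource.RLIMIT_NOFILE, (target, hard))
print("after:", resource.getrlimit(resource.RLIMIT_NOFILE))
```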
Hmm... is it possible for The Unarchiver to "buffer" the need to open files? What I mean is: can it "open" only the files in the first archive set, and not "open" the next archive set (even if it is queued) until it has closed the first set of files? That would limit the problem to single archive sets of over 256 parts (which would be rather rare), as opposed to multiple archive sets totaling 256+ parts (less rare).
Does this make sense?
Original comment by paulmoons
on 23 Aug 2014 at 1:08
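(Editor's note: the suggestion amounts to serializing descriptor usage per archive set; a minimal sketch of the pattern in Python. `extract_set` is a hypothetical stand-in for the real multi-part decoder, not The Unarchiver's API.)

```python
from contextlib import ExitStack
from pathlib import Path
from typing import BinaryIO

def extract_set(handles: list[BinaryIO]) -> None:
    # Hypothetical stand-in for the real multi-part RAR decoder.
    for h in handles:
        h.read()

def extract_batch(archive_sets: list[list[Path]]) -> None:
    for parts in archive_sets:
        # Open only this set's parts; ExitStack closes them all on exit
        # (even on error), so descriptors never accumulate across sets.
        with ExitStack() as stack:
            handles = [stack.enter_context(p.open("rb")) for p in parts]
            extract_set(handles)
```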
It would be possible, but that would require those fiddly architectural changes.
Original comment by paracel...@gmail.com
on 23 Aug 2014 at 5:40
I figured as much.
Original comment by paulmoons
on 24 Aug 2014 at 12:16
Issue 770 has been merged into this issue.
Original comment by paracel...@gmail.com
on 6 Oct 2014 at 9:37
Original issue reported on code.google.com by paulmoons
on 16 Jul 2013 at 11:35