Closed: hlixed closed this issue 6 years ago
It can be the culprit: rawutil is very helpful, but admittedly a bit slow. The problem is that your 55 MB GARC (and probably the 1.8 GB one) contains thousands of files, so thousands of FATO entries, and therefore thousands of values to unpack. Also, as you say, parts of the data can get copied: the iterator `[]` in the structure triggers a recursive call, and rawutil calls `struct.unpack_from` on the whole data for the format characters that `struct` supports, so that happens thousands of times. I'll try to optimize rawutil.
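To illustrate the kind of copying involved, here is a rough sketch (not rawutil's actual code): slicing a large buffer once per entry copies the remaining data every time, whereas unpacking at an offset into the original buffer does not.

```python
import struct

# Hypothetical illustration of the copying pattern described above, not
# rawutil's actual code: slicing the buffer for every one of the thousands
# of entries creates a new bytes object, so the tail of a 55 MB file gets
# copied over and over.
def parse_entries_slow(data, count, ptr=0):
    entries = []
    for _ in range(count):
        chunk = data[ptr:]                      # copies everything after ptr
        entries.append(struct.unpack_from('<I', chunk)[0])
        ptr += 4
    return entries

# Unpacking at an offset into the original buffer (or a memoryview) avoids
# the per-entry copy entirely.
def parse_entries_fast(data, count, ptr=0):
    entries = []
    for _ in range(count):
        entries.append(struct.unpack_from('<I', data, ptr)[0])
        ptr += 4
    return entries
```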
I completely rewrote some parts of rawutil to make it more readable (it needed that...) and to limit the passing around of heavy data blocks, so it should be a bit faster now. I also added a function to transparently use file objects; I'll convert some 3DSkit modules to use it, which should completely avoid these performance problems. I have already converted unpack.GARC, so you can test it. You won't gain a lot of time, because I can't really speed up the file decompression, but it's a small improvement.
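A minimal sketch of what the file-object approach looks like (hypothetical helper, not rawutil's actual API): seek to the entry's offset and read only the bytes the format needs, instead of keeping the whole GARC in memory and passing it around.

```python
import struct

# Hypothetical helper, not rawutil's actual API: read and unpack a format
# directly from an open file object, touching only the bytes that are needed.
def unpack_at(fileobj, fmt, offset):
    fileobj.seek(offset)
    raw = fileobj.read(struct.calcsize(fmt))
    return struct.unpack(fmt, raw)

# Usage sketch: read three little-endian uint32 values at offset 0x20
# of an already opened GARC file.
# with open('file.garc', 'rb') as f:
#     a, b, c = unpack_at(f, '<3I', 0x20)
```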
I attempted to extract a GARC file, but found 3DSKit to be unexpectedly slow; during this time, a folder was created but never populated. Investigating further, I found that extracting a 55 MB GARC took almost two minutes between the call to `extractGARC.readheader()` and the point where `extractGARC.extract()` is even called. Given that GARCs can grow to over a gigabyte, this renders such large GARCs effectively unextractable.
It seems the line that causes this lag in particular is `fato, ptr = self.unpack_from(GARC_FATO_SECTION, data, ptr, getptr=True)`. Once this is done, all other sections of the code, such as the FATB extraction, proceed smoothly. Could `rawutil.py`'s `_unpack()` be the culprit? Could the entire 55 MB be copied repeatedly through the arguments of several nested function calls when it shouldn't be? Further investigation is warranted to eliminate this unwanted lag.
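One way to confirm where the time goes (a sketch, not part of 3DSKit): run the extraction under cProfile and sort by cumulative time, which should show whether rawutil's unpacking or something else dominates.

```python
import cProfile
import pstats

def run_extraction():
    # Placeholder for whatever call drives extractGARC on the 55 MB file;
    # substitute the actual 3DSKit entry point here.
    pass

profiler = cProfile.Profile()
profiler.enable()
run_extraction()
profiler.disable()
pstats.Stats(profiler).sort_stats('cumulative').print_stats(20)
```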