fraganator / archive-cache-manager

A LaunchBox plugin which extracts and caches large ROM archives, letting you play games faster.
GNU Lesser General Public License v2.1
11 stars, 5 forks

Switch to Zstandard-enabled fork of 7z? #30

Open nixxou opened 2 years ago

nixxou commented 2 years ago

I'm considering converting my whole 7z library to Zstandard 7z files using https://github.com/mcmilk/7-Zip-zstd. Until now I was compressing my 7z files with the "-m5=LZMA2:d64m -mmt=2" options and getting good results. But with Zstandard, even though the file ends up slightly bigger, decompression is much faster (I go from 35 MB/s to 230 MB/s on a test file), and for typical gaming use with your plugin I think this speed boost could be interesting.

fraganator commented 2 years ago

Hi @nixxou - thanks for the heads up. Funny you should mention zstd; I was just looking at ZArchive (for Wii U games), which also uses zstd.

Have you tested the fork with the plugin? If you replace the 7z.exe.original and 7z.dll.original files in the ArchiveCacheManager\7-Zip folder with the forked versions (renamed to append .original), it should work. Edit: It might need some additional file extension checks for zst, liz, lz4, lz5 in Zip.cs, but that should be a quick change. Are you compressing zstd in the 7z container?
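That swap can be sketched as a small helper; this is a minimal illustration only, assuming the standard plugin layout (the function name and both example paths are placeholders, not part of the plugin):

```python
import shutil
from pathlib import Path

def install_forked_7zip(fork_dir: Path, plugin_dir: Path) -> list[Path]:
    """Copy a 7-Zip-zstd build over the plugin's bundled binaries,
    renamed with the .original suffix the plugin expects."""
    installed = []
    for name in ("7z.exe", "7z.dll"):
        dest = plugin_dir / f"{name}.original"
        shutil.copy2(fork_dir / name, dest)  # replaces the stock binary
        installed.append(dest)
    return installed

# Usage (paths are assumptions; adjust to your install):
# install_forked_7zip(Path(r"C:\7-Zip-zstd"),
#                     Path(r"C:\LaunchBox\Plugins\ArchiveCacheManager\7-Zip"))
```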

I'll do some testing myself, and if it all looks good I don't see why the forked version shouldn't be used.

nixxou commented 2 years ago

Yes, I'm compressing using the 7z container. Still, I found a technical issue with this zstd compression that prevents me from using it for my personal project.

In my 7z archives I have a lot of similar files. When I use LZMA2 with a dictionary size bigger than the average size of the files, I get a final size close to the size of a single file; basically, the number of similar files has no impact on the final size of the archive. If I try zstd, no matter the compression parameters, the final size is the same as if I had compressed each file individually.

So, for example, if I take a subset of my N64 GoldenEye archive with only 14 ROMs of 16 MB each: with LZMA2 and a 24 MB dictionary I get a final archive of 16.2 MB, while with Zstandard I get 144 MB.

So while it's great when you compress individual files, it's not so good for my use case with multiple similar files. Still, I guess it's a feature that would interest people who archive individual files.
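The window effect described above can be reproduced with Python's standard library; zstd isn't in the stdlib, so this sketch uses LZMA2 for both measurements and only varies the dictionary size (file sizes are scaled down from the 16 MB ROMs in the thread):

```python
import lzma
import os

# Eight "ROM variants": the same 1 MiB base with a tiny patch each,
# mimicking a set of similar ROM hacks.
base = os.urandom(1 << 20)
roms = []
for _ in range(8):
    rom = bytearray(base)
    rom[1000:1016] = os.urandom(16)  # small per-file variation
    roms.append(bytes(rom))

def solid_size(files, dict_size):
    """Compress all files as one solid LZMA2 stream with the given dictionary."""
    filters = [{"id": lzma.FILTER_LZMA2, "preset": 1, "dict_size": dict_size}]
    comp = lzma.LZMACompressor(format=lzma.FORMAT_XZ, filters=filters)
    return len(comp.compress(b"".join(files)) + comp.flush())

small = solid_size(roms, 1 << 16)  # 64 KiB window: can't reach the previous copy
large = solid_size(roms, 1 << 24)  # 16 MiB window: spans all eight copies
print(f"small dict: {small / 2**20:.1f} MiB, large dict: {large / 2**20:.1f} MiB")
```

With the small dictionary the output is roughly eight times the single-file size; with a dictionary wider than the whole set, the duplicates collapse to about one file's worth — the same gap nixxou reports between plain zstd and LZMA2 with a large dictionary.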

nixxou commented 2 years ago

There is a zstd parameter that gives good results: the long option. For example, if I do zstd golden.tar --long=30, I only get a 17 MB archive. Sadly, the 7z fork with zstd doesn't seem to support this "long" parameter, so I'm looking for a way to solve this.

nixxou commented 2 years ago

OK, never mind, I think I found a way, and from my preliminary tests it's freaking fast: 7z.exe a -m0=zstd:long=33 archive.7z path/*

As a sample I used a GoldenEye N64 archive with nearly 6 GB of the same 16 MB ROM in many variations (like level hacks). If I compress a 7z file with LZMA2, speed 5, a 64 MB dictionary and 8 threads, I get a 461 MB file created in 6m20s on my computer (and it uses 3.5 GB of RAM while creating it). I know I can do better with only 2 cores and normal compression (235 MB), but that would take an eternity.

Now with 7z.exe a -m0=zstd:long=33 G64.7z "GoldenEye 007*" it took only 20 seconds and I get a 225 MB archive. It's also quicker to decompress than my old 235 MB LZMA2 archive. It's fucking awesome.
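For context on why long=33 is enough here: zstd's long=N option sets the long-distance-matching window to 2^N bytes, so 33 gives an 8 GiB window, wide enough to span the whole ~6 GB ROM set and let every duplicate match against the first copy. A quick sanity check of the sizes:

```python
# zstd's long=N parameter sets the long-distance match window to 2**N bytes.
GIB = 1 << 30
for n in (27, 30, 33):
    print(f"long={n}: window = {(1 << n) / GIB:g} GiB")
# long=33 gives an 8 GiB window, larger than the ~6 GB GoldenEye sample.
```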

Gonna test within the plugin next. Edit: It seems to work fine with the plugin if I just replace the 7z exe and dll.

fraganator commented 2 years ago

Nice work :+1: The decompression speeds are really impressive compared to LZMA2.

nixxou commented 2 years ago

In fact I was not able to reproduce the 35 MB/s vs 230 MB/s numbers, but compared to my heavily compressed LZMA2 file I still get at least double the decompression speed. Definitely worth it.