FIX94 / Nintendont

A Wii Homebrew Project to play GC Games on Wii and vWii on Wii U

Added support for ZISO Format #1136

Closed JoseAaronLopezGarcia closed 9 months ago

JoseAaronLopezGarcia commented 9 months ago

Adds support for the ZISO (.zso) file format. ZISO is a compressed ISO format that uses LZ4 compression. The format is extremely lightweight and optimized for embedded devices, making it a good fit for compressing GameCube ISO images.

The ZSO format originated on the PSP and is already supported on the PS2; with this, Nintendont also gets support for the format.

To compress ISO images into ZSO you can use this tool: https://github.com/PSP-Archive/ARK-4/blob/main/contrib/PC/ciso/ziso.py (mirror: https://github.com/ps2homebrew/Open-PS2-Loader/blob/master/pc/ziso.py)
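For context, CISO and ZISO share the same container layout; only the magic and the per-block compressor differ. A hedged sketch of the header and index decoding in C (field and function names are my own; consult ziso.py for the authoritative layout):

```c
#include <stdint.h>

/* Hedged sketch of the CISO/ZISO container header (24 bytes, little endian).
 * Field names are illustrative, not taken from Nintendont. */
typedef struct {
    char     magic[4];     /* "CISO" (DEFLATE) or "ZISO" (LZ4) */
    uint32_t header_size;  /* usually 0x18 */
    uint64_t total_bytes;  /* uncompressed ISO size */
    uint32_t block_size;   /* typically 2048, matching the DVD sector size */
    uint8_t  version;
    uint8_t  align;        /* index offsets are shifted left by this */
    uint8_t  reserved[2];
} ZsoHeader;

/* After the header comes an index of (total_bytes / block_size) + 1
 * uint32 entries. The top bit of an entry marks an uncompressed block;
 * the remaining 31 bits, shifted by `align`, give the file offset. */
#define ZSO_PLAIN_FLAG 0x80000000u

static uint64_t zso_block_offset(uint32_t entry, uint8_t align) {
    return (uint64_t)(entry & ~ZSO_PLAIN_FLAG) << align;
}

static int zso_block_is_plain(uint32_t entry) {
    return (entry & ZSO_PLAIN_FLAG) != 0;
}
```

The compressed size of block i then falls out of the index itself, as the difference between the offsets of entries i and i+1.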

Build: nintendont_ziso.zip

GerbilSoft commented 9 months ago

Nice; will take a look at this over the weekend, along with the previously mentioned exit hangs when using the latest dkP.

JoseAaronLopezGarcia commented 9 months ago

Note: it might be worth getting rid of the duplicated lz4.c/h files. I duplicated them to ease development, since we need to be able to read ZSO on both PPC (GUI) and ARM (kernel). Also, there seems to be an issue with detecting endianness on ARM (SDK-related, perhaps?); I had to force lz4.h in the kernel to big endian.

GerbilSoft commented 9 months ago

From lz4.c:

```c
/*
 * Little Endian or Big Endian ?
 * Overwrite the #define below if you know your architecture endianess
 */
#define LZ4_BIG_ENDIAN 1
#if 0
#if defined (__GLIBC__)
#  include <endian.h>
#  if (__BYTE_ORDER == __BIG_ENDIAN)
#     define LZ4_BIG_ENDIAN 1
#  endif
#elif (defined(__BIG_ENDIAN__) || defined(__BIG_ENDIAN) || defined(_BIG_ENDIAN)) && !(defined(__LITTLE_ENDIAN__) || defined(__LITTLE_ENDIAN) || defined(_LITTLE_ENDIAN))
#  define LZ4_BIG_ENDIAN 1
#elif defined(__sparc) || defined(__sparc__) \
   || defined(__powerpc__) || defined(__ppc__) || defined(__PPC__) \
   || defined(__hpux)  || defined(__hppa) \
   || defined(_MIPSEB) || defined(__s390__)
#  define LZ4_BIG_ENDIAN 1
#else
/* Little Endian assumed. PDP Endian and other very rare endian format are unsupported. */
#endif
#endif
```

There's no check for big-endian ARM here. Add a check for `defined(__ARMEB__)` and it'll properly detect big-endian ARM. (This should probably be upstreamed, though big-endian ARM is ridiculously rare outside of Wii/Wii U.) [This macro is defined by gcc; to get the full list of predefined macros, run `arm-none-eabi-gcc -mbig-endian -dM -E - </dev/null`.]
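A sketch of what the patched detection chain could look like with the suggested `__ARMEB__` check added (abridged from the lz4.c excerpt above; unlike upstream, this version also defines the macro to 0 on little endian so it can be cross-checked at runtime):

```c
#include <stdint.h>

/* Abridged endianness detection from lz4.c, with a big-endian ARM check
 * added. __ARMEB__ is predefined by gcc when targeting big-endian ARM
 * (e.g. arm-none-eabi-gcc -mbig-endian). */
#if defined(__ARMEB__) \
 || defined(__powerpc__) || defined(__ppc__) || defined(__PPC__) \
 || defined(__sparc) || defined(__sparc__) \
 || defined(_MIPSEB) || defined(__s390__) \
 || ((defined(__BIG_ENDIAN__) || defined(__BIG_ENDIAN) || defined(_BIG_ENDIAN)) \
     && !(defined(__LITTLE_ENDIAN__) || defined(__LITTLE_ENDIAN) || defined(_LITTLE_ENDIAN)))
#  define LZ4_BIG_ENDIAN 1
#else
#  define LZ4_BIG_ENDIAN 0
#endif

/* Runtime probe: on a big-endian host, the most significant byte of a
 * uint32_t comes first, so the first byte of the value 1 is 0. */
static int is_big_endian_runtime(void) {
    const uint32_t one = 1;
    return *(const uint8_t *)&one == 0;
}
```

On a correctly detected build, `LZ4_BIG_ENDIAN` and `is_big_endian_runtime()` agree.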

To de-duplicate the files, you'd want to do something similar to the fatfs directory. One directory with both ppc and arm build targets.

carnage702 commented 9 months ago

So there isn't even a tool that is simple for users to compress GC discs to ZSO? I'm not going to go through the command line to test this, and most users wouldn't ever be able to do that either.

JoseAaronLopezGarcia commented 9 months ago

> So there isn't even a tool that is simple for users to compress GC discs to ZSO? I'm not going to go through the command line to test this, and most users wouldn't ever be able to do that either.

How is this an issue? You underestimate users. On PS2 everyone used the script, nobody complained, some people even use it on Android. But if you don't like command line that much, some GUI apps have been made on the PS2 scene.

carnage702 commented 9 months ago

Been doing compressions and it doesn't compress?

Luigi's Mansion trimmed: 272 MB; compressed to ZSO: 1.34 GB. Star Fox Adventures trimmed: 777 MB; compressed to ZSO: 1.27 GB.

Unless I'm doing something wrong, the values are way off here. Or is ZSO not meant for GC ISOs, so the files are much bigger than trimmed files? The ZSOs seemed to play fine, but they are much bigger than trimmed ISOs that also play just the same.

JoseAaronLopezGarcia commented 9 months ago

I'm getting similar results too. Compression isn't very good. LZ4 is very weak in terms of compression ratio, but it offers the best speeds and memory consumption. It also doesn't help much that GC games are already very compressed and rarely have any dummy files (which is where compression works best).

The only way to improve this is by using bigger block sizes. The standard is to use 2K blocks (which matches DVD sector size), but with bigger blocks such as 8K or even 16K we should have better compression. The only problem is that it would consume more ram (2 buffers the size of a block each, so 16K for 8K blocks, or 32K for 16K blocks).

I have no idea how much free RAM we have available for this, it was my understanding that Nintendont is already pretty tight, hence why I went for the traditional 2K blocks.
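To make the RAM cost concrete: reading one block needs a buffer for the compressed input and one for the decompressed output, each sized by the block size. A hedged, self-contained sketch of how a reader could locate block i from two consecutive index entries and size those buffers (helper names are mine, not Nintendont's):

```c
#include <stdint.h>
#include <stddef.h>

#define ZSO_PLAIN_FLAG 0x80000000u

/* Hypothetical helper: where block i lives on disk and how many
 * stored bytes to read, derived from index entries i and i+1. */
typedef struct {
    uint64_t offset;       /* file offset of block i */
    uint32_t stored_size;  /* bytes to read (<= block size) */
    int      is_plain;     /* nonzero: stored uncompressed, skip LZ4 */
} ZsoBlockSpan;

static ZsoBlockSpan zso_block_span(uint32_t idx_i, uint32_t idx_next, uint8_t align) {
    ZsoBlockSpan s;
    uint64_t start = (uint64_t)(idx_i    & ~ZSO_PLAIN_FLAG) << align;
    uint64_t end   = (uint64_t)(idx_next & ~ZSO_PLAIN_FLAG) << align;
    s.offset      = start;
    s.stored_size = (uint32_t)(end - start);
    s.is_plain    = (idx_i & ZSO_PLAIN_FLAG) != 0;
    return s;
}

/* Working memory for one in-flight block: compressed input plus
 * decompressed output. 2K blocks -> 4 KiB; 8K -> 16 KiB; 16K -> 32 KiB. */
static size_t zso_buffers_bytes(size_t block_size) {
    return 2 * block_size;
}
```

When `is_plain` is zero, the output buffer would be filled by LZ4 (e.g. `LZ4_decompress_safe`); otherwise the stored bytes are copied through as-is.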

Another thing we can try is Shrink+ZSO, that might give us better results, but I have no idea about compatibility.

carnage702 commented 9 months ago

> I'm getting similar results too. Compression isn't very good. LZ4 is very weak in terms of compression ratio, but it offers the best speeds and memory consumption. It also doesn't help much that GC games are already very compressed and rarely have any dummy files (which is where compression works best).
>
> The only way to improve this is by using bigger block sizes. The standard is to use 2K blocks (which matches DVD sector size), but with bigger blocks such as 8K or even 16K we should have better compression. The only problem is that it would consume more ram (2 buffers the size of a block each, so 16K for 8K blocks, or 32K for 16K blocks).
>
> I have no idea how much free RAM we have available for this, it was my understanding that Nintendont is already pretty tight, hence why I went for the traditional 2K blocks.
>
> Another thing we can try is Shrink+ZSO, that might give us better results, but I have no idea about compatibility.

Seems like it's a waste on GC ISOs, to be honest. Nintendont already runs GC games at a higher framerate than the GC, and load times are much shorter too if you use unlocked read speed in the Nintendont settings (though this affects game compatibility), thanks to using the Wii CPU. So there aren't any performance gains from further compression, in my opinion.

And with a compression that makes the size way bigger than trimming, I don't see why anyone would go through that; trimming plus compression would gain you what, 50 MB? Like you said, developers already did a lot of compression due to the GC disc size being 1.35 GB vs the PS2's 4.7 GB DVDs. So there really isn't much to gain from all this work, sadly, if anything at all. From what I tested, I made 5 ZSO ISOs and they are all way bigger than the trimmed versions, or the same size.

carnage702 commented 9 months ago

Also, Nintendont's free RAM depends on the emulated memory card size: the bigger the size the user picks, the less RAM there is. There are already several RAM issues when users pick a 16 MB emulated memory card, which can even break some game features like disc swapping due to eating the entire RAM.

JoseAaronLopezGarcia commented 9 months ago

We could try another format with a better compression algorithm. I'm thinking of DAX.

Sadly we can't use CSO (Compressed ISO) because it clashes with the CSO implementation of Nintendont (Compact ISO), so we can have one or the other but not both.

CSO and ZSO are basically the same format, differing only in the compression algorithm (DEFLATE vs LZ4), so with very little tweaking we could add CSO support, but we would lose support for the other CSO (Compact ISO) format.
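Since the two containers are identical apart from the magic and the per-block compressor, a reader could dispatch on the first four bytes; a hedged sketch (enum and function names are illustrative, not Nintendont's):

```c
#include <string.h>

/* The container layout is shared; only the magic and the per-block
 * compressor differ. Names below are illustrative. */
typedef enum {
    ISO_FORMAT_UNKNOWN,
    ISO_FORMAT_CSO,  /* "CISO": DEFLATE-compressed blocks */
    ISO_FORMAT_ZSO   /* "ZISO": LZ4-compressed blocks */
} IsoFormat;

static IsoFormat detect_iso_format(const unsigned char magic[4]) {
    if (memcmp(magic, "CISO", 4) == 0) return ISO_FORMAT_CSO;
    if (memcmp(magic, "ZISO", 4) == 0) return ISO_FORMAT_ZSO;
    return ISO_FORMAT_UNKNOWN;
}
```

This is exactly where the clash described above would surface: Nintendont already claims "CISO" for Compact ISO, so the DEFLATE branch cannot coexist with that existing handler.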

We could also try with DAX, which also uses DEFLATE instead of LZ4.

Now, we know that DEFLATE is slower, but as you said, we're basically running GC games overclocked, so we have enough CPU cycles to go around. Thus I'm not worried about DEFLATE being slower, but I am worried about RAM.

Would a difference of 16KB be bad?

Perhaps before writing any more code we should do the following:

In any case, it should be fairly easy to tweak the code I've made for ZSO and have it read DAX instead, but I don't want to do it until I know that it will actually give good results.

JoseAaronLopezGarcia commented 9 months ago

I'm going to play around with the DAX format. I think that one is perfect for GameCube: the 8K block size and the DEFLATE algorithm should give us a much better compression ratio.

The Starlet IOP should be able to handle DEFLATE decompression without much issue, and the ZSO reader code can easily be tweaked to read DAX instead.

I'll do this on a different branch so we'll have them separate; I don't see a reason to have more than one compressed format. For now, the main thing is to figure out a lightweight inflate library; I have a few interesting choices.

carnage702 commented 9 months ago

Closing this since its benefits aren't worthwhile.