hrydgard / ppsspp

A PSP emulator for Android, Windows, Mac and Linux, written in C++. Want to contribute? Join us on Discord at https://discord.gg/5NJB6dD or just send pull requests / issues. For discussion use the forums at forums.ppsspp.org.

[Request] Support for CHD format #10417

Closed Danixu closed 11 months ago

Danixu commented 6 years ago

Hello,

I've been using this emulator for about a month, and I've discovered the CHD format. It uses LZMA compression, so it's a space saver (much more so than the CSO format). Right now I store my games in 7z and extract the ISO when I want to play, but CHD gives both benefits (LZMA compression and direct playability).

Please, add support for CHD to this core.

Thanks!!

RinMaru commented 6 years ago

adding it to the standalone version would also be nice since not everyone uses RetroArch

hrydgard commented 6 years ago

We already support CSO format. While it might compress slightly worse, it's still pretty good.

unknownbrackets commented 6 years ago

All such formats will compress files in blocks, which won't compress as well as a 7z of the entire file. Have you actually compared a CHD of an ISO to a CSO?

Note that you can use maxcso to create CSOs with a larger block size, and using 7-zip's deflate algorithm, which compresses better than most CSO programs. Files compressed this way already work fine in PPSSPP.

-[Unknown]

i30817 commented 6 years ago

One 'advantage' of CHD is that chdman and MAME are supposed to support a form of softpatching (parent-clone relationships, where clones store only the files/sectors that differ).

Pity it doesn't work anywhere else (including RetroArch, except, of course, the MAME core)... and they nixed my proposal on their chdman issue tracker to add 'reversible hardpatching' to the format (basically, do the boring work of integrating a 'patch update' tool that takes a binary patch, patches the underlying bytes, creates a reversal patch to store in the header for the next 'patch update', and recompresses).

The big advantage of such schemes is that they don't depend on lazy/dead emulators implementing softpatching (reinventing the wheel every time), while still being easy to update. Emulators still need to implement reading CHD, but that's much easier. The devs thought that since the format already specifies parent/clone relationships this wasn't needed. I'm more pessimistic; complicated features are the first to be handwaved away.

Anyway, CHD is a bad fit for RetroArch right now because the metadata database and scanner aren't ready to accept CHD checksums, internal or external (even the internal checksums differ from the raw cue/bin because they cover a 'sum' of all the involved files, not only the 'first track' or whatever hack RA is using; they previously used cues and gdi, and of course this failed a lot since many gdi are indistinguishable thanks to a dumping-group bad idea, so I'd actually prefer the CHD approach). This may change though, since they seem proud of their CHD support.

ppmeis commented 6 years ago

@unknownbrackets I know the question is not for this topic, but do you have plans to make a GUI for maxcso? What I want to do is compress ISO files while keeping integrity. I mean:

1st - Take the original ISO file and check its CRC32 against a database (for example, I use renascene).
2nd - Compress the ISO file with maxcso (looking at ZSO because, as I read before, it's the best format in terms of compression and load times).
3rd - Decompress the ZSO file and verify the resulting ISO has the same CRC32 as before.

With CSOPlus you get the best compression, but that's because it deletes the UPDATE partition, so the ISO will never match the original once you decompress it.

LunaMoo commented 6 years ago

Is ZSO even meant for maximum compression? From my tests, the available formats ranked from best compression to worst: PBP -> CSO2 with bigger block size -> CSO -> ZSO.

I used The Leecherman's "ISO~PBP converter" for PBP and maxcso for the other formats. Maybe I just don't understand the settings I should use for ZSO, but in my testing it was the fastest while leaving the biggest file.

Anyway, if a few MBs are a big deal, the PBP format seems best to me; it's also supported on the PSP and doesn't modify the file in any way. It's commonly overlooked because of some myths caused by Sony using it for non-PSP games as well, but PPSSPP supports it just fine.

CHD did have slightly better compression even than PBP, but I don't think it's worth storing PSP games in that format; it's not native to the platform and doesn't offer anything that would make it worth using, while coming with all the MAME stuff we don't care about.

Guess I'll change the title, as PPSSPP is not related to RA; it looks like this was posted in the wrong place.

unknownbrackets commented 6 years ago

The performance difference for ZSO only really matters on the 222Mhz CPU of the PSP. A CFW implemented it, so I tried experimenting with it, but it really didn't seem all that worthwhile for desktop.

If your device's CPU runs above 1000 MHz, it's probably going to make almost no difference in speed (for decompression.) And CSO compresses better.

However, LZMA typically has a more significant CPU impact, mostly because it usually uses larger lookbehind buffers, a lot more memory, and a lot more memory searching. This means using CHD might have a non-negligible performance impact compared to CSO/ZSO even on modern devices. Would need to be tested, though.

Not really planning to add a GUI to maxcso, and also not planning to make it do destructive changes.

Note that maxcso is kinda like pngcrush - it tries to compress the same data multiple times to achieve the very best compression ratio. It will usually compress slower than other tools (although can use tons of cores, so not always), but get a better ratio.

-[Unknown]

RinMaru commented 4 years ago

Is there any update on this?

hrydgard commented 4 years ago

No.

lamvuong2019 commented 4 years ago

Sorry to ask, but it seems this hasn't been looked at for a while now. Do you think it's possible at all?

hrydgard commented 4 years ago

How much of a win is CHD over CSO anyway? CSO already successfully squishes games pretty well, as previously mentioned, and we do support that.

LunaMoo commented 4 years ago

Depends on the data, but if I recall it's maybe around 10 MB per GB. The PBP container, which we also support, sits somewhere between max-compression CSO and CHD, making this pretty irrelevant, though there's only Windows software for PBP, so CSO is more widely used.

lamvuong2019 commented 4 years ago

I will run a test now on Final Fantasy Type-0 and will come back to you with the results for CHD, GZ and CSO level 9.

Sanaki commented 4 years ago

I did a quick check on two games:

Dissidia 012 - Duodecim Final Fantasy (USA).iso  1674M
Dissidia 012 - Duodecim Final Fantasy (USA).cso  1291M
Dissidia 012 - Duodecim Final Fantasy (USA).chd  1154M
Patapon (USA).iso  326M
Patapon (USA).cso  211M
Patapon (USA).chd  161M

In theory, chd should also have faster I/O, but I don't have any easy way available to benchmark that.

Tuxie commented 4 years ago

The biggest win with CHD support is being able to use the same tool everywhere, and native fast scan support in dat tools like clrmamepro. The disk space savings are nice too.

lamvuong2019 commented 4 years ago

Isn't PBP used for PSX ISOs only?

I don't know if this is a good example or not:

Final Fantasy Type-0 (English Patched v2)
Original  3084866
CSO       Failed
CHD       2410963  (-22%)
GZ        2434175  (-21%)
https://ibb.co/TLc98LZ

Dante's Inferno (USA) (v1.01)
Original  1770240
CSO       1468477  (-17%)
CHD       1379283  (-22%)
GZ        1424640  (-20%)
PBP       5637395  (+118%)
https://ibb.co/w4y1qM9

Summary of savings: https://ibb.co/8x71gGW

i30817 commented 4 years ago

CHD has support for parent-child relationships, which are both an interesting way to softpatch and an interesting way to save space in multi-CD games (by making CD 1 the 'parent' of CD 2).

I definitely don't recommend the second, of course, since it's kind of a 'semantic' mess, and I'm not sure you can do both at the same time. Or at least you can't do it twice (parent -> parent2 -> final child). It works like this: the child stores the SHA-1 hash of the 'parent', and when loading you must provide both to get the 'complete overlay'.

Notice the complete lack of filenames or paths here. It's up to the application to use a convention to find the 'sets'. Dats to find the filename, path conventions, or a scanning step that collects the SHA-1 of all CHDs in a 'game directory' are all options (I prefer the last because it doesn't depend on users).

Supporting this is currently not possible with libchdr (the non-MAME library handling CHD for other projects).
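
To illustrate the scanning-step convention: libchdr's MAME-derived API already exposes the hashes (chd_read_header, plus a parent argument to chd_open), so if parent resolution ever lands there, the application side might look roughly like this sketch. Error handling is trimmed, and the libchdr entry points are assumed to behave as in MAME:

```c
/* Sketch of the "scan the game directory, match parents by SHA-1" idea
 * using libchdr's MAME-derived C API. Parent resolution wasn't actually
 * implemented in libchdr at the time of this thread, so treat this as a
 * sketch of the application side only. */
#include <string.h>
#include <libchdr/chd.h>

chd_file *open_with_parent(const char *path, const char **scanned, int n)
{
    chd_header hdr;
    chd_file *parent = NULL, *chd = NULL;

    if (chd_read_header(path, &hdr) != CHDERR_NONE)
        return NULL;

    if (hdr.flags & CHDFLAGS_HAS_PARENT) {
        /* The child only stores the parent's SHA-1; match it against
         * every other CHD found in the scanned directory. */
        for (int i = 0; i < n && parent == NULL; i++) {
            chd_header cand;
            if (chd_read_header(scanned[i], &cand) == CHDERR_NONE &&
                memcmp(cand.sha1, hdr.parentsha1, CHD_SHA1_BYTES) == 0)
                chd_open(scanned[i], CHD_OPEN_READ, NULL, &parent);
        }
        if (parent == NULL)
            return NULL;   /* orphan clone: its parent isn't present */
    }

    /* Opening with the parent lets 'copy from parent' hunks resolve. */
    if (chd_open(path, CHD_OPEN_READ, parent, &chd) != CHDERR_NONE) {
        if (parent)
            chd_close(parent);
        return NULL;
    }
    return chd;
}
```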

lamvuong2019 commented 4 years ago

The second method you describe is a form of deduplication; I think it would be a total nightmare and have a massive performance impact if you don't have the necessary cache or CPU to crunch the data.

I think the compatibility and easy maintenance of CHD definitely make it a worthy backup method.

Sanaki commented 4 years ago

As the person who opened that issue, I agree clone support is wonderful, but it's also not terribly important for PSP. It would be quite nice for minor patching, sure, but it's really not as useful as for systems like the PS1 with tons of multidisc games.

Also, PBP is only used optimally for PSP, either for uncompressed PSN titles or for PSX-PSP (which is only of value on real hardware). Most of us ripped our PSN versions to ISO for ease of use though.

EDIT: Clones don't have any performance impact. CHD splits the image into tens of thousands of "compressed hunks of data" (hence the name), each of which is compressed individually based on which compression method is most efficient. Duplicated hunks are already referencing the same data, clones just reference almost all of the data from the parent, barring the few hunks that have changed.

As an example, this is the hunk type breakdown from a clone I made of a translation for a PCE-CD game:

```
     Hunks  Percent  Name
----------  -------  ------------------------------------
       424     1.5%  Copy from self
    18,319    66.4%  Copy from parent
     1,917     6.9%  CD LZMA
       184     0.7%  CD Deflate
     6,743    24.4%  CD FLAC
```

To be clear, clones are a fantastic feature and were it supported I'd definitely use it, but I feel like worrying about them before they're supported by libchdr would be a bit premature. For now, let's just see about getting the basic format handled.
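
To make the "no performance impact" point concrete, here is a simplified model (not libchdr's actual code) of how a CHD-style reader resolves a hunk; a parent reference is just a lookup into another file's map, not extra decompression work:

```c
/* Simplified model of CHD-style hunk resolution (not libchdr's actual
 * implementation; real CHD parent refs use unit offsets, not indices). */
enum hunk_kind { HUNK_COMPRESSED, HUNK_COPY_SELF, HUNK_COPY_PARENT };

struct hunk_entry {
    enum hunk_kind kind;
    unsigned ref;    /* for self-copies: index of the source hunk */
    /* compressed hunks would also carry offset/length/codec */
};

struct image {
    const struct hunk_entry *map;  /* one entry per hunk */
    const struct image *parent;    /* NULL for a standalone image */
};

/* Assumed to exist elsewhere: runs the right codec (LZMA/deflate/FLAC). */
int decompress_hunk(const struct image *img, const struct hunk_entry *e,
                    unsigned char *out);

int read_hunk(const struct image *img, unsigned i, unsigned char *out)
{
    const struct hunk_entry *e = &img->map[i];
    switch (e->kind) {
    case HUNK_COPY_PARENT:
        /* Clone case: ~66% of hunks in the example above simply
         * resolve into the parent image. Same cost as a local read. */
        return read_hunk(img->parent, i, out);
    case HUNK_COPY_SELF:
        return read_hunk(img, e->ref, out);
    case HUNK_COMPRESSED:
        return decompress_hunk(img, e, out);
    }
    return -1;
}
```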

i30817 commented 4 years ago

> The second method you describe is a form of deduplication; I think it would be a total nightmare and have a massive performance impact if you don't have the necessary cache or CPU to crunch the data.

I would expect CHD to be a better format than most for this; it doesn't use silly half-baked features like loading the original file into memory to 'patch' it, which is the Achilles heel of every 'softpatching format expanded to CDs' and the reason all of them are massive failures with CDs. I'm looking at you, BPS.

Libchdr may still screw it up, of course, but the format was designed for exactly this kind of 'deduplication' use case.

And it's not like native filesystem dedup works for ISOs or cue/bin anyway; FS dedup is file-oriented, so it does absolutely nothing for two 99% similar files, and compression-based dedup often screws up and turns that 99% similarity into 20% when there are insignificant but regular offset discrepancies between the two files. In fact, a file compressor is often so busy compressing inside the file that it misses the chance to recognize that two large files are very similar, because its compression window has long since forgotten the first file.

CHD, by contrast, knows those two files are related and uses filesystem block matching. Of course, even CHD may not help if the two CDs themselves use compression or encryption inside their filesystems. Modern games are lame like that; judge it case by case.

hrydgard commented 4 years ago

OK, so a win, but not an enormous one. Some people might find it worthwhile and I do understand that it's nice with a common tool.

So if anyone is interested in implementing this, what you need to do is:

I probably won't get to this in the near future myself, busy with finishing up stuff for 1.10 in the time I have.

Sanaki commented 4 years ago

I don't have the time or expertise to do it myself, but I did toss a $15 bounty on it in case anyone else does. Though bountysource seems to be having some issues right now, so it may take a bit to show up correctly. https://www.bountysource.com/issues/52791233-request-support-for-chd-format

lamvuong2019 commented 4 years ago

I am also willing to support the bounty; it's failing for me too. I tried to top it up with 15 as well... getting a massive red error saying something went wrong. I will try again later to see if it works.

It had an error but went through anyway, kinda strange. My pledge went through, and the strangest thing is that the bounty still shows 0 dollars... Hopefully someone picks this up, as it would be a big bonus for everyone.

Here is proof of the pledge, with no expiration: https://ibb.co/qkKKqD5

Sanaki commented 4 years ago

If it hasn't shown up by tomorrow I'll contact them about it. Given that the transactions are being recorded, it should catch up eventually.

LunaMoo commented 4 years ago

> Isn't PBP used for PSX ISOs only?
>
> I don't know if this is a good example or not:
>
> Final Fantasy Type-0 (English Patched v2): Original 3084866, CSO Failed, CHD 2410963 (-22%), GZ 2434175 (-21%) https://ibb.co/TLc98LZ
>
> Dante's Inferno (USA) (v1.01): Original 1770240, CSO 1468477 (-17%), CHD 1379283 (-22%), GZ 1424640 (-20%), PBP 5637395 (+118%) https://ibb.co/w4y1qM9
>
> Summary of savings: https://ibb.co/8x71gGW

No, PBP is Sony's native container; it's used for all PSP games available from the PS Store. It can also be used to compress ISOs, although there's only one piece of software for that and it's Windows-only ~ https://sites.google.com/site/theleecherman/IsoPbpConverter

You probably used the PSOne software, which is how you ended up with a larger file; that makes no sense, as PBP has better compression than CSO with large chunks, and as far as I recall from testing its size was comparable to CHD.

Adding to your results of FFT0, converted to megabytes for readability:

original(megabytes)                                   3 012
CSO1 lvl 9                                            2 497
cso2 with 4096 chunk                                  2 451
PBP                                                   2 404
cso2 with --block=32768 --format=cso2 --use-zopfli*   2 388
CHD                                                   2 354

So you save 50 MB on a 3000 MB file (PBP, the highest-compression format currently supported in PPSSPP, vs the unsupported CHD); that's roughly 16 MB per 1000 MB. Also, the tool you used for CSO sucks; most modern CSO tools support file sizes above 2 GB. I'd recommend [Unknown]'s maxcso, which also allows compression with larger chunks, producing smaller files than just using level 9 CSO compression.

Edit: included cso2 with [Unknown]'s settings. So that's an 11.33 MB difference per 1004 MB between what we already have and CHD.

unknownbrackets commented 4 years ago

If you want to maximize CSO compression I recommend a 16384 block size; you could go with 32768 + zopfli if you want to save every byte. That might get it near PBP size. Warning: zopfli can easily take over an hour.

I wonder if PBP would improve via "crushing" (not sure what parameters lzrc has to tune, or whether there are any annoying patents in the way there...)

-[Unknown]

lamvuong2019 commented 4 years ago

Don't know what happened; the bounty is still at 0, and bountysource.com is really slow to load. I will contact support to see if they can fix it.

vnctdj commented 4 years ago

@unknownbrackets

> Warning: zopfli can easily take over an hour.

Over an hour? I'm really surprised: I've just compressed Monster Hunter Freedom Unite's ISO (845.6 MB) with the following command: maxcso --block=32768 --use-zopfli input.iso -o output.cso, and it took a bit less than 6 minutes on my laptop's i7-6700HQ to produce a 763.4 MB CSO.

unknownbrackets commented 4 years ago

Right, but you've got 8 threads, and IIRC FF Type-0 is closer to 3 GB. It could take a lot longer for someone on a weaker CPU (maxcso speed is basically linear in ISO size and number of cores, which should hopefully hold up to at least 32 cores). Zopfli specifically can vary a lot in speed depending on the data as well, IIRC.

Does any tool exist to try different LZMA settings at a block level to produce the smallest possible CHD (like maxcso)? Are the provided numbers already this, or might there be lower, more compelling numbers?

-[Unknown]

Danixu commented 4 years ago

I only know of chdman for conversions to CHD, and the only thing you can change is the hunk size, but making it bigger gives a worse compression ratio. I'm converting my whole library to CSO using --block=32768 --use-zopfli to compare against the best compression. So far the result is about a 10% reduction for CHD compared with CSO with just the 7zip option (from 67.3 GB to 61.2 GB).

Sanaki commented 4 years ago

For making CHDs, chdman is the only tool at the moment, partly because nothing else is needed. I wouldn't be surprised if some GUI wrappers show up for it in the future, though. As for options, this is everything you can change when creating:

```
Usage:
   chdman createcd [options], where valid options are:
      --output, -o <filename>: output file name (required)
      --outputparent, -op <filename>: parent file name for output CHD
      --force, -f: force overwriting an existing file
      --input, -i <filename>: input file name (required)
      --hunksize, -hs <bytes>: size of each hunk, in bytes
      --compression, -c <none|type1[,type2[,...]]>: which compression codecs to use (up to 4)
      --numprocessors, -np <processors>: limit the number of processors to use during compression
```

As for the bounty, I sent them a message about it (and referenced your order ID as well), but haven't heard back yet.

unknownbrackets commented 4 years ago

Sure. I'm talking about these: https://stackoverflow.com/questions/3057171/lzma-compression-settings-details

LZMA internally has different options, which are trade-offs. maxcso, for example, tries multiple different zlib options for each block (zopfli is just the slowest one it can try, and the only one disabled by default). This makes it (much) slower than it could be, but improves the compression ratio.

For example, with LZMA, maybe some blocks or "hunks" would improve with a lower hash-bytes or a higher fast-bytes setting. Maybe mc should be higher. If a 1 GB file is made up of 32,768 blocks, each block could use separate settings. If one block is 10% better with one setting, another 5% better with another, etc., you could end up saving several more percentage points.

The downside, of course, is that you basically have to try each of those different settings. If you try 10 different settings for each block to pick the smallest result, you slow down compression by 10x.

But maybe none of these settings really makes a difference when compressing 32,768 bytes.
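
As a concrete sketch of that per-block "try several settings, keep the smallest" approach (not maxcso's actual code, just the shape of it, using zlib's raw-deflate mode as CSO blocks do):

```c
/* Sketch: try several zlib parameter sets on one block and keep the
 * smallest output (the maxcso-style "crush" idea). Not maxcso's code. */
#include <string.h>
#include <zlib.h>

/* Returns compressed size written to dst, or 0 if no attempt helped. */
size_t crush_block(const unsigned char *src, size_t src_len,
                   unsigned char *dst, size_t dst_cap)
{
    /* Candidate (level, strategy) pairs; a real tool tries more. */
    static const int levels[]     = { 9, 9, 7 };
    static const int strategies[] = { Z_DEFAULT_STRATEGY, Z_FILTERED,
                                      Z_DEFAULT_STRATEGY };
    size_t best = 0;

    for (int i = 0; i < 3; i++) {
        z_stream zs;
        memset(&zs, 0, sizeof(zs));
        /* -15 window bits = raw deflate, as stored inside CSO blocks. */
        if (deflateInit2(&zs, levels[i], Z_DEFLATED, -15, 8,
                         strategies[i]) != Z_OK)
            continue;

        unsigned char tmp[64 * 1024];      /* scratch for one block */
        zs.next_in   = (unsigned char *)src;
        zs.avail_in  = (uInt)src_len;
        zs.next_out  = tmp;
        zs.avail_out = sizeof(tmp);

        int rc = deflate(&zs, Z_FINISH);
        size_t out_len = zs.total_out;
        deflateEnd(&zs);

        if (rc == Z_STREAM_END && out_len <= dst_cap &&
            (best == 0 || out_len < best)) {
            memcpy(dst, tmp, out_len);
            best = out_len;                /* keep the smallest */
        }
    }
    return best;  /* caller stores the block raw if this isn't smaller */
}
```

The same pattern would apply to LZMA's lc/lp/pb or fast-bytes parameters; the cost is simply one full compression pass per candidate setting.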

I've actually considered what compression could look like using a dictionary (like zstd). It would sacrifice maybe 64 KB (compressed) of the overall file to have a few dictionaries each block could start from. In compression, this is the typical solution for small files, but it works best when they share patterns. ISOs probably do, but I haven't tried it out to see what savings this could yield.

Another idea (without any "sacrifice") would be to chain blocks together, i.e. use every 16th block as the dictionary for the following 15. This would help more than the above if there's a lot of data locality in the ISO, which again might be the case, but that's unproven.
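
For illustration only (no ISO container actually does this), the dictionary variant maps directly onto zstd's one-shot dictionary API; the dictionary blob here is assumed to have been trained beforehand, e.g. with zstd's zdict tooling:

```c
/* Sketch: compress each block against a shared, pre-trained dictionary.
 * Purely illustrative; neither CSO nor CHD works this way today. */
#include <zstd.h>

size_t compress_block_with_dict(ZSTD_CCtx *cctx,
                                void *dst, size_t dst_cap,
                                const void *block, size_t block_len,
                                const void *dict, size_t dict_len)
{
    /* Each block starts from the same dictionary, so blocks remain
     * independently seekable while still sharing common byte patterns. */
    size_t n = ZSTD_compress_usingDict(cctx, dst, dst_cap,
                                       block, block_len,
                                       dict, dict_len,
                                       19 /* compression level */);
    return ZSTD_isError(n) ? 0 : n;
}
```

The chained variant would instead pass the preceding raw blocks as the dictionary, trading random-access granularity for ratio.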

I know this is a bit off the topic of CHD support specifically, but if you care about compressing your files the above is possibly interesting.

-[Unknown]

Sanaki commented 4 years ago

You may want to look at the issues around the CHD format itself on the MAME repo, since that's where chdman is developed. Supposedly they're slowly working on v6 of the format right now, though they haven't said anything publicly about it. Regardless, I've seen no sign that there's any way to fine-tune compression methods, probably because it attempts to optimize each hunk separately, and forcing options universally would hurt both performance and compression ratio.

Tuxie commented 4 years ago

Part of the idea with CHD is that it's supposed to be deterministic. Everyone with the same source data should get the same resulting CHD file. ROM managers can convert back and forth between split/merged sets and validate and identify the files using checksums. When CHD v6 is released, everyone can convert their CHD v5 collection to CHD v6 themselves, then point a CHD v6 torrent to the converted files without having to redownload anything.

LunaMoo commented 4 years ago

@vnctdj are you sure you didn't make a typo in the command? For me, compressing FFT0 took around an hour with:

--threads=10 --block=32768 --format=cso2 --use-zopfli

and with the same settings MHFU compressed in ~4 minutes, BUT I got a significantly smaller size: 736 MB instead of your 763 MB (the starting size was the same). Though I do use cso2 instead of the default cso1, so maybe that makes the difference.

Also interesting: in this game, CSO2 was smaller than CHD by 4 MB, though maybe I just don't know how to use chdman; I used only the "createcd" parameter.

Edit: ~~Derp, maybe cso v2 was just unsafe to use ;c, seems it corrupted the results, oh well.~~ Actually the corruption is caused by the threads parameter. Removing it resulted in the same file size, but a properly made CSO. (Note I have more threads, but played some native game in the background ;p.) It's weird that I got a smaller file size; maybe yours is a typo and 763 should have been 736. I re-tested without limiting threads and got proper cso and cso2 files, both at 736 MB, 4 MB smaller than the CHD.

Sanaki commented 4 years ago

Bountysource refunded me and said they'll update the community once they have the problem resolved. So for anyone looking to solve this: even if you do so before that's fixed, I'll still put the bounty back up for you to claim once I can, but, uh... I can't really do that yet.

lamvuong2019 commented 4 years ago

Same here, I also got a refund, so if anyone is interested the money is there, and I'll re-add the bounty when I get the OK from them.

lamvuong2019 commented 4 years ago

> @vnctdj are you sure you didn't make a typo in the command? For me, compressing FFT0 took around an hour with:
>
> --threads=10 --block=32768 --format=cso2 --use-zopfli
>
> and with the same settings MHFU compressed in ~4 minutes, BUT I got a significantly smaller size: 736 MB instead of your 763 MB (the starting size was the same). Though I do use cso2 instead of the default cso1, so maybe that makes the difference.
>
> Also interesting: in this game, CSO2 was smaller than CHD by 4 MB, though maybe I just don't know how to use chdman; I used only the "createcd" parameter.
>
> Edit: ~~Derp, maybe cso v2 was just unsafe to use ;c, seems it corrupted the results, oh well.~~ Actually the corruption is caused by the threads parameter. Removing it resulted in the same file size, but a properly made CSO.

I just think that moving to an updated, supported format aimed at emulation is definitely a benefit for everyone, especially when MAME has a massive community that will keep things moving.

It's like the USB / Thunderbolt / Apple cable situation... we now have only one USB-C... but we still have Apple, who isn't following the trend and is sticking to their Lightning cable.

unknownbrackets commented 4 years ago

Can you create an issue in maxcso if threads isn't working? I just tried it on a smaller ISO (I don't have FFT-0) and it worked, but maybe something is up...

Tuning different parameters per block (which chdman doesn't seem to do) definitely would produce the same results every time. It'll just use tuned parameters per block, but which parameters are best per each block will stay constant. That said, I have little interest in the large for-pirating torrent use case.

From looking briefly, "createcd" may be the wrong command to use, as it seems to try to automatically strip and reapply the error correction bytes. Not sure if it has a worse ratio in ISOs that don't have these.

-[Unknown]

Sanaki commented 4 years ago

No, createcd is the correct command to use for ISO files. Though I'm not sure what you're referring to by error correction bytes; I thought those were only present in 2352-byte BINs, not ISOs (which are all 2048).

LunaMoo commented 4 years ago

> I just think that moving to an updated, supported format aimed at emulation is definitely a benefit for everyone, especially when MAME has a massive community that will keep things moving.
>
> It's like the USB / Thunderbolt / Apple cable situation... we now have only one USB-C... but we still have Apple, who isn't following the trend and is sticking to their Lightning cable.

CHD is more aimed at piracy and torrenting full console game libraries than anything else; its aim is definitely not emulation. MAME itself is also in the gray zone of emulation, with very few of its users owning even a single one of the games it supports, and probably zero people owning the full library, given the price of buying and storing all the arcade cabinets and special gear they came with. Just replacing the word "piracy" with "preservation" will not make it legal.

CSO, PBP, ZSO etc., on the other hand, are great to support in a PSP emulator, because they originated on the PSP and are supported there, just as they are on other Sony consoles that support PSP emulation with some homebrew enhancements.

Also, even though this analogy is poor (hardware vs software hardly ever translates over), we don't have "only USB-C" either; other USB standards are just as widely used, as they're cheaper to manufacture and/or make more sense in different devices. USB-C is pretty much only standard in modern smartphones, and even then you have to carry your own cable around just like iOS users, because not everyone owns one. Worth mentioning: older smartphones will never get USB-C support, so your example points to the opposite of what you'd wish.

i30817 commented 4 years ago

The only compressed CD file format with sensible softpatching, metadata management, and unique checksums (all non-single-file formats fail at this, or don't memoize it, even if I like the separate-track approach for xdelta binary patching) is 'just for piracy'? It's not multi-CD, but in my biased opinion that's good, since multi-CD would complicate the interesting parts like the softpatching or the checksum checks for downstream users... though it could have some possibilities for compression, since filesystem-aware parsing might get better block-compression results by recognizing that game data is shared across discs, something agnostic stream compressors mostly fail at because of their pathetic compression windows and modern games' fascination with compressed virtual filesystems.

A bold claim, which I suggest you reconsider sooner or later. That digression about how 'the others originated on the PSP' as a justification is quite amusing, considering I well remember people complaining that the official PS1 reissues in PBP needed reconversion into pirate copies because of Sony's habit of lowering the audio quality. Why are you even invested against this?

unknownbrackets commented 4 years ago

I think a discussion about the finer points of in what ways file formats can be inherently evil, or are just tools that can be used in good or bad ways, is a discussion that belongs somewhere other than this issues list (perhaps the forum or reddit or something?)

And to note, the Thunderbolt comparison can also definitely be spun multiple ways too, which I'll avoid clarifying to avoid throwing more flames on the fire. As long as we're not talking about removing CSO/PBP/etc. support when adding CHD, it probably doesn't matter which is more like USB-C anyway. Again, let's keep arguments about the finer points of that elsewhere too.

Sorry if I veered this off topic.

-[Unknown]

lamvuong2019 commented 4 years ago

I don't know how or why we went from "hey, can we please have a new, flexible format for storing our backup images" to a piracy conversation...

lamvuong2019 commented 4 years ago

> I think a discussion about the finer points of in what ways file formats can be inherently evil, or are just tools that can be used in good or bad ways, is a discussion that belongs somewhere other than this issues list (perhaps the forum or reddit or something?)
>
> And to note, the Thunderbolt comparison can also definitely be spun multiple ways too, which I'll avoid clarifying to avoid throwing more flames on the fire. As long as we're not talking about removing CSO/PBP/etc. support when adding CHD, it probably doesn't matter which is more like USB-C anyway. Again, let's keep arguments about the finer points of that elsewhere too.
>
> Sorry if I veered this off topic.
>
> -[Unknown]

Nothing has been requested to be removed, and we are not recreating anything; we're just adding a feature that other platforms already have into PPSSPP.

vnctdj commented 4 years ago

> @vnctdj are you sure you didn't make a typo in the command? For me, compressing FFT0 took around an hour with:
>
> --threads=10 --block=32768 --format=cso2 --use-zopfli
>
> and with the same settings MHFU compressed in ~4 minutes, BUT I got a significantly smaller size: 736 MB instead of your 763 MB (the starting size was the same). Though I do use cso2 instead of the default cso1, so maybe that makes the difference.
>
> Also interesting: in this game, CSO2 was smaller than CHD by 4 MB, though maybe I just don't know how to use chdman; I used only the "createcd" parameter.
>
> Edit: ~~Derp, maybe cso v2 was just unsafe to use ;c, seems it corrupted the results, oh well.~~ Actually the corruption is caused by the threads parameter. Removing it resulted in the same file size, but a properly made CSO.

@LunaMoo, I confirm I didn't make a typo. It's indeed weird... The only thing I can think of: do you use the USA version of the game? I use the EU version. But if that were the reason for the file size difference, it would be surprising given how close the versions are!

Sanaki commented 4 years ago

In light of Bountysource's lack of reliability and recent decisions, I won't be refiling the bounty with them. If another bounty provider crops up, I'll place it there. Otherwise I'll still honor the promise I made if this issue is resolved, but it may have to be paid directly over PayPal.

liberodark commented 3 years ago

I think it would be good to use https://github.com/rtissera/libchdr to fix this issue.
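
libchdr exposes a small C API (chd_open, chd_get_header, chd_read), so the core of a read path is short. Below is a minimal sketch assuming a CHD whose units are plain 2048-byte data sectors; a real PPSSPP block device would cache the decoded hunk instead of re-reading it per sector, and would handle CD-type CHDs, whose units are full 2448-byte frames rather than bare user data:

```c
/* Minimal sketch of reading 2048-byte ISO sectors through libchdr.
 * Assumes units are bare 2048-byte data sectors and that 2048 divides
 * the hunk size; a real implementation would cache the decoded hunk. */
#include <stdlib.h>
#include <string.h>
#include <libchdr/chd.h>

int read_iso_sector(chd_file *chd, unsigned sector, unsigned char out[2048])
{
    const chd_header *hdr = chd_get_header(chd);
    UINT64 offset  = (UINT64)sector * 2048;
    UINT32 hunk    = (UINT32)(offset / hdr->hunkbytes);
    UINT32 in_hunk = (UINT32)(offset % hdr->hunkbytes);

    unsigned char *buf = malloc(hdr->hunkbytes);
    if (!buf)
        return -1;

    /* chd_read decompresses one whole hunk into buf. */
    if (chd_read(chd, hunk, buf) != CHDERR_NONE) {
        free(buf);
        return -1;
    }
    memcpy(out, buf + in_hunk, 2048);
    free(buf);
    return 0;
}
```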

chinarut commented 3 years ago

I notice most of the comments here center on compression ratio (and, if I understand correctly, flexibility). Just want to chime in that "time to compress" is an important criterion for me, since massively aging hardware (a 2010 Mac mini server with patched 10.11.6 and 10 GB RAM) is doing all the compression.

Earlier today I accidentally converted about 7 ISOs using chdman (v0.226) and braced myself to spend another hour creating CSOs using PSP CSO converter.

I was a bit shocked it took 50 seconds to compress an 856 MB ISO (logical size 1 GB) at compression level 6 down to 770 MB. chdman takes ~13 minutes at the same compression level (and also produces literally 770 MB!).

While I am all for unifying around CHD, I can now see how CSOs work well with PPSSPP.

unknownbrackets commented 3 years ago

You'll get better performance and possibly better compression using maxcso with the --fast parameter. See the compression options table in maxcso's README for more.

This is a good chart for comparisons of compression/decompression time and ratio tradeoffs: https://quixdb.github.io/squash-benchmark/unstable/

LZMA is known to be slow both to compress and decompress.

-[Unknown]