F32Organization / Faithful32-1.7.10

1.7.10 repository for Faithful 32 Modded Edition.
http://f32.me/

Suboptimal image compression #281

Open Zyx-2000 opened 9 years ago

Zyx-2000 commented 9 years ago

I noticed that many textures in this pack are far from optimally compressed, making the download size unnecessarily big, and perhaps also impacting loading performance. If the people responsible for this pack agree, I can recompress all textures to a more optimal size (lossless, of course), and make a pull request.

Many textures take up ~4.5 kB, which is more than the 4 kB they would use if saved as raw data, and a lot more than the ~1 kB possible with better compression. Some of the existing textures have decent compression (~1.5 kB), which in many cases can be reduced even further. For many mods, the total size of the textures can be halved.
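
(For reference, assuming the usual 8-bit RGBA layout, a 32×32 texture is 32 × 32 × 4 = 4096 bytes of raw pixel data, which is where the 4 kB figure comes from.)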

ghost commented 9 years ago

You are welcome to try, but please test all the textures you compress in the game. I know it's a pain to check that many, but last time I let someone try compressing a bunch of textures, it created some visual glitches and I had to revert over 25 thousand files.

Zyx-2000 commented 9 years ago

Do you know of any way to view a texture in-game without installing all the mods? It would reduce the amount of work required a lot, since my computer wouldn't be able to handle that many mods at once. Otherwise I might only recompress the textures for the mods in my modpack.

I'd guess the guy who tried recompressing earlier used some lossy compression method, thus causing the glitches. Perhaps he forced images down to 256 colours using paletted compression when it wasn't suitable.

ghost commented 9 years ago

Supposedly PNGOUT was used. To view the textures properly, you will have to have the mods installed. I assume several of them use special rendering because I know both ThaumCraft amber and Forestry apatite had issues, and those are just the two I remember.

Honestly, I am not all that concerned about compression; texture loading seems to be the quickest part of the startup.

Zyx-2000 commented 9 years ago

After a few seconds of testing, I discovered that Minecraft doesn't seem to handle grayscale images correctly. They get a much brighter colour than intended.

ghost commented 9 years ago

Indeed, now that I recall, that was another issue that was discovered. I believe the vanilla items have already undergone optimization and it worked well, and I think that's because none of them are completely grayscale.

StefanTT commented 9 years ago

Converting the images to indexed colors usually helps. But as it makes manipulating them harder I did not do it for most of my images.

Maybe we could have a script that converts the images to indexed colour when that makes the file smaller, and run it when creating a release package?
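
Something like the following could be a starting point. It's only a rough sketch, assuming ImageMagick (convert, compare) and GNU stat are available, and it keeps the indexed copy only when it is pixel-identical to the original and actually smaller:

#Try converting each texture to an indexed-colour (PNG8) copy.
for png in `find . -name "*.png"`;
do
    convert "$png" PNG8:indexed.png
    #Count differing pixels; keep the copy only if it is identical and smaller.
    diff_pixels=`compare -metric AE "$png" indexed.png null: 2>&1`
    if [ "$diff_pixels" = "0" ] && [ `stat -c %s indexed.png` -lt `stat -c %s "$png"` ] ; then
        mv -f indexed.png "$png"
    else
        rm -f indexed.png
    fi
done;

Textures with more than 256 colours or partial transparency won't survive the PNG8 conversion unchanged, so the identity check simply leaves them alone, and since the output is indexed rather than grayscale it should also sidestep the grayscale rendering issue mentioned above.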

Zyx-2000 commented 9 years ago

I first noticed the grayscale issue on the vanilla stone bricks; the fact that the problem didn't appear on the mossy variant was what told me that it is a grayscale issue.

Anyway, what I'm going to do is make compressed textures for the mods in my modpack over time, uploading them once I've tested them. I'll also make my own edition of this pack, containing support only for those mods, for use by myself and friends.

mouse-an commented 9 years ago

Well, I just ran pngcrush through all folders with the "-brute -l 9" arguments. According to 7-zip, the original pack is 54 345 453 bytes, and after pngcrush the uncompressed folder is 33 479 819 bytes.

Tested with the mods I'm using - no difference from the original pack. If anyone is interested: https://dl.dropboxusercontent.com/u/43499309/F32-1.7.10_crushed.zip

Evolution0 commented 9 years ago

Was gonna mention using -reduce with pngcrush but MC doesn't like that at all.. (see below)

Edit: Either -reduce is causing issues or my version of pngcrush is borked; will keep testing. It turns out -reduce actually causes the grayscale issue (it converts textures that have no colour to grayscale), and as @Zyx-2000 mentioned, MC doesn't handle that correctly. From what I can see the gAMA data is also messed up. I say this because the crushed images with the bad gamma look fine in a plain old image viewer, but used in-game as a texture, or opened in Photoshop, they look incorrect.
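
For reference, a quick way to spot the problem files (a rough check, assuming ImageMagick's identify is installed) is to list any PNGs whose colour space ended up as grayscale after crushing:

#List every crushed PNG whose colour space ended up as grayscale.
find . -name "*.png" -exec identify -format '%[colorspace] %i\n' {} \; | grep '^Gray'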

If adding an optional compacted release package seems like a good idea, extending the compilation script to do so is quite easy; here is an example:

#1.7.10 build portion.
#Check to see if build variable is set to true or false, then perform appropriate actions.
if [ "$B1710" == "true" ] ; then
    #Unpack downloaded .zip file. -q performs the operation with less screen output. -qq would be even quieter.
    unzip -q $DL2
    #Switch to unpacked directory for 1.7.10.
    cd ./Faithful32-1.7.10-master
    #Remove the TODO and OldModTextures folders and any *.md files.
    rm -r TODO/ ; rm *.md ; rm -r OldModTextures/
    #Package contents of unpacked directory into .zip file with silent and recursive modes on.
    zip -rq F32-1.7.10.zip *
    #Copy .zip file to web directory, replacing old version forcefully.
    cp -f F32-1.7.10.zip $BASE
    #Crush every texture (via a temporary file), then package the crushed pack.
    for png in `find . -name "*.png"`;
    do
        pngcrush -q -brute -l 9 "$png" temp.png
        mv -f temp.png "$png"
    done;
    zip -rq F32-1.7.10-Crushed.zip *
    #Copy .zip file to web directory, replacing old version forcefully.
    cp -f F32-1.7.10-Crushed.zip $BASE
    #Switch back to Git directory in preparation for removing unpacked folder.
    cd $GIT
    #Remove unpacked folders.
    rm -r Faithful32-1.7.10-master
    #Delete old checksum file then replace it with the new one.
    rm $SUM4
    mv $SUM6 $SUM4
fi

Crushing after zipping the regular release is intentional; crushing takes a bit of time, and this avoids regular releases being held up by the crushing process.

Would also only require minor editing of the site php/css.

ghost commented 9 years ago

If the compression is lossless and there are no texture issues, I will just use that for the normal packs instead of offering a separate version.

Evolution0 commented 9 years ago

It doesn't seem to so far; it was mainly -reduce converting B&W textures to grayscale that caused problems.

Also edited my script example, forgot to zip it after it was done. Though you will probably reorganize the script if it becomes the main release anyway.

ghost commented 9 years ago

Have you checked the final size if -brute is not used? Apparently that checks each image with every algorithm instead of using heuristics to select just a few. I can imagine that dramatically increasing build time.

Evolution0 commented 9 years ago

I'll do a run of each to see. While I think it's not actually much (it's only a few ms per file on average), I suppose that can add up when you have something like 25k files.

ghost commented 9 years ago

Yeah, it's in the range of 20 thousand files I believe. If possible I would also like to know how long each run takes and what processor you are using.

Evolution0 commented 9 years ago

Took about an hour for the -l 9 only build. My processor is a Phenom II X4 B55 @ 3.2 GHz.

Doing the -brute build now (for build time only; I have the size from a previous untimed build).

Size difference between them: Non-brute: 39,168 KB; Brute: 37,850 KB.

Keep in mind the final zip is uncompressed; the files are just thrown into a store archive, as attempting to compress them further actually increases the size (the crushed PNG data is already close to maximum entropy, so zip's deflate only adds overhead).
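
For reference, the packaging step in the script above could store the files without recompressing them by using zip's -0 (store only) compression level:

#Package the crushed textures without recompressing them (-0 = store only).
zip -rq -0 F32-1.7.10-Crushed.zip *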

ghost commented 9 years ago

Yikes, an hour is a bit longer than I was expecting. Is it using all 4 cores while doing so? I really don't want to max out the CPU on the server for more than a couple of minutes at most, so I might have to do the compression on something with a little more power that I have at my house and then upload the finished file.

Evolution0 commented 9 years ago

Yeah, an hour is bad. I had made sure it wasn't underclocking either, and did the crushing on my SSD.

As for using all cores, no, I don't think so, though it seems it's possible to make it do so: http://www.nathanboyce.com/pngcrush-script/

Should only require a little bit of modification.

ghost commented 9 years ago

Let me know what the brute timing ends up being. From what I can see, PNG compression is strictly CPU-bound once seek times are small enough, which any SSD should be able to reach.

If we end up multi-threading the compression to use 4 cores, I can cut the non-brute time down to 10 minutes or less. The cores in the server are a good bit faster than what you have, so it will be more like a 6x improvement instead of 4x.

Evolution0 commented 9 years ago

The brute build ended up being about 15 minutes longer.

ghost commented 9 years ago

I am setting some time aside in the near future to set up a dev environment, and then I will be adding compression to the build script.

ghost commented 8 years ago

Currently testing the following method:

for png in `find . -name "*.png"`;
do
    pngcrush -ow -brute -l 9 "$png"
done;

If anyone has a suggestion for making it faster that would be awesome.

ghost commented 8 years ago

Original pack: 79,474 KB

Compressed .zip after running pngcrush: 74,479 KB

Uncompressed .zip after running pngcrush: 81,056 KB

There is some merit to doing compression in the build script but it will require some tuning. Going to make some changes to reduce the base size first.

Evolution0 commented 8 years ago

Moving over to a Python-based build script using subprocess would probably speed things up.

After a bit of googling it looks like someone has already done something similar (but better): https://github.com/bitcoin/bitcoin/blob/master/contrib/devtools/optimize-pngs.py

Wouldn't take too much modification to make use of it.

StefanTT commented 8 years ago

I doubt that the shell is the bottleneck. How about parallel execution of pngcrush?

find . -name "*.png" | xargs -n 1 -P 4 pngcrush -ow -brute -l 9

This would have 4 instances of pngcrush running.

Found at http://stackoverflow.com/questions/28357997/running-programs-in-parallel-using-xargs

ghost commented 8 years ago

Interesting approach, although I was referring to final size rather than build time.

The pack has gone up in size over time, and with the best compression plus trimming of old textures the final size will now be over 70 MB. I am currently debating the merit of a restructure so that GitHub would have the textures sorted by 'modID.version' and the build script would become responsible for choosing what goes into the main pack.

This has the benefit of making it possible to automate individual mod downloads, but it requires maintaining a list of the textures that go into the main pack, which would be a matter of debate unto itself.
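
A hypothetical sketch of what that selection step could look like, assuming a flat 'modID.version' directory layout and a plain-text whitelist (called mainpack.txt here, one entry per line) kept alongside the build script:

#Hypothetical selection step: copy only whitelisted 'modID.version' directories into a build folder.
mkdir -p build
while read -r entry; do
    cp -r "$entry"/. build/
done < mainpack.txt
#Package the selected textures into the main pack.
(cd build && zip -rq ../F32-1.7.10.zip *)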