shssoichiro / oxipng

Multithreaded PNG optimizer written in Rust
MIT License
2.72k stars 114 forks

Large image error. #326

Closed BuyMyMojo closed 3 years ago

BuyMyMojo commented 3 years ago

When using the Windows version I get this error through PowerShell: [screenshot: powershell_MvSp2kpYFI]

Using it through the Linux subsystem gives me this error: [screenshot: wsl_h3mXFlChZO]

TPS commented 3 years ago

Is it possible to post the PNG itself?

BuyMyMojo commented 3 years ago

Yeah I can in a sec

BuyMyMojo commented 3 years ago

sorry for the wait https://drive.google.com/file/d/14w8h6Z3Ri9kx5WUkKetujwVPZnERkb1N/view?usp=sharing

TPS commented 3 years ago

@BuyMyMojo Seriously, 1 GB PNG?! Perhaps 7Z/Zip before uploading anyplace. Let me see if any of my other tools can handle it… whenever I get it downloaded.

BuyMyMojo commented 3 years ago

I tried using PAQ on the PNG at some point and that only got it down to 870 MB. The PNG itself uses the maximum (100%) compression setting straight out of Blender. This isn't even the largest PNG I have ;)

sorry for the large size either way

BuyMyMojo commented 3 years ago

[screenshot: explorer_RPbR6KpDyp]

PAQ ultra vs 7Z ultra, not enough file saving lol

TPS commented 3 years ago

I got & am working through it. Part of the size issues seem to be related to its 16-bit color channels (so 48-bit PNG), which I don't know that OxiPNG can handle anyway.

Also, how many unique colors does it have? I'm only on a 24-bit display, but my installation of IrfanView counts 50,894. (So, is the image functionally using 16-bit color?) Also, TweakPNG can load it, but can't really manipulate it. All to say, I'm not shocked that OxiPNG dies.

BuyMyMojo commented 3 years ago

Yeah, it's a pretty oversized PNG. It's exported straight from Blender at 16 bits per colour, so I doubt there's much of a cap on the unique colours

shssoichiro commented 3 years ago

16-bit should be supported by oxipng; however, the issue seems to be that too much memory is required, and the executable aborts when it attempts to allocate such a large amount. However... I would think allocating 1.6 GB (what the error message shows) would not be a problem as long as the system has enough memory. I'm testing on a system with plenty of memory and still receiving an error.
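For scale, the failed allocation is in the ballpark of a decoded 16-bit-per-channel buffer. A back-of-the-envelope sketch (the dimensions below are made-up illustrations; the issue never states the real ones):

```shell
# Back-of-the-envelope decoded-buffer size for a 48-bit (16-bit/channel,
# RGB) PNG. WIDTH/HEIGHT are hypothetical -- the issue never states them.
WIDTH=16000
HEIGHT=16000
BYTES_PER_PIXEL=6    # 3 channels x 2 bytes each at 16-bit depth
RAW=$((WIDTH * HEIGHT * BYTES_PER_PIXEL))
echo "decoded buffer: $RAW bytes (~$((RAW / 1024 / 1024)) MiB)"
```

At 16 bits per channel the raw buffer is twice what the same canvas would need at 8 bits, which is part of why a ~1 GB file balloons well past that once decoded.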

BuyMyMojo commented 3 years ago

I'm running 32 GB myself. At first I thought it was like a 32-bit executable error where the RAM is limited to 2 GB, but 1.6 ≠ 2 lol

BuyMyMojo commented 3 years ago

Oh yeah, there is another error that occurs on rare occasions with this file:

[screenshot]

BuyMyMojo commented 3 years ago

any ideas on how my totally necessary images break oxipng?

TPS commented 3 years ago

@BuyMyMojo Sorry, meant to report back yesterday. The level 9 toolchain for 64-bit FileOptimizer reduced this to 24-bit 262,663,070 bytes (which beats PAQ above & default OxiPNG below), but the original uncompressed image is roughly 2.3 GiB, which might well hit various 32-bit system limits, especially as OxiPNG might have to hold all that & the current working image in memory @ once.

BuyMyMojo commented 3 years ago

Oh I see. So if I'm understanding this right, it should work, but since the uncompressed size is larger than 2 GB, some 32-bit part of OxiPNG breaks?

TPS commented 3 years ago

@BuyMyMojo That's an educated guess, since I don't have a deep understanding of a Win32 Rust exe. (I've been on 64-bit for around a decade.) @shssoichiro would probably be best equipped to answer on that basis. Also, a run on Win64 w/ sufficient memory would be instructive.
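The guess is easy to sanity-check with arithmetic: a default Win32 process gets roughly 2 GiB of user address space, so a ~2.3 GiB working set can't fit no matter how much RAM is installed. A minimal check, assuming TPS's uncompressed-size estimate:

```shell
# ~2.3 GiB uncompressed (TPS's estimate above) vs. the ~2 GiB user
# address space of a default 32-bit Windows process.
NEEDED=2469606195                    # 2.3 * 1024^3, rounded down
LIMIT=$((2 * 1024 * 1024 * 1024))    # 2 GiB
if [ "$NEEDED" -gt "$LIMIT" ]; then
    echo "exceeds the 32-bit address space"
fi
```

This ignores the large-address-aware flag and other Win32 details, but it shows why installed RAM alone (32 GB here) doesn't help a 32-bit binary.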

BuyMyMojo commented 3 years ago

When I tried running it, I did it on Win64 with 32 GB of RAM. If you need me to run any builds to help test, just give me a shout

TPS commented 3 years ago

Indeed, the Windows release that @shssoichiro provides (i686) seems to be 32-bit, but #314 asks for an x64 binary.

BuyMyMojo commented 3 years ago

So I went back and looked through the releases, and noticed 3.0.0 has an x86_64 release, while 3.0.1 (the one I was using) only had the i686 release. So I downloaded the x86_64 3.0.0 version, ran it on the original image I linked here, and it worked! Might just need an updated executable. [screenshot: powershell_8As8VXIE6w]

BuyMyMojo commented 3 years ago

The 64-bit executable is just one update behind, I guess. (Also, I just renamed the executable to oxipng64 for testing, so ignore that in the screenshot.)

TPS commented 3 years ago

Well, that seems to solve that, but it also seems to show OxiPNG isn't particularly great @ the compression, @ least w/ the defaults, as the image is brought down only to ~800MiB.

BuyMyMojo commented 3 years ago

I was able to knock a massive 2.8% more off using -Z afterwards ;). The image is particularly noisy already, so it's probably not the easiest to compress anyway; can't really blame it too much
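For anyone replaying this, the two passes described above look roughly like the following ("render.png" is a placeholder name; the helper echoes each command and only executes it if oxipng is actually installed):

```shell
# Sketch of the two-pass run from the thread; "render.png" is a stand-in.
run() { echo "+ $*"; if command -v "$1" >/dev/null 2>&1; then "$@"; fi; }
run oxipng -o 6 render.png        # default DEFLATE trials first
run oxipng -o 6 -Z render.png     # -Z/--zopfli: much slower, a few % smaller
```

Zopfli searches the DEFLATE bit-stream far more exhaustively than the default compressor, which is presumably where the extra ~2.8% came from.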

TPS commented 3 years ago

I'd mentioned above the level 9 FileOptimizer toolchain losslessly brought it down to < ⅓ of that.

BuyMyMojo commented 3 years ago

How do you use the level 9 tool chain?

TPS commented 3 years ago

N.B.: A level 9 FO run requires a serious time commitment on a file like this, often days of run-time.
  1. Download latest FileOptimizer, install, & run.

  2. Open Options from ⚙️ button in toolbar. 1st dropdown in dialog that pops-up is Optimization level, which I set to 9: Best. If you wish, take a look at any of the other settings, but you shouldn't need to. Particularly, under the PNG tab, Allow lossy optimizations (by default) should be off/unchecked. Once done, OK out of Options.

  3. Add file(s) however you choose — dragging files/folders or picking from in-app file/folder dialogs. Then hit ▶️ button in toolbar & away you go.

Also, after run is completed, Shift-F5 (to refresh/reset run info) & rerun (repeating until zero change) can often result in additional savings, but at equal time spent on original iteration.
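The rerun-until-zero-change advice generalizes to any lossless optimizer. A hedged sketch of that loop, with `optimize` as a stand-in for whatever tool or chain you use and `big.png` as a placeholder path:

```shell
# Keep re-optimizing until the file size stops shrinking.
optimize() { :; }   # placeholder; swap in e.g.: oxipng -o 6 "$1"
f=./big.png         # placeholder path
measure() { if [ -f "$1" ]; then wc -c < "$1"; else echo 0; fi; }
prev=-1
size=$(measure "$f")
while [ "$size" -ne "$prev" ]; do
    optimize "$f"
    prev=$size
    size=$(measure "$f")
done
echo "stable at $size bytes"
```

Each pass starts from the previous pass's output, so a later iteration can occasionally find savings the first one missed; the loop stops as soon as a pass changes nothing.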

BuyMyMojo commented 3 years ago

I'll make sure to give this a try

BuyMyMojo commented 3 years ago

It uses OptiPNG instead of oxipng, so it's super slow :'( Still a kinda cool chunk of software

TPS commented 3 years ago

Last I saw, FO's dev was waiting on a few refinements (APNG support, &c) before replacing OptiPNG w/ OxiPNG.

BuyMyMojo commented 3 years ago

I think they are just waiting for apng support?

TPS commented 3 years ago

@BuyMyMojo How's your FO experience?

BuyMyMojo commented 3 years ago

I've been using FO for a few other files now and it's been good, but I haven't run it on the main file that started this because of some time constraints. I should be able to run it soon, though, and will get back to you about it. That said, with the PNGs I've compressed, the main reduction in file size came from OptiPNG anyway, so it's been more useful to just run oxipng for time's sake

TPS commented 3 years ago

I think you're generally correct on most PNGs (in that 1 tool of the chain will end up making the largest savings on a given file — though which tool varies based on any number of reasons), but your Blender PNGs seem not best served by O*iPNG solely, if your given example file is in any way representative. There's quite a lot of difference between >800 & ~250 MB.

BuyMyMojo commented 3 years ago

Update: started it an hour ago, and at 9/16 OptiPNG seems to be the only thing that's benefited the PNG. Will update more as it happens. Also, I really liked "O*iPNG", that was funny

BuyMyMojo commented 3 years ago

Update (11/16 done): using FO, 180 MB from OptiPNG and 20 MB from whatever step 9 is, I think; current size: 801 MB. Using just OxiPNG: [screenshot: 767.png]

On another note, this is cool information: [screenshot: FileOptimizer64_55OtNhqz3i]

BuyMyMojo commented 3 years ago

Yikes, the total time it took to process was around 21 hours. It definitely made a difference, now being only 265 MB, but 21 hours is crazy. I can see this being useful in some scenarios, but definitely wild waiting times; might be worth the cost of running it on an AWS or Google Cloud server

BuyMyMojo commented 3 years ago

Optimizer base: 265 MB
ZPAQ: 262 MB
PAQ8O: 262 MB
7Z (FLZMA2, 16 cores): 600 byte saving
7Z (FLZMA2, 1 core): 4 KB saving
Zstd: 100 KB saving
BCM: 2 MB saving
QUAD: 2.4 MB saving

Some of those could be improved a bit by using FileOptimizer on the compressed file, I think

TPS commented 3 years ago

It definitely made a difference, now being only 265 MB, but 21 hours is crazy. I can see this being useful in some scenarios, but definitely wild waiting times; might be worth the cost of running it on an AWS or Google Cloud server

@BuyMyMojo If 1's able to plan ahead, FO works amazingly well in the background. Also, given sufficient system resources & until builtin parallelism is implemented, multiple FO instances (I routinely run 4 on my quad-core) will shorten overall times via splitting up file lists.

BuyMyMojo commented 3 years ago

That is also a great idea!

shssoichiro commented 3 years ago

For organization purposes, I'm going to close this as it is essentially a duplicate of #314.

BPCZ commented 3 years ago

@BuyMyMojo Sorry for continuing this closed thread after you found something that works, but seeing as you were testing ZPAQ, I figured I'd test more exotic compression too. If this is some kind of archive of your Blender work you're making, you might want to consider FLIF. I ran the 1 GB test file with FLIF at max compression settings; it took 37 minutes on my Zen 2 CPU using 1 thread and about 1 GB of RAM. The output file was 532.8 MB and is lossless. More about this at flif.info

BuyMyMojo commented 3 years ago

I have taken a look at FLIF, but this is more about shrinking it enough for sharing with others. ZPAQ is definitely odd, but it's just there to send the file; keeping it PNG means it's still usable in all software

TPS commented 3 years ago

I ran the 1 GB test file with FLIF at max compression settings; it took 37 minutes on my Zen 2 CPU using 1 thread and about 1 GB of RAM. The output file was 532.8 MB and is lossless. More about this at flif.info

@BPCZ The PNG eventually came down to 265 MB @ the expense of a lot of background time, so, for these archives, PNG is still better. 🤷🏾‍♂️

It would be amazing to see what lossless WebP/JXL could do, though.