justjoheinz opened 9 months ago
Would be useful to know if this is MacOS only or Allegro in general (guessing 16-bit pngs are not supported?)
The very same image is displayed properly on both Linux and Windows.
Alright,

https://github.com/liballeg/allegro5/blob/8575e373ab79c7bc25f8425990fc6780e75e69c8/addons/image/macosx.m#L53-L55

appears to be wrong; it assumes 8 bits per channel, so that number of samples per pixel = number of bits per pixel / 8. In the example PNG, `bitsPerPixel` is 64, i.e. 4 samples of 16 bits each, which will be incorrectly calculated as 8 samples of 8 bits each. There may be further errors downstream of this.
At the moment we're loading an `NSImage`, then extracting its first `NSImageRep`, which we hope is really an `NSBitmapImageRep`.
This class represents many different bitmap variants: planar or interleaved, premultiplied or not, with up to 5 channels of up to 32 bits each. Allegro converts this to an `ALLEGRO_BITMAP` of type `ALLEGRO_PIXEL_FORMAT_ABGR_8888_LE`, but it doesn't handle all possible variants. Doing that would be possible but probably quite tedious, especially as most variants would be rare in practice.
Options I think are:

I don't know anything about CoreGraphics, but I saw Apple's docs suggesting it should be used if image manipulation is being done.

The 'intermediate context' idea is to create a Cocoa graphics context backed by a fixed bitmap format (i.e. the equivalent of `ALLEGRO_PIXEL_FORMAT_ABGR_8888_LE`) and just draw the `NSImage` into it, letting Cocoa do all the format conversion. I think this should work, but can't say whether it would be too slow.
@SiegeLord, has there been previous discussion on this? I can't believe this issue has only surfaced in 2024!
I didn't know 16-bit PNGs even existed; I'm not surprised this hasn't come up before.
Regardless of what's done, we should at least gracefully reject formats we can't handle.
...I think maybe TIFF also supports higher bits per channel.
How about retaining the current code (should be fast) but falling back to the drawing-context option if the current code can't handle the format? At the moment, AFAICS, there's no `else` clause to catch unhandled formats, so we don't write anything into the target bitmap!
I can investigate if no-one else feels motivated to.
As a workaround - you can disable the native image addon and fall back to libpng (I don't remember if because of this specific bug, but the native addon never worked for my pngs - I just never investigated more).
I've looked into the render to context option, it works on the png file here and on an 8-bit png. All bitmap loads (bmp, jpg, ...) on MacOS will pass through that function so it's a bit scary to be changing it. I will post some code soon so others can weigh in.
I've also started on trying `CIImage`; currently it is messing up the colours (swapping channels, I guess) and I'm not sure why. (CoreImage looks pretty heavy, but the only parts we need are "image from data" and "render to bitmap".)
More later!
Have pushed some code, please have a look: https://github.com/pedro-w/allegro5/tree/issue-1531 (definitely not ready for a PR yet; in particular, I'm not really sure I understand premultiplied alpha). I think it's better than what was there before. We're still at the mercy of macOS built-in image handling, but via CoreImage rather than AppKit (both of these end up in CoreGraphics, I suppose). It would be great if someone could have a play with some usual and unusual bitmap formats.
Ironically, I was hit by this myself yesterday: I spent ages trying to debug something when the issue was that ImageMagick's montage command had created a 16-bit-per-channel PNG for me.
@pedro-w Hey, I've tried your fork under a MacOS virtual machine and I'm afraid it still outputs the 16-bit PNG image from the OP incorrectly.

(And just a small observation: the visual garbage itself is different every time I launch `ex_bitmap_file`.)
Disappointing. It worked for me. What version of MacOS are you using? The visual garbage would be different every time under the old code because it allocates the memory, then sees a format it can't handle and skips the writing stage. The result is a bitmap made up of uninitialized memory. Pete
That'd be Monterey. Let me know if I can add some debugging code to clear this thing up.
One useful thing would be to just `fwrite` the contents of the `surf` array, round about line 52. Then you can use ImageMagick to convert it to a normal image file, and we'll see whether the problem is (stage 1) loading the image data with CoreImage or (stage 2) the conversion to an Allegro bitmap.
Oh, so sorry, I forgot to check out the correct branch. With your changes actually applied, the image in question is indeed displayed correctly.
Let's not ever mention this again :wink:
Thanks very much for testing. I've also got an old MacOS (Catalina) so would be handy if someone with a newer version, especially an Apple Silicon model, could give this code a try out. @justjoheinz would you be able to try it on your M1?
I can give it a shot. I have M1/Sonoma 14.3.1
LGTM, I tried your issue-1531 branch in your repo.
OK I'll tidy it up for a PR. I'll look to add a test for it too. Only remaining issue is pre-multiplied alpha - I don't really know how to check that's working correctly.
I have a related-looking bug in my Allegro 5 game: Lix #431: Magic pink won't become transparent on macOS.
(My obvious symptom in Lix, the failing conversion from magic pink to transparency, is not the bug itself. Other colors are also slightly wrong. I recolor in-game UI according to pixels from PNGs, and this UI recoloring fails, too.)
@Rhys-T has reduced my bug to the interaction between Allegro 5 and Cocoa (Mac's native image loading API). Our findings so far agree with @pedro-w's findings:
- Passing `-DWANT_NATIVE_IMAGE_LOADER=off` to CMake during Allegro 5's configuration before building works around the bug.
- `#FF00FF` from file loads as `#FB00FF`.

How is the PR going? Do you need more help with some of the formats?
Can Allegro 5 tell Cocoa to always offer the loaded file as 32-bit, so that Allegro 5's assumption (always 8 bits per channel) becomes correct?
Should Allegro 5 disable the native Mac image loaders by default? This would fix the bug in future commonly packaged Allegro 5 binaries and make Allegro 5's codebase simpler at the same time.
I don't have a Mac to test myself.
> The bug affects palette-indexed PNGs and 24-bit PNGs

Slight correction: as far as I can tell, it only affects palette-indexed PNGs. I was just saying "don't convert everything to 24-bit, because that loses information from some images that are currently working as-is".
I'm not 100% sure what the correct behavior is here. Converting the colors arguably might be more correct in some ways: as far as I can tell, Allegro is exclusively working with colors in the display's native RGB color space, not the sRGB that the images were in, so leaving the colors numerically the same actually ends up changing them visually. But on the other hand, pre-converting the colors means that things like 'magic pink' (#FF00FF) no longer get recognized by `al_convert_mask_to_alpha` and the like…
In any case, it definitely needs to pay attention to what kind of data the `NSBitmapImageRep` is actually giving it.
(i.e., they're not completely garbled like the 16-bit PNG here)
There seem to be two semi-related problems with how the macOS native bitmap loader handles the NSBitmapImageRep it gets back:
Summary of the results of different loader/color format/color space combinations as far as I understand at this point:
Loader | PNG color space and format | Result
---|---|---
libpng | any format(?) | ✅ sRGB colors read correctly, and are then gamma-adjusted. ✅ Magic pink survives the gamma process because it's made entirely of `FF`s and `00`s, and is recognized. Same with black. ⚠️ Other special colors aren't so lucky, if Allegro is configured to use a different gamma value than the image. ✅ But they're fine with Allegro's default gamma value of 2.2. ⚠️ Color space/profile is ignored, and colors are reinterpreted as native RGB, and render slightly wrong. (Not sure yet whether the gamma part is right or not.)
NSImage | sRGB; 24-bit, 32-bit, other 8-bit-per-channel formats / sRGB; indexed with 32-bit palette | ✅ sRGB colors read correctly and are returned as-is. ✅ Magic pink and other special colors are recognized. ⚠️ Colors are then reinterpreted as native RGB without taking color profiles/spaces into account, and render slightly wrong.
NSImage | sRGB; 16-bit (4 bits per channel) | ❌ Can't read at all. Returns uninitialized memory.
NSImage | sRGB; indexed with 24-bit palette | ✅ sRGB colors read correctly and are converted to native RGB. ❌ Magic pink and other special colors are no longer at their sRGB values and are not recognized. ⚠️ Color profiles/spaces are still ignored, but the colors are already in the native color space and render correctly.
CIImage (from https://github.com/liballeg/allegro5/issues/1531#issuecomment-1950198051) | sRGB; 24-bit, 32-bit, other 8-bit-per-channel formats / sRGB; indexed with 32-bit palette | ✅ sRGB colors read correctly, and are then gamma-adjusted. ✅ Magic pink survives the gamma process because it's made entirely of `FF`s and `00`s, and is recognized. Same with black. ❌ Other special colors aren't so lucky, and don't get recognized.
CIImage (from https://github.com/liballeg/allegro5/issues/1531#issuecomment-1950198051) | sRGB; 16-bit (4 bits per channel) | ❓ Haven't tried this combo yet.
I've figured out why I couldn't get @pedro-w's CIImage-based loader to build under Nix: Nix defaults to targeting a minimum macOS version of 10.12, and CIContextOption first appeared in 10.14. I worked around that and started testing it with Lix. (I'm applying that loader as a patch to the same version of Allegro I've been building Lix against, rather than using that branch as-is, to minimize the number of changes that might affect what I'm seeing. However, it seemed to look the same when I used the actual branch.)
Magic pink seems to work correctly with the CIImage loader, but normal colors all look significantly darker than in either of the current loaders, and UI icons don't get remapped from grayscale to the proper light-blue colors.[^lixrecol] I suspect this is a gamma-related issue: magic pink consists of only `00`s and `FF`s, which both map to themselves under any gamma value, so it just happens to survive the process. The libpng loader looks identical if I `export SCREEN_GAMMA=1.0` before running the game (the default is 2.2).
I'll update the table above with this info.
[^lixrecol]: The lix sprite colors do get remapped properly in multiplayer, which confused me for a bit, but then I remembered that the 'from' and 'to' colors for that mapping are being loaded from lixrecol.png through the same loader as everything else, rather than being hardcoded in the program.
This is great work. Sorry I've not kept up with this thread.
It seems to me that it would be better to go for the libpng loader, because then any improvements we make would be cross-platform (ditto any idiosyncrasies that still occur).
Not sure about the color space thing though - is there a library to convert color spaces, or is it acceptable for us to say in the docs that only sRGB files will be reproduced faithfully?
Also if gamma can be set via the config file, does this mean that programs that rely on image pixels being a certain RGB value can be broken? (e.g. I think I remember an RPG-type game where the character sprites were recolored to give them different clothes)
One question I have is: what color space is Allegro supposed to be working in, or in other words, what space is an `ALLEGRO_COLOR` treated as? I can't seem to find this documented anywhere. Right now it seems like it's passed through as the screen's native RGB space (at least on macOS; I haven't had a chance to test it on any other platforms yet). That lets you access the full gamut of whatever screen you're on, instead of being limited to what sRGB covers, but it would also require an extra conversion step before trying to display anything, if you want it to look consistent.

It seems like the ideal setup would be to separate loading from converting to native: first load the pixel values as-is, without any conversion or gamma adjustment or anything, so the game can recognize any special colors it needs to. Then either have a function to convert the final bitmap to native explicitly before drawing, or make it so that an `ALLEGRO_DISPLAY` has a concept of what color space the program is using and can convert that to native whenever it draws something. But either of those options seems like it would involve (potentially breaking) API changes. (Possibly related: #463)
I've never actually developed anything using Allegro myself, nor done much of anything that needed to care about color spaces, so I can't really offer any insight as to what approach would be best from a developer perspective.
> Also if gamma can be set via the config file, does this mean that programs that rely on image pixels being a certain RGB value can be broken? (e.g. I think I remember an RPG-type game where the character sprites were recolored to give them different clothes)
Yes: if I set `SCREEN_GAMMA` (or its `.allegro5rc` equivalent) to something other than 2.2, Lix fails to identify certain colors. For example, look at the in-game screenshots above: the buttons in the lower-right corner are supposed to be in shades of blue, but with the darker gamma, they're coming out in grays like in the actual image (except where a gray happens to land on another recognized shade and gets mapped to a blue).

(Lix does have a few other special colors it looks for, but for various reasons, the gamma stuff in particular doesn't affect most of them. Black and magic pink are both used as 'transparent' in various places, but they're at the very ends of the gamma curve and don't get adjusted, so they work. In multiplayer, the various colors in the lix sprites get remapped to shades of each player's color, closer to your RPG example, and those colors do get adjusted; but the colors that the game is looking for are taken from the first row of lixrecol.png, so they've also been adjusted, and still match what's in the sprites. The only other special color I know of that might be getting affected is `#505050`, which the game uses to find a lix's eye to correctly position the fuse for an exploder in multiplayer. But if that one's failing, it's subtle enough that I'm not seeing it.)
I use Allegro 5.2.9.1 (built by brew) on an M1 Mac. libpng is 1.6.42.
I have a file (attached) which, according to `file`, is:
Using https://github.com/liballeg/allegro5/blob/master/examples/ex_bitmap_file.c to render this image, the output is garbled.
If I do
the output is ok