vimalaguti opened 11 months ago
I have seen this gain map within a motion photo produced on a pixel 8, but I'm not sure what could be done with it. My understanding is that it allows rendering either in HDR or SDR (whether you apply the gain map or not, respectively), depending on the device capabilities.
By requesting support for it, do you mean that Aves should check device capabilities and apply the map if HDR is supported by the device display?
Yes. Currently, if I take a photo with my Pixel 7, the Google Photos app (and also Chrome) applies the gain map; if I open it with any other gallery app, the screen doesn't change brightness. I've only heard this from the news and haven't checked the Android APIs, but reading this format is supposedly already integrated into Android 14. So yes, I would expect that if the display is capable, Aves should apply the gain map.
Could you please provide a sample image?
Also, I'm not sure it makes sense, but could you take a screenshot of the image as it appears for you in Google Photos, and as it appears in Aves?
:) A screenshot doesn't make sense, as when I take one the screen loses its brightness.
Here's a link about HDR with both AVIF and JPG sample images: https://gregbenzphotography.com/hdr/
Here's a photo I just took.
If you have an OLED device, the difference with and without the gain map should be quite striking.
Thanks. I don't have any device with HDR support, or Android 14 for that matter, so I can't test the rendering but I'll take a look at how it works.
Edit:
Thank you! As a last thing, I took a photo of the screen: the top is a screen that does not apply the gain map, the bottom is an OLED with the Google Photos app and HDR support. The picture is extremely bad, but the difference is clear.
For starters, I'll implement detection (and extraction) of HDR gain maps, so at least it will be easier to spot HDR images in the collection and warn users when metadata removal will de-HDR the image.
About rendering, are you willing to help me test builds? I'm not sure of the framework/codec capabilities, so I'll need confirmation from someone with an HDR capable device.
Here's a sample image that declares a gain map, but does not declare it in the XMP GContainer namespace as your sample does:
On your device, does this image appear differently in Google Photos and in Aves?
Yes, the screen changes brightness and the picture is recognized as Ultra HDR.
About testing, I guess I can try, even though I'm not an Android dev.
Thanks for following up.
This issue is actually an interesting can of worms: this Ultra HDR format introduces some XMP metadata, but that metadata is redundant with the HDR gain map metadata defined in the MPF (Multi-Picture Format) extension of JPEG. MPF is a way to embed multiple pictures inside a JPEG (and I had never heard of it before). It can be used to embed the HDR gain map, but it can also be used to embed multiple variants of an image for a stereoscopic effect, parts of a panorama, or thumbnails.
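For anyone curious, MPF data lives in a JPEG APP2 segment whose payload starts with the ASCII identifier "MPF\0". A rough sketch (in Kotlin; the function name is illustrative, not Aves's actual code) of locating that segment by walking the JPEG markers:

```kotlin
// Sketch: locate the MPF (Multi-Picture Format) APP2 segment in a JPEG.
// Assumes `bytes` holds the whole file; the helper name is illustrative only.
fun findMpfSegment(bytes: ByteArray): Int {
    var i = 2 // skip the SOI marker (0xFFD8)
    while (i + 4 <= bytes.size && bytes[i] == 0xFF.toByte()) {
        val marker = bytes[i + 1].toInt() and 0xFF
        // the segment length field includes its own 2 bytes
        val len = ((bytes[i + 2].toInt() and 0xFF) shl 8) or (bytes[i + 3].toInt() and 0xFF)
        if (marker == 0xE2 && i + 8 <= bytes.size) { // APP2
            val id = bytes.copyOfRange(i + 4, i + 8)
            if (id.contentEquals(byteArrayOf(0x4D, 0x50, 0x46, 0x00))) { // "MPF\0"
                return i // offset of the MPF segment
            }
        }
        if (marker == 0xDA) break // start of scan: no more metadata segments
        i += 2 + len
    }
    return -1 // not found
}
```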
I'm now implementing detection and display of this MPF bit, and I'm discovering that my Sony camera was embedding large thumbnails in its pictures using MPF. What a surprise!
But I also need to revisit the way motion photos were modelled in Aves, so I need to tackle these other sub-issues before I'll come back to you to try out whether HDR rendering is possible.
@vimalaguti could you please take and share an HDR photo in portrait mode? Anything like the one you provided here is fine, but in portrait. I need to check how dependent images apply the rotation metadata.
Here it is:
@vimalaguti could you please install this test build: https://drive.google.com/file/d/1Sp87yc9aid8R9BM-fQqCWdR8Qn-_vrCj/view?usp=sharing
It will install as a separate app "Aves [Profile]", distinct from the regular Aves.
After your HDR items are scanned and detected as HDR (with the icon overlay on the thumbnail), could you please open them in the viewer and let me know: 1) whether your device display switches to the HDR brightness/color mode, 2) whether the image appears as you expect when the image fits in the viewer, 3) whether the image appears as you expect when the image is zoomed in as much as you can, 4) whether your device display switches back to normal mode when it should.
Bonus points if you also try with a motion photo in HDR!
Hello. I tried the test build on my Pixel 7. I can tell that the phone is detecting the presence of a multi-picture format image on screen when I open an Ultra HDR image in the viewer. But it's not applying the intended luminance changes. Instead, it's applying a boost to red saturation, deviating from the original. An Android screen recording is unable to capture this change.
To switch the aforementioned effect on and off, I can either tap the image thumbnail to expand it and then close it, or swipe up on the system navigation bar to bring the app into the recent apps picker, where it also stops being applied. This behavior is consistent with how Android 14 handles HDR images in other apps, Google Photos included. The navigation bar method works the same on a zoomed-in crop of the photo, which is also expected.
However, I also tried a motion photo for an image with a gain map, and noticed that the same reddening happens there too. The original Google Photos implementation does not apply the gain map when you go from viewing the still image to viewing the motion photo (however, it also doesn't leave the "HDR image detected" state when switching to the motion photo; perhaps it is overlaid on top of the still image). I think it's ideal for gain maps not to be applied there, since they're almost pixel-precise luminance boosters, not meant for moving subjects.
I hope you'll find this helpful. I respect that you're not only working to add support for a new format, but doing so even without having the means to test the results yourself.
Hi, I'm not able to detect the red shift, but it's definitely not applying the gain map. I can see the gain map correctly displayed in the MPF #2 tag.
Not sure what you mean by "switches". Something like a button? When the image is in full screen with a black background - so when HDR is supposed to kick in - nothing happens.
And yes, images are correctly tagged as HDR in the thumbnail gallery preview.
Thanks for the feedback.
About "switching", my understanding is that the app that wants to properly display an HDR photo should request the device display to switch to the HDR color mode. Switching to it is not automatic, and switching back to the default color mode is not automatic either.
In addition to the color mode switch, the HDR gain map should be interpreted, but I'm not sure whether the JPEG codec (on modern devices) can handle it by itself automatically, or whether the app should decode and interpret the gain map manually.
From your feedback, it seems the color mode switch works, but the gain map interpretation does not. Maybe because of how Aves decodes regions instead of the whole image, or the Flutter intermediate layer used to access the JPEG codec, or maybe it's really up to the app to do all the gain map interpretation work...
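For reference, here is a minimal sketch of the native path on Android 14 (API 34), under my assumption that the platform decoder handles the gain map by itself and the app only has to opt its window into HDR color mode:

```kotlin
import android.app.Activity
import android.content.pm.ActivityInfo
import android.graphics.ImageDecoder
import java.io.File

// Sketch, not Aves code: the two steps an Android app seems to need for
// Ultra HDR display on API 34+.
fun showUltraHdr(activity: Activity, file: File) {
    // 1) ask the display to switch to HDR color mode (this is not automatic,
    //    and neither is switching back when leaving the viewer)
    activity.window.colorMode = ActivityInfo.COLOR_MODE_HDR

    // 2) decode with the platform codec; on API 34 the gain map is attached
    //    to the Bitmap and applied by the hardware renderer automatically
    val bitmap = ImageDecoder.decodeBitmap(ImageDecoder.createSource(file))
    if (bitmap.hasGainmap()) {
        // hand the bitmap to an ImageView / hardware-accelerated canvas
    }
}
```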
Here's an Android sample app (code from google, but built by me): https://drive.google.com/file/d/1LexIAhCqhWyrQ9FvLv4LZ307xYLvCJT1/view?usp=sharing
Could you please install the sample app, open it, search for the "Displaying UltraHDR" sample, and let me know whether it looks good as HDR?
And here is a test app in Flutter: https://drive.google.com/file/d/1NQWA8-awOTkaSGMTYn_S-fGNUpm-5KSu/view?usp=sharing
It's loading one of the sample HDR photos and switching the display color mode in the simplest way possible, to see whether the codec used by Flutter is able to decode the gain map. Could you please let me know whether it appears as it should, like it does in the Android sample app above?
Sure, I'll try them. I've also found this app that is able to display HDR images the correct way - though sometimes their transition doesn't work. Maybe you can take a look at what they do.
The Android sample app works perfectly, in all HDR widgets.
The Flutter app, however, doesn't seem to work: it shows a label "switched to HDR display mode" and probably really does switch, as I can see the overall brightness lower to increase the contrast, but no pixel gets brighter.
Here's an update to the flutter_hdr app: https://drive.google.com/file/d/18EQuaW9PYTexjBt5ivx1PrXYr5VfgXPK/view?usp=sharing
In this one, there are 3 images:
- asset: loaded the same way as in the previous version (so it should fail the same way)
- uri-flutter: loaded differently, but I expect it to be wrong like asset
- uri-glide: loaded and decoded differently, but I hope this one will work like the Android sample app.
Please let me know how it goes.
Nope, none of the three works
Damned, that's disappointing.
What I see from the Android sample app and Les Pas is that Android apps don't need to decode the gain map themselves; it's handled automatically by the native bitmap decoder. But it seems that something is lost when it's decoded/displayed in Flutter. I remember that Flutter had some limitations with wide gamut display; maybe this is similar. I'll investigate.
Anyway, thanks for testing and reporting!
I am once again seeing an increase in orange saturation when the Flutter app is in focus, and no luminance change. The platform sample app works as expected.
I could be wrong, but I now believe that it's a limitation of Flutter.
There is a related open issue for tackling this in the next-gen rendering engine in Flutter: https://github.com/flutter/flutter/issues/127852
If you're interested in HDR, you could upvote that Flutter issue (by adding a thumb up).
I've not given up just yet.
Here's an updated flutter_hdr: https://drive.google.com/file/d/1cVlnnLEmCP0PcjbgvEuYu8Hmd3-_aQwL/view?usp=sharing
Does mem-jpg or mem-bmp work?
Unfortunately not. Gain maps can technically affect color, and this might be obvious without my pointing it out, but maybe your implementation is affecting chroma instead of luma for some reason? It's only a subtle transformation, and it only seems to affect areas that were already warm, like those lights. That explanation would make sense if the gain map is supposed to increase the luminance intensity by a percentage rather than add a flat amount, hence the white window not being affected.
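For context, the Ultra HDR spec does define the gain as a multiplicative boost interpolated in log2 space, which would explain the percentage-like behavior described above. A sketch of the per-pixel recovery, per my reading of the spec (names are illustrative; min/max content boost come from the gain map metadata):

```kotlin
import kotlin.math.ln
import kotlin.math.pow

// Sketch of how an Ultra HDR gain map boosts a pixel: the recovery is
// multiplicative, so with the typical min_content_boost of 1.0, pixels
// with a gain value of 0 (like a plain white window) are left untouched.
fun applyGain(
    sdrLinear: Double, // linear SDR luminance of the pixel, in [0, 1]
    gain: Double,      // gain map sample for this pixel, in [0, 1]
    minBoost: Double,  // min_content_boost from the metadata
    maxBoost: Double,  // max_content_boost from the metadata
): Double {
    val log2 = { x: Double -> ln(x) / ln(2.0) }
    // interpolate in log2 space between the min and max boost
    val logRecovery = (1 - gain) * log2(minBoost) + gain * log2(maxBoost)
    return sdrLinear * 2.0.pow(logRecovery)
}
```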
I am trying to rely on native decoding, so I am not decoding the gain map myself. In the native app it is really trivial. In the Flutter app, I'm trying to use the native capabilities to decode the image in the Android world (where the gain map is handled automatically) and then transfer the decoded bytes to the Flutter world for display. Apparently this doesn't work, but it might be due to a limitation in Flutter rendering.
I don't know if it's related, but HDR videos from the Pixel 8 Pro look super bright - to the point that it looks very bad - in Aves, compared to the Google Photos app where they look normal.
Example video if it helps looking at the metadata or something: https://nc.djf.lol/s/SYnoFxLKRb2tPbF
Thanks for letting me know, and for the sample.
As a side note, I'd like to identify HDR videos in Aves (just like it identifies HDR images), but I'm not sure of the criteria. It seems to be videos with a 10-bit depth and the color space "smpte170m" or "bt2020nc", but I don't have the expertise. If anybody is knowledgeable, please let me know.
As for rendering HDR videos, there could be limitations from 1) the player, based on FFmpeg, or 2) the rendering framework, Flutter.
For Flutter, I encourage people to upvote https://github.com/flutter/flutter/issues/127852.
I believe the player uses a recent version of FFmpeg that is capable of decoding HDR, but I have no way to test this myself as I don't have HDR capable devices.
By the way, I'm happy with my Samsung from 2019, but if someone wants to send me one of these fancy new Pixel phones, I won't say no :D
No worries: https://video.stackexchange.com/questions/22059/how-to-identify-hdr-video
It seems to me that Color Transfer is the key one for identifying HDR - SMPTE 2084 or ARIB STD-B67. (The sample video I posted has the second one.)
Indeed, if you are looking to accurately identify HDR videos, even beyond those shot on smartphones, bit depth and color primaries are not foolproof factors, but the transfer function is.
SMPTE 2084 is commonly known as PQ, and ARIB STD-B67 as HLG. HLG is used by most cameras shooting delivery-ready HDR, whereas PQ is a common choice for video that has undergone mastering.
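Based on the transfer-function criterion above, here is a sketch of detecting HDR video tracks with Android's MediaExtractor (an illustration assuming API 24+, not Aves's actual code):

```kotlin
import android.media.MediaExtractor
import android.media.MediaFormat

// Sketch: classify a video as HDR by its transfer function.
// The constants map to the values discussed above:
// COLOR_TRANSFER_ST2084 = PQ, COLOR_TRANSFER_HLG = ARIB STD-B67.
fun isHdrVideo(path: String): Boolean {
    val extractor = MediaExtractor()
    try {
        extractor.setDataSource(path)
        for (i in 0 until extractor.trackCount) {
            val format = extractor.getTrackFormat(i)
            val mime = format.getString(MediaFormat.KEY_MIME) ?: continue
            if (!mime.startsWith("video/")) continue
            if (!format.containsKey(MediaFormat.KEY_COLOR_TRANSFER)) continue
            when (format.getInteger(MediaFormat.KEY_COLOR_TRANSFER)) {
                MediaFormat.COLOR_TRANSFER_ST2084, // PQ
                MediaFormat.COLOR_TRANSFER_HLG -> return true
            }
        }
    } finally {
        extractor.release()
    }
    return false
}
```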
@davidfrickert for future comments about HDR video:
Hi, is there any plan to support the Ultra HDR JPEG gain map?
On Pixel phones, photos are now saved with this gain map, and it should be supported by all Android 14 devices, if I'm not wrong.