darktable-org / darktable

darktable is an open source photography workflow application and raw developer
https://www.darktable.org
GNU General Public License v3.0

darktable 3.6.1 - JPG export bug: output becomes broken, depending on image size #10283

Closed mzannoni closed 1 year ago

mzannoni commented 3 years ago

Describe the bug/issue I've found that the export module produces extremely inconsistent results when exporting JPGs. As you can see in the attached pictures below, simply changing the "set size in pixel" values in the export module generates very erratic (and unusable) results. What makes it worse is that the output at full resolution is also broken.

So far, I have obtained an (apparently) decent JPG only by setting the pixel size to 3840x3840. Full-resolution and HD (1920x1920) settings give very bad results.

Disabling OpenCL (in "preferences -> gpu/cpu/memory") doesn't solve the issue; it just produces differently damaged results.

To Reproduce

  1. Set different values in the "set size in pixel" fields in the export module in the light table
  2. Depending on the values, the result may come out "broken"

Expected behavior Only the resolution should differ (a resized image); otherwise the output should look consistent.

Screenshots This one looks reasonable (size set to 3840x3840) Trentino_estate_2021_0459_dt_4K

But at full resolution it looks terrible: (adding link, file too big) Trentino_estate_2021_0459_dt.jpg

Setting to 1920x1920 also looks terrible: Trentino_estate_2021_0459_dt_HD_web

Same as previous, but with OpenCL disabled: Trentino_estate_2021_0459_dt_HD_web_01


Additional context The raw file is this. The XMP file (identical development settings for all the JPGs above; the only differences are the pixel size in the export module, plus OpenCL disabled in the last one).

EDIT: forgot to mention, there is also a discussion going on here (apparently I'm not the only one affected; one user reported it for version 3.7 as well).

mzannoni commented 3 years ago

I've been doing some trials. The issue is also present if I export to TIFF (32-bit float).

Important finding: if I disable "high quality resampling", the export at HD size (1920x1920) apparently comes out fine. However, this does not fix exporting at full resolution. (I've tried this for both JPG and TIFF.)

mzannoni commented 3 years ago

Additional finding: exactly the same seems to happen if, instead of "image settings" for the output color profile (sRGB) in the export module, I select linear ProPhoto RGB and export to TIFF 32-bit float (opening it in GIMP or Krita shows exactly the same look as the JPG in the first post).

Edit: exactly the same happens if I set linear ProPhoto RGB as the output color profile in the darkroom and leave "image settings" in the export module in the lighttable.

kofa73 commented 3 years ago

Reproduced (on Windows, darktable 3.6.1). Disabling haze removal seems to avoid the issue. Note: in the darkroom, the image is fine with haze removal on, even if zoomed to 100%.

mzannoni commented 3 years ago

Also in the lighttable, the preview (key 'w') looks fine.

I also confirm that disabling haze removal avoids the issue.

I've just found out that using "set size by scale" in the export module with a value of 0.99 works fine (independent of the high-quality resampling setting), while setting the value to 1 produces a bad result. (This is currently my workaround.)

kofa73 commented 3 years ago

With a minimal stack (I duplicated the image as 'original' and disabled filmic), an exposure correction of about -2 EV is needed to avoid blown highlights. However, with haze removal turned on, the sky gets a huge exposure boost, and -12 EV is needed to avoid clipping. Maybe this has something to do with the issue.
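
For context: dehazing algorithms in the dark-channel-prior family invert a model of the form observed = radiance * t + A * (1 - t), where A is the global hazy background light and t the local transmission. If darktable's haze removal works along these lines, a misestimated A or a near-zero t directly blows up the recovered brightness. A minimal sketch of the recovery step (illustrative names only, not darktable's actual code):

  #include <math.h>

  // Minimal sketch of a dark-channel-prior style recovery step.
  // NOT darktable's implementation; all names are illustrative.
  // Model: observed = radiance * t + A * (1 - t), hence
  //        radiance = (observed - A) / max(t, t_min) + A.
  static float dehaze_pixel(float observed, float A, float t, float t_min)
  {
    const float tt = fmaxf(t, t_min); // clamp transmission to avoid dividing by ~0
    return (observed - A) / tt + A;
  }

If A is estimated from the wrong region, or t is pushed towards its lower clamp, (observed - A) / t explodes, which would be consistent with the huge exposure boost in the sky.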

kofa73 commented 3 years ago

With master (3.7.0+1343~g24e5e7621): thumbnails of the full-size export, followed by 2000x2000 with high-quality resampling, and finally without the option selected. [screenshot]

Almijisti commented 3 years ago

I'm having a related issue when exporting to JPG. After about five or six exports, any additional exports come out totally black and only about 220 kB in size. This seems to be a resource issue, as closing and reopening darktable resolves it, at least for the next five or six exports.

github-actions[bot] commented 2 years ago

This issue did not get any activity in the past 60 days and will be closed in 365 days if no update occurs. Please check if the master branch has fixed it and report again or close the issue.

mzannoni commented 2 years ago

I've just verified that this issue is still present, with the same symptoms, in darktable 3.8.0.

To summarize: the exported JPG appears broken when haze removal is enabled and:

  1. the export is at full resolution (regardless of the "high quality resampling" setting), or
  2. the export is at HD size (1920x1920) with "high quality resampling" enabled, or
  3. "set size by scale" is used with a value of exactly 1.

It seems that it is not broken when haze removal is not used. In 4K (3840x3840) exports, it does not appear broken, independently of the above-mentioned settings.

peter-joo commented 2 years ago

Hi there, this is the first time I've used darktable :) Haze removal is fantastic, thanks for the huge effort @rabauke!

However, I also face exactly the same problem mentioned in this ticket with version 3.8.0: the exported version is totally different from the one in the darkroom tab, if and only if 'haze removal' is active.

After playing around for a while, I can confirm, and want to emphasize, that a simple workaround already pointed out by @mzannoni works!

What you have to do is to set 'by scale' to '0.999999' for 'set size' in the 'export' module. That's it :)

Also, setting '1.0' there does not work. But the setting '0.999999' acts as if it were 1.0, since my exported image's width and height are the same as the input's.

I am wondering whether the haze removal module needs a scale below 1.0? Or maybe the whole problem is inside the export module itself, in how the scale is handled? I don't know, but the code around scaling is likely the culprit, or at least suspicious.

Once again: the problem is NOT that 'the sky in the provided image is completely blown out', and therefore not that 'a completely blown out sky causes problems', as written by @rabauke, at least in my case... (https://github.com/darktable-org/darktable/issues/4092)

The bright parts of an image are crucial for estimating the global hazy background light and the local amount of haze. The sky in the provided image is completely blown out. The haze removal module requires a properly exposed image. The haze removal module is able to deal with clipping highlights in some small portions of the image but a completely blown out sky causes problems.

peter-joo commented 2 years ago

Well, this trick works for some of my images and does not work for others.

I don't know why :(

Still, based on the 0.999999 vs 1.0 difference described above, the darktable developers can start their investigation, I suppose.

Thanks in advance for their work.

Also, maybe @rabauke is right in saying that 'a completely blown out sky causes problems'.

kofa73 commented 2 years ago

Note that with the supplied image, the preview changes significantly if gamut clipping is turned on. Maybe we have out-of-gamut colours that dehaze cannot handle. [screenshots]

Switching to the 'modern chromatic adaptation' (WB = camera reference + enabling color calibration) and moving the dehaze module above color calibration produces the following preview: [screenshot] And the following full-size output (well, it's scaled, but it's scaled in the viewer): [screenshot]

This side-by-side view may show the outcome better: [screenshot]

rabauke commented 2 years ago

@mzannoni: The haze removal module is expected not to work if the sky is blown out. This example, however, has no blown-out sky. A few pixels are overexposed, but I do not expect these to cause severe problems. Thanks for this example, which may help me debug this issue.

peter-joo commented 2 years ago

I hope I helped a bit too :)

Also, one question please: what is your opinion on the 1.0 vs 0.999999 sensitivity I described above? How come, in practice, such a minuscule difference in scale alters the haze removal result that much for a certain image? Do you plan to investigate that further too?

mzannoni commented 2 years ago

@rabauke you're welcome, and thanks for considering this. I know about the applicability limits of haze removal; however, I think the issue here is a bit different, since the output differs so much depending on specific export parameters. In particular, for me, exporting without scaling should yield something extremely similar to the preview in darktable.

kofa73 commented 2 years ago

@mzannoni The preview is scaled (unless you zoom to 100%), and it is scaled by the same algorithms as used when you export with scaling and set high quality resampling = no. So the difference between 1.0 and 0.9999 is that the 2nd setting triggers scaling. Due to a bug, using a scale factor for export always used low-quality resampling (so the same lower-quality, but faster algorithms as used for the non-1:1 preview). This was only discovered and fixed recently, see https://github.com/darktable-org/darktable/pull/11079.
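
For readers following along, the bug described above would boil down to something like this (an illustrative sketch with invented names, not the actual code fixed in that PR):

  #include <stdbool.h>

  // Illustrative sketch of the reported behaviour (invented names, not
  // darktable's real export code): any scaled export took the low-quality
  // path, no matter what the user requested.
  static bool use_high_quality_resampling(double scale, bool hq_requested)
  {
    if(scale != 1.0)
      return false;       // bug: scaled exports silently forced low quality
    return hq_requested;  // only 1:1 exports honoured the user's setting
  }

That is also why a scale of 0.999999 lands in the same low-quality path as the darkroom preview and therefore matches what you see on screen.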

peter-joo commented 2 years ago

@kofa73 "So the difference between 1.0 and 0.9999 is that the 2nd setting triggers scaling."

Still, between scale 1.0 and 0.9999, the haze removal output shows a major difference. Why? Did you check my post above? https://github.com/darktable-org/darktable/issues/10283#issuecomment-1030678344

I suppose the haze removal algorithm is (or should be) robust enough, meaning that it is not sensitive to such a small difference between 1.0 and 0.9999.

@kofa73 pls comment :)

kofa73 commented 2 years ago

When darktable generates the non-zoomed (i.e. downscaled) preview, which you see in the editor, it applies the same algorithmic trade-offs (to speed up editing) as when you do a downscaled export with high quality resampling = no. Therefore, when you export not at 100% scale but at 99.9999%, you also get the same, lower-quality algorithms in your output, and the exported file matches the preview. Due to a bug, scaled exports were done in low-quality mode even if you set high quality resampling = yes.

It's not the scaling factor that counts; it's the application of the low-quality (= faster) algos. At least this is how I understand it.

peter-joo commented 2 years ago

Thank you for the explanation @kofa73, then I'll wait for https://github.com/darktable-org/darktable/pull/11079 to be merged.

kofa73 commented 2 years ago

That won't affect this issue. The only effect will be that you'll have to set low-quality export explicitly to match the preview. Your image stack has some no-longer-recommended modules, like shadows and highlights. rgb curves seems to be OK, but I don't know how it handles luminance levels > 1 (which can occur with the scene-referred workflow). See https://pixls.us/articles/darktable-3-rgb-or-lab-which-modules-help/

kofa73 commented 2 years ago

However, the strange thing is that in your case the 100% zoomed-in preview is fine, but the exported image is not: [screenshot]

Maybe dehaze always uses the fast algorithms for the preview?

Plus, the HQ output (6011x4008 pixels) also looks fine (a bit different from the preview, but that might be due to colour management?): [screenshot]

And finally, the LQ export at 6011x4008 pixels: [screenshot]

peter-joo commented 2 years ago

"the strange thing is that in your case the 100% zoomed-in preview is fine; but the exported image is not:" this is exactly what's happened with me too.

Do you want to see the sample image?

kofa73 commented 2 years ago

@rabauke Heiko, there are some comments in the code:

  // hazeremoval module needs the color and the haziness (which yields
  // distance_max) of the most hazy region of the image.  In pixelpipe
  // FULL we can not reliably get this value as the pixelpipe might
  // only see part of the image (region of interest).  Therefore, we
  // try to get A0 and distance_max from the PREVIEW pixelpipe which
  // luckily stores it for us.

Might that have something to do with the difference between the 100% preview and the 100% export?
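
For anyone unfamiliar with the pattern that comment describes: the module estimates its global parameters on the preview pipe, which always sees the whole (downscaled) image, and hands them over to the other pipes, which may only see a region of interest. A rough sketch of that hand-over, with invented names (this is not darktable's actual pixelpipe API):

  #include <stdbool.h>

  // Stub standing in for the real per-image haze estimation (illustration only).
  static void estimate_global_haze(const float *roi, int npixels,
                                   float A0[3], float *distance_max)
  {
    (void)roi; (void)npixels;
    A0[0] = A0[1] = A0[2] = 1.0f; // placeholder values
    *distance_max = 1.0f;
  }

  typedef struct haze_globals_t
  {
    float A0[3];        // estimated hazy background light, per channel
    float distance_max; // haziness of the most hazy region
    bool valid;         // set once the PREVIEW pipe has computed them
  } haze_globals_t;

  static haze_globals_t shared; // written by PREVIEW, read by FULL/EXPORT

  static void process(bool is_preview_pipe, const float *roi, int npixels)
  {
    if(is_preview_pipe)
    {
      // The preview pipe sees the whole (downscaled) image, so the global
      // estimates are reliable here.
      estimate_global_haze(roi, npixels, shared.A0, &shared.distance_max);
      shared.valid = true;
    }
    else if(!shared.valid)
    {
      // The FULL/EXPORT pipe may only see part of the image; falling back
      // to a local estimate here can be badly wrong.
      estimate_global_haze(roi, npixels, shared.A0, &shared.distance_max);
    }
    // ... dehaze the region using shared.A0 / shared.distance_max ...
  }

If the export pipe runs without fresh preview estimates, or the two pipes disagree because of different scaling, that hand-over could go wrong, which would fit the symptom that the preview looks fine while the 1:1 export does not.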

github-actions[bot] commented 2 years ago

This issue did not get any activity in the past 60 days and will be closed in 365 days if no update occurs. Please check if the master branch has fixed it and report again or close the issue.

mzannoni commented 2 years ago

This is not yet fixed in the latest release. I've just tried with v3.8.1 and it still looks identical (with haze removal, the exported JPG is broken).

github-actions[bot] commented 2 years ago

This issue did not get any activity in the past 60 days and will be closed in 365 days if no update occurs. Please check if the master branch has fixed it and report again or close the issue.

github-actions[bot] commented 1 year ago

This issue was closed because it has been inactive for 300 days since being marked as stale. Please check if the newest release or nightly build has it fixed. Please, create a new issue if the issue is not fixed.