sobotka / AgX-Resolve

AgX Picture Formation for DaVinci Resolve

Blender Export #1

Closed DjangoMango666 closed 11 months ago

DjangoMango666 commented 1 year ago

Hi Troy! If I export with standard settings from your AgX (for Blender) to DaVinci, following your instructions, it's not working at all. I looked for the ARRI Wide Gamut profile in Blender (for export), but there isn't one. What would you recommend if I wanted to export an animation from Blender to DaVinci Resolve?

Thanks and greetings

sobotka commented 1 year ago

Yikes!

There’s a good reason I haven’t really been pushing for folks to test things without understanding some of the background context. Sorry for the confusion.

The basic AgX that you were testing is quite different to this particular demonstration. This specific demonstration is for creating the picture from tristimulus that is encoded as ARRI Wide Gamut version 3.0, and ARRI LogC version 3.0. The vanilla AgX demonstration over at the other repository is more rendering-centric, and uses a different path.

EXRs and Resolve have many peculiar things (resampling spatial dimensions, grading controls, etc.) that need to be accounted for, but this won’t deal with those issues. I’ll just assume you know about the details, and outline how to see something reasonable within Resolve.

If you are wanting to test this out with renders from Blender, authored under the other AgX demonstration, you would need to broadly use the following chain:

  1. Render using the BT.709 chain you’ve been using.
  2. Set up a node chain in Resolve, with a Colourspace Transform as the first node. If you ingested a uniform tristimulus EXR as your media, the node will need “Input Color Space” set to “Rec.709”, “Input Gamma” to “Linear”, “Output Color Space” to “ARRI Wide Gamut 3”, and “Output Gamma” to “ARRI LogC3”. This will take the uniform tristimulus BT.709 render and convert it to ARRI’s LogC3 / AWG in preparation for the DCTL node default settings (a numeric sketch of this conversion follows the list).
  3. Within the Colourspace Transform node, set “Tone Mapping” to “None”. This is absolutely critical.
  4. Apply the repository DCTL via a Resolve DCTL node.
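
For anyone curious what step 2 does numerically, here is a rough Python sketch of the same conversion: linear BT.709 to ARRI Wide Gamut 3 via CIE XYZ, followed by the ARRI LogC3 (EI 800) encoding. The primaries and LogC3 constants are the commonly published values; Resolve's own Colour Space Transform may differ in small details, so treat this as an illustration rather than a reference implementation.

```python
# Rough sketch of what the Colour Space Transform in step 2 does numerically:
# linear BT.709 -> ARRI Wide Gamut 3 (via CIE XYZ), then ARRI LogC3 (EI 800).
# Primaries and LogC3 constants are the commonly published values; Resolve's
# internal implementation may differ slightly.
import numpy as np

def rgb_to_xyz_matrix(rx, ry, gx, gy, bx, by, wx, wy):
    """Build an RGB -> CIE XYZ matrix from chromaticity coordinates."""
    def xyz(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    primaries = np.column_stack([xyz(rx, ry), xyz(gx, gy), xyz(bx, by)])
    scale = np.linalg.solve(primaries, xyz(wx, wy))  # scale primaries to hit the white point
    return primaries * scale

# Both spaces use a D65 white point.
M_709  = rgb_to_xyz_matrix(0.640, 0.330, 0.300, 0.600, 0.150, 0.060, 0.3127, 0.3290)
M_AWG3 = rgb_to_xyz_matrix(0.6840, 0.3130, 0.2210, 0.8480, 0.0861, -0.1020, 0.3127, 0.3290)
BT709_TO_AWG3 = np.linalg.inv(M_AWG3) @ M_709

def logc3_encode(x):
    """ARRI LogC3 (EI 800) encoding of linear values."""
    cut, a, b, c, d, e, f = 0.010591, 5.555556, 0.052272, 0.247190, 0.385537, 5.367655, 0.092809
    x = np.asarray(x, dtype=np.float64)
    return np.where(x > cut, c * np.log10(a * x + b) + d, e * x + f)

def linear_bt709_to_logc3_awg3(rgb):
    """Linear BT.709 triplet -> LogC3-encoded ARRI Wide Gamut 3 triplet."""
    return logc3_encode(BT709_TO_AWG3 @ np.asarray(rgb, dtype=np.float64))

print(linear_bt709_to_logc3_awg3([0.18, 0.18, 0.18]))  # middle grey lands near 0.391
```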

Resolve is best suited to a log-like encoding for media. As such, going from a log output to Resolve might be more smooth for folks. Depends!

I did not really intend for the cross pollination between Blender folks and DCTL Resolve users just yet, but here we are. Hope this helps.

BoredSiel commented 1 year ago

Hello Troy.

I'm another Blender AgX user trying to bring renders across to Resolve, and this post was critical for me. I followed your instructions on AgX-Resolve and, yes, it didn't work properly, but thanks to this person, who fortunately had the same issue as me, I was able to correct it, and now it's working properly. I followed your new instructions using the Colourspace Transform node with the DCTL afterwards, and it seems to be working fine. I believe your work has been gaining more visibility among Blender users; there are quite a few YouTube videos referring to your experimental AgX for Blender. I'm saying this from Brazil.

sobotka commented 1 year ago

The name “AgX” was me slapping a label on a general mechanic and set of principles. The good news is that many folks are out there experimenting now.

The bad news is that there’s no “single” AgX, nor do I believe there should be; each specific picture making context will require unique picture formation chains. The AgX mechanic can help here.

The current “standard” I have authored for BT.709 / sRGB AgX, and the one I would point to, is via the OpenColorIO generator here. A default generated configuration is available here.

The Resolve tool uses a different set of working space values, and as such, will not “match” as it wasn’t designed to. If someone needs to match, I would suggest using Fusion’s OpenColorIO nodes with whatever OpenColorIO version of AgX they used to form the picture in Blender.

Glad folks are exploring the mechanics of forming pictures from tristimulus render data more. If anything I’ve contributed helps someone author their pictures with greater confidence and control to help the output, all the better.

BoredSiel commented 1 year ago

I've been using it and learning from these Blender videos, but can you point me towards the basic background of this? I want to understand what's going on in detail.

BoredSiel commented 1 year ago

What I did was use this version of AgX in Blender:

https://drive.google.com/file/d/1lubkN9Lwx_gyEOCGAFhyqnTusyyzvtiU/view.

The display device is on AgX Display, the view transform on sRGB, and the look on AgX Punchy. I've exported using OpenEXR and I'm trying to bring the image sequence into Resolve.

I used an OCIOColorSpace node and pointed it at the .ocio config of the AgX version I used, but what is the source space? sRGB? I know the output should be AgX Base, and that after the image is formed I should add a Color Space Transform node with the Rec.709 and Linear inputs, converting to ARRI LogC3/AWG and using a DCTL node with the camera AgX from the AgX-Resolve repository. I wonder if I'm doing everything correctly.

sobotka commented 1 year ago

> What I did was use this version of AgX in Blender:

I can’t recall if Genco used the chain in the SB2383 approach, but it doesn’t matter entirely. To achieve a match, you’ll have to follow the same chain used in Blender, which means paying attention to the encoded signal.

Specifically, you haven’t stated how you are ingesting the encoded signal in Resolve. Are you encoding to an EXR from Blender? A PNG? A TIFF? This is critically important as it helps us to identify how you are encoding the values within the package.

EDIT: I see you used an EXR. This means that if your display is set to sRGB, the encoding inside the EXR will be BT.709 based uniform tristimulus data.

> I used an OCIOColorSpace node and pointed it at the .ocio config of the AgX version I used, but what is the source space? sRGB? I know the output should be AgX Base, and that after the image is formed I should add a Color Space Transform node with the Rec.709 and Linear inputs.

The assumptions in OpenColorIO are that the data is encoded as uniform tristimulus. In the AgX case, it must be BT.709, which can be controlled when saving from Blender. This is commonly the case, as the saving will follow the “Scene” setting; however, this is not always so. For example, if you have set your display to be Display P3, the tristimulus values may end up encoded in the file as Display P3.

The Resolve AgX will never match perfectly, as the working space is tuned to ARRI Wide Gamut v3 using the ARRI LogCv3 as the log encoding. It might work out “good enough” for you, or it might not, context depending.

You should be able, however, to get a 1:1 match, assuming Fusion uses OpenColorIO v2 in Resolve. I do not know if this is so.
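
For anyone wanting to check that 1:1 path outside of Resolve, the OpenColorIO Python bindings can apply the configuration directly; a Fusion OCIOColorSpace node pointed at the same config should produce the same values. This is a minimal sketch, assuming OCIO v2 bindings are installed; the config filename and the colourspace names ("Linear BT.709", "AgX Base sRGB") are placeholders, so use whatever your particular config actually lists.

```python
# Minimal sketch: apply an AgX OpenColorIO config to a linear BT.709 triplet,
# the same transform a Fusion OCIOColorSpace node would perform.
# The filename and colourspace names are placeholders; check your config.
import PyOpenColorIO as OCIO

config = OCIO.Config.CreateFromFile("config.ocio")
processor = config.getProcessor("Linear BT.709",   # open domain tristimulus, as stored in the EXR
                                 "AgX Base sRGB")  # formed picture, ready for an sRGB display
cpu = processor.getDefaultCPUProcessor()

pixel = [0.18, 0.18, 0.18]          # middle grey in the render
print(cpu.applyRGB(pixel))          # what the formed picture should contain at that pixel
```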

I really should release a Resolve AgX that specifically matches the BT.709 version, but I have yet to do so.

> I've been using it and learning from these Blender videos, but can you point me towards the basic background of this?

While the mechanics are not complicated, there are several steps to understand. I am not sure of a reasonable place to locate how to understand picture forming in more detail, but I have been pondering writing a document for a while. I am happy to answer questions via messaging or email. This is generally the best approach as everyone needs to start from a place of their own understanding.

BoredSiel commented 1 year ago

I'm new to picture forming / color management, but I've been trying to keep up with my research. I've seen a Blackmagic forum post stating that it does indeed use OpenColorIO v2, which means that, apparently, the 1:1 match is possible.

I've exported using EXR and "Follow Scene", so I understand that it was encoded using BT.709. So, just to recap: when I'm in the Fusion tab and add an OCIOColorSpace node, I need to set BT.709 as the source and AgX Base as the output. Then, once the image is formed, I need to go to the Color page and add AgX-Resolve, using the Colorspace transform and DCTL to match, at least closely enough, the AgX I've been experimenting with.

How exactly would I know whether it's matching or not?

Thank you sincerely for writing such detailed and well-oriented responses. This is helping me a lot, especially because I'm learning from the source of what I'm experimenting with.

sobotka commented 1 year ago

> I've exported using EXR and "Follow Scene", so I understand that it was encoded using BT.709.

It should be noted that this will literally follow your scene settings, so it assumes you have set an sRGB display, and then the transforms will be encoded as BT.709 open domain tristimulus in the EXR.
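
One quick way to confirm an EXR really holds open domain data rather than a formed picture is to look at the pixel statistics; scene referred renders routinely exceed 1.0, while a display encoded picture will not. A small sketch using the classic OpenEXR Python bindings, with a placeholder filename:

```python
# Quick check that an EXR holds open domain (scene linear) data: scene referred
# renders routinely exceed 1.0, a display encoded picture will not.
import OpenEXR
import Imath
import numpy as np

exr = OpenEXR.InputFile("render_0001.exr")   # placeholder filename
window = exr.header()["dataWindow"]
width = window.max.x - window.min.x + 1
height = window.max.y - window.min.y + 1

float_type = Imath.PixelType(Imath.PixelType.FLOAT)
for name in ("R", "G", "B"):
    channel = np.frombuffer(exr.channel(name, float_type), dtype=np.float32)
    channel = channel.reshape(height, width)
    print(name, "min:", channel.min(), "max:", channel.max())
```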

> I need to set BT.709 as the source and AgX Base as the output.

I do not believe so, as you are loading an EXR. This means that the encoding is uniform tristimulus, aka “Linear BT.709”. The picture isn’t yet formed, and as such, the encoding is not yet AgX Base.

> I need to go to the Color page and add AgX-Resolve, using the Colorspace transform and DCTL to match, at least closely enough, the AgX I've been experimenting with.

No. This will form a picture in a different manner to the OpenColorIO configuration. The picture should be formed in Fusion, then output to Resolve’s handling.

I cannot recall whether Resolve encodes to the timeline colourspace assumptions when passing to ResFusion, and then expects it to be encoded as such when passed back out from ResFusion. You will need to sort this out.

At one point ResFusion was handed an encoded version of the EXRs, encoded to some timeline setting. I don’t know if this still holds true, so test!

> How exactly would I know whether it's matching or not?

You can difference two formed images to get a sense of how different they are, using the Blender compositor for example. There are a number of ways that might give you a reasonable analysis.
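
If differencing outside of the Blender compositor is more convenient, a few lines of Python will do the same job on two exported pictures of matching resolution; imageio here is just one convenient reader, not the only option, and the filenames are placeholders.

```python
# Difference two formed pictures (e.g. a PNG saved out of Blender and one rendered
# out of Resolve) to see how closely the chains agree. Assumes matching resolution
# and channel count; filenames are placeholders.
import imageio.v3 as iio
import numpy as np

a = iio.imread("blender_agx.png").astype(np.float64)
b = iio.imread("resolve_agx.png").astype(np.float64)

diff = np.abs(a - b)
print("max difference (code values):", diff.max())
print("mean difference (code values):", diff.mean())
# A maximum of a couple of code values is effectively a match; large, structured
# differences mean the two chains are forming different pictures.
```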

dubfl0w commented 11 months ago

Hello, I'm trying to follow the steps you outlined above, but I can't match the look in DaVinci. I made screenshots to show my settings in DaVinci. On the right side of the clip viewer you see a PNG file with the right look; on the left side is the EXR file. The EXR (16-bit) has no LUT applied! Only the color nodes on the Color page of DaVinci are applied, but it does not look like it does in Blender. Does anyone have any ideas? Thanks.

[Attached screenshots: three from Resolve, one from Blender]

OK, I reduced the exposure in Blender by -2. I realized this is something I have to redo in DaVinci, since it is not baked into the EXR file. I am new to Blender and just realized I can convert my rendered EXR files with the video editor in Blender :)

sobotka commented 11 months ago

I would not use the VSE for anything. It’s a bit of a broken mess.

It should also not be expected that the AgX Resolve will match the AgX in my repository, nor the SB2383 variation etc., and doubly not match the thing that ended up in Blender proper; they are all different to this version.

That said, if authored using the outline, the result should be decent quality, albeit quite “different” in some capacities.

If one wanted to match 1:1, all of these would need to be identical values:

  1. The “curve” setting.
  2. The primaries tuning on inset and rotation.
  3. The log encoding used.
  4. The working colourspace.

Achievable, with a little bit of elbow grease and learning. The parameters can be copied into a duplicated DCTL file.
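
As a concrete illustration of point 3, the two chains do not even place middle grey at the same code value. A rough sketch, assuming the LogC3 EI 800 constants and an AgX-style log2 range of roughly -10 to +6.5 stops around middle grey (check the actual range of the variant you are using):

```python
# Illustration of why the log encoding matters: middle grey lands at very
# different code values under ARRI LogC3 versus a generic AgX-style log2 encoding.
# The -10 / +6.5 stop range is an assumption about the AgX variant in use.
import math

def logc3_encode(x, cut=0.010591, a=5.555556, b=0.052272,
                 c=0.247190, d=0.385537, e=5.367655, f=0.092809):
    """ARRI LogC3 (EI 800) encoding."""
    return c * math.log10(a * x + b) + d if x > cut else e * x + f

def agx_log2_encode(x, lower_stops=-10.0, upper_stops=6.5, mid_grey=0.18):
    """Normalised log2 encoding around middle grey (assumed AgX-style range)."""
    stops = math.log2(max(x, 1e-10) / mid_grey)
    return (stops - lower_stops) / (upper_stops - lower_stops)

print("middle grey, LogC3:   ", round(logc3_encode(0.18), 3))     # ~0.391
print("middle grey, AgX log2:", round(agx_log2_encode(0.18), 3))  # ~0.606
```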

dubfl0w commented 11 months ago

Hey, OK, thank you. I will give it a try.