tylerreisinger / prisma

A powerful color representation, manipulation and conversion library that aims to be easy to use.

HDR #17

Open · LoganDark opened this issue 3 years ago

LoganDark commented 3 years ago

Right now, the documentation says that RGB components can only be in the 0..1 range. I'm currently doing "Ray Tracing In One Weekend" and I wanted to do HDR with a proper color library, but it doesn't look like Prisma allows that. :/

tylerreisinger commented 3 years ago

RGB HDR can absolutely be done with the library: use a larger RGB gamut like Rec. 2020, convert to XYZ, then convert back to the target color space and tone map. It is true that prisma considers 0..1 the only proper range, but conversions between color spaces can absolutely produce and consume colors with values outside of 0..1. I'll update the documentation in the coming days to clarify that. Nothing will stop you from storing and doing math with colors outside that range either.
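A rough sketch of that pipeline, written with plain matrices rather than prisma's own types (the constants are the usual rounded D65 Rec.2020-to-XYZ and XYZ-to-linear-sRGB matrices, and the Reinhard curve is just one common tone-mapping choice; none of this is prisma's API):

```rust
// Sketch of the wide-gamut -> XYZ -> display pipeline described above.
// The result is tone-mapped linear sRGB in 0..1; encoding with the sRGB
// transfer function would follow as a separate step.

fn mul3(m: &[[f32; 3]; 3], v: [f32; 3]) -> [f32; 3] {
    [
        m[0][0] * v[0] + m[0][1] * v[1] + m[0][2] * v[2],
        m[1][0] * v[0] + m[1][1] * v[1] + m[1][2] * v[2],
        m[2][0] * v[0] + m[2][1] * v[1] + m[2][2] * v[2],
    ]
}

const REC2020_TO_XYZ: [[f32; 3]; 3] = [
    [0.6370, 0.1446, 0.1689],
    [0.2627, 0.6780, 0.0593],
    [0.0000, 0.0281, 1.0610],
];

const XYZ_TO_LINEAR_SRGB: [[f32; 3]; 3] = [
    [3.2406, -1.5372, -0.4986],
    [-0.9689, 1.8758, 0.0415],
    [0.0557, -0.2040, 1.0570],
];

/// Simple Reinhard tone map: folds [0, inf) into [0, 1).
fn reinhard(c: f32) -> f32 {
    c / (1.0 + c)
}

fn hdr_rec2020_to_display_srgb(rgb: [f32; 3]) -> [f32; 3] {
    let xyz = mul3(&REC2020_TO_XYZ, rgb);
    let linear_srgb = mul3(&XYZ_TO_LINEAR_SRGB, xyz);
    // Tone map each component so the result lands back in 0..1.
    [
        reinhard(linear_srgb[0]),
        reinhard(linear_srgb[1]),
        reinhard(linear_srgb[2]),
    ]
}

fn main() {
    // An "HDR" Rec.2020 color with components well above 1.0.
    let hdr = [4.0_f32, 2.5, 0.8];
    println!("{:?}", hdr_rec2020_to_display_srgb(hdr));
}
```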

LoganDark commented 3 years ago

I honestly have no idea what that means, but I guess that's what I get for messing with graphics :P

If Prisma accepts colors outside of 0..1, that should definitely be documented. It even has types that look like they forbid other values...

tylerreisinger commented 3 years ago

I'll work on updating the documentation. The main design reason for treating only normalized components as valid is so that float => integer conversions are lossless and valid, and so that HS* transformations are well-defined; most other operations, however, work without problem.
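As a minimal illustration of that round trip (a sketch, not prisma's actual conversion code), a normalized component maps onto the full 8-bit range and back without losing any 8-bit value:

```rust
// Illustration only (not prisma's code): with components restricted to 0..1,
// float <-> 8-bit integer conversion is a lossless round trip.
fn f32_to_u8(c: f32) -> u8 {
    // Only meaningful when c is already in 0.0..=1.0.
    (c * 255.0).round() as u8
}

fn u8_to_f32(c: u8) -> f32 {
    c as f32 / 255.0
}

fn main() {
    for c in 0..=255u8 {
        // Every 8-bit value survives the u8 -> f32 -> u8 round trip exactly.
        assert_eq!(f32_to_u8(u8_to_f32(c)), c);
    }
}
```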

Out of curiosity, how are you using out-of-gamut colors (those with components outside of 0..1) for HDR? It isn't something I've considered before.

LoganDark commented 3 years ago

@tylerreisinger Components greater than one for bloom (hopefully), and probably lighting stuff - I'm probably not using the term HDR correctly :/

cessen commented 1 year ago

Drive-by comment here to clarify some use-cases for > 1.0 colors.

When display manufacturers started making HDR displays, "HDR" became synonymous with those displays and their corresponding standards in common nomenclature. But the term HDR significantly predates those displays and standards, and has a broader meaning, particularly in the fields of VFX and 3D rendering.

A couple examples:

Image-based Lighting

The OpenEXR file format is an HDR image format used in the VFX and 3D animation industries that dates back to 1999. And dating back even further, Greg Ward developed the .hdr image format for his Radiance rendering software, which originated in the 1980s. Neither of these formats has any intentional upper bound on the magnitude of its RGB values (they are limited only by their chosen floating-point formats), and both store their RGB values in linear color. The intention of both formats is to allow (practically) unbounded values that correspond linearly to physical light energy.

One of the important use cases for both formats is to store what are commonly called HDRIs: 360° images that (ideally) encompass the entire luminance range of the environment where they were captured. HDRIs are used for image-based lighting, where the image acts as an all-surrounding source of light for the 3D scene, effectively recreating the lighting of the captured environment.

To reproduce an environment's lighting accurately, HDRIs need to capture luminance levels ranging from the darkest shadows all the way up to the brightness of the sun. And the latter in particular far exceeds the brightness bounds of HDR display standards.

HDRIs are typically stored in a linear RGB color space. For example, they might use the Rec.709 or Rec.2020 chromaticity coordinates, but with unbounded linear RGB values.
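As a small illustration of what those unbounded linear values buy you (a generic sketch, not tied to any particular library): because the space is linear, the standard Rec.709 luminance weights apply unchanged whether a pixel is a deep shadow or a direct capture of the sun. The pixel values below are hypothetical.

```rust
/// Relative luminance of a linear Rec.709-primaries pixel, using the standard
/// Rec.709 weights. Works the same whether the components are inside 0..1 or
/// far above it, because the space is linear.
fn rec709_luminance([r, g, b]: [f32; 3]) -> f32 {
    0.2126 * r + 0.7152 * g + 0.0722 * b
}

fn main() {
    let shadow_pixel = [0.01_f32, 0.012, 0.015];
    let sun_pixel = [50_000.0_f32, 48_000.0, 45_000.0]; // hypothetical HDRI values
    println!("shadow: {}", rec709_luminance(shadow_pixel));
    println!("sun:    {}", rec709_luminance(sun_pixel));
}
```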

Renderer-internal RGB Values

Unbounded linear RGB values are often used as the internal representation of color and physical light energy in 3D renderers.

For example, the energy of the light rays being emitted by a lamp may be represented as (large) RGB values. The magnitude of these RGB values might be representable within the bounds of HDR display standards, but there's no guarantee of that. And in any case, they need to be linear for the rendering math to work correctly, so they wouldn't be encoded with a transfer function anyway.

The pixel colors of the rendered image are also often represented by arbitrary-magnitude linear RGB values during rendering, because there may be parts of the scene that exceed any given brightness limit (e.g. the sun being reflected in a chrome car bumper, or even just the sun being directly visible). It's only once the render buffer is ready for display that the pixel colors are processed to fit within a range that can be displayed (a process called tone mapping). These large RGB pixel values are also valuable for post effects, like simulated film/sensor bloom and post-process focal blur and motion blur.
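A minimal sketch of that final display step (a tone map followed by the sRGB transfer function); the exponential tone curve and exposure value here are arbitrary illustrative choices, not something any particular renderer or library prescribes:

```rust
// Minimal sketch of the "only at display time" step: the render buffer holds
// unbounded linear RGB, and tone mapping plus the sRGB transfer function are
// applied only when converting to displayable 8-bit pixels.

fn tonemap(c: f32) -> f32 {
    // Simple exposure-style curve folding [0, inf) into [0, 1).
    1.0 - (-c).exp()
}

fn srgb_encode(c: f32) -> f32 {
    // Standard sRGB transfer function (linear -> encoded).
    if c <= 0.003_130_8 {
        12.92 * c
    } else {
        1.055 * c.powf(1.0 / 2.4) - 0.055
    }
}

fn display_pixel(linear: [f32; 3], exposure: f32) -> [u8; 3] {
    let encode = |c: f32| (srgb_encode(tonemap(c * exposure)) * 255.0).round() as u8;
    [encode(linear[0]), encode(linear[1]), encode(linear[2])]
}

fn main() {
    // A bright reflection of the sun in a chrome bumper, far above 1.0.
    let bumper = [120.0_f32, 110.0, 95.0];
    println!("{:?}", display_pixel(bumper, 0.05));
}
```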

All of these renderer-internal RGB values also need to be in a defined (but unbounded-magnitude) linear color space, just like HDRIs.

So... what to do?

With all that said, none of this means that prisma actually needs to support these use cases. If you want prisma to be a display-oriented color library, there's nothing wrong with that at all! It's still useful.

And, honestly, trying to fully accommodate the VFX and 3D rendering use cases might not be worth the effort. For example, would you really want to go as far as supporting input device transforms? Or tone mapping? Or gamut mapping for out-of-gamut colors? The rabbit hole is deep.

But just supporting unbounded-magnitude linear RGB color spaces with defined chromaticity coordinates would go a long way, and would at least allow e.g. basic processing of things like HDRIs.
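For what it's worth, here is one hypothetical shape such support could take. None of these names or traits exist in prisma; it's just a sketch of "unbounded linear RGB tagged with its primaries," with conversions between spaces going through XYZ.

```rust
// Hypothetical design sketch (not prisma's API): a linear RGB type whose
// chromaticity coordinates are carried by a zero-sized marker type, with no
// bound on component magnitude.
use std::marker::PhantomData;

trait Primaries {
    /// Row-major 3x3 matrix taking this space's linear RGB to XYZ (D65).
    const TO_XYZ: [[f64; 3]; 3];
}

struct Rec709;
impl Primaries for Rec709 {
    const TO_XYZ: [[f64; 3]; 3] = [
        [0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505],
    ];
}

/// Unbounded linear RGB tagged with its primaries.
struct LinearRgb<P: Primaries> {
    r: f64,
    g: f64,
    b: f64,
    _primaries: PhantomData<P>,
}

impl<P: Primaries> LinearRgb<P> {
    fn new(r: f64, g: f64, b: f64) -> Self {
        Self { r, g, b, _primaries: PhantomData }
    }

    /// Convert to XYZ; a conversion to another primaries set would apply the
    /// target space's inverse matrix afterwards.
    fn to_xyz(&self) -> [f64; 3] {
        let m = P::TO_XYZ;
        [
            m[0][0] * self.r + m[0][1] * self.g + m[0][2] * self.b,
            m[1][0] * self.r + m[1][1] * self.g + m[1][2] * self.b,
            m[2][0] * self.r + m[2][1] * self.g + m[2][2] * self.b,
        ]
    }
}

fn main() {
    // Components above 1.0 are perfectly legal in this representation.
    let c = LinearRgb::<Rec709>::new(10.0, 4.0, 0.5);
    println!("XYZ = {:?}", c.to_xyz());
}
```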

tylerreisinger commented 2 months ago

I've been having health problems and haven't been watching or updating this project during that time, so it's been a while. I definitely don't want to add things like tone mapping to prisma; that's something for another library to provide. But relaxing the requirement that components be in-gamut might be a good idea. I know HDR rendering often uses float components outside of the displayable gamut, but supporting them makes some color space conversions incorrect, which makes me hesitant. Technically, you can already use floats outside of 0..1, but going out of gamut effectively leaves device-dependent color space transformations without a well-defined meaning.
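A toy illustration of that concern (not prisma code): an out-of-range component passes through linear math just fine, but the moment it has to become an 8-bit device value there is nothing for it to map to, so the conversion has to clamp and information is lost.

```rust
// Toy illustration (not prisma code): converting an out-of-range component to
// an 8-bit device value forces clamping, so the conversion is no longer
// faithful the way it is for in-range values.
fn to_u8_clamped(c: f32) -> u8 {
    (c.clamp(0.0, 1.0) * 255.0).round() as u8
}

fn main() {
    // 1.7 and 1.0 both collapse to 255; the original value can't be recovered.
    assert_eq!(to_u8_clamped(1.7), 255);
    assert_eq!(to_u8_clamped(1.0), 255);
    // Values inside 0..1 keep distinct 8-bit codes.
    assert_ne!(to_u8_clamped(0.5), to_u8_clamped(0.6));
}
```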