RenderKit / ospray

An Open, Scalable, Portable, Ray Tracing Based Rendering Engine for High-Fidelity Visualization
http://ospray.org
Apache License 2.0

Path tracer takes ambient light into account for background pixels #404

Closed paulmelis closed 4 years ago

paulmelis commented 4 years ago

Scene with pure green background color. Two spheres, one with obj material and all defaults, one principled pure-specular and metallic, no diffuse. One ambient light, pure red color, intensity 1.

#include <cerrno>
#include <cstdio>
#include <vector>
#include <alloca.h>

#include <ospray/ospray_cpp.h>

using namespace ospcommon::math;
using namespace ospray::cpp;

const int W = 1024;
const int H = 1024;

void writePPM(const char *fileName, const vec2i &size, const uint32_t *pixel)
{
  FILE *file = fopen(fileName, "wb");
  if (file == nullptr) {
    fprintf(stderr, "fopen('%s', 'wb') failed: %d\n", fileName, errno);
    return;
  }
  fprintf(file, "P6\n%i %i\n255\n", size.x, size.y);
  unsigned char *out = (unsigned char *)alloca(3 * size.x);
  for (int y = 0; y < size.y; y++) {
    const unsigned char *in =
        (const unsigned char *)&pixel[(size.y - 1 - y) * size.x];
    for (int x = 0; x < size.x; x++) {
      out[3 * x + 0] = in[4 * x + 0];
      out[3 * x + 1] = in[4 * x + 1];
      out[3 * x + 2] = in[4 * x + 2];
    }
    fwrite(out, sizeof(char), 3 * size.x, file);
  }
  fprintf(file, "\n");
  fclose(file);
}

int main(int argc, const char *argv[])
{
    ospInit(&argc, argv);

    Light light1 = Light("ambient");
    light1.setParam("color", vec3f(1.0, 0.0, 0.0));
    light1.setParam("intensity", 1.0f);
    light1.commit();

    Renderer renderer("pathtracer");
    renderer.setParam("backgroundColor", vec3f(0.0f, 1.0f, 0.0f));
    renderer.commit();

    // Diffuse sphere
    vec3f pos(0, 0, 0);

    Geometry sphere("sphere");
    sphere.setParam("sphere.position", Data(pos));
    sphere.setParam("radius", 1.0f);
    sphere.commit();

    Material mat("pathtracer", "obj");
    //mat.setParam("kd", vec3f(0, 0, 1));
    mat.commit();

    GeometricModel gmodel(sphere);
    gmodel.setParam("material", mat);
    gmodel.commit();

    // Shiny sphere
    vec3f pos2(2, 0, 0);

    Geometry sphere2("sphere");
    sphere2.setParam("sphere.position", Data(pos2));
    sphere2.setParam("radius", 1.0f);
    sphere2.commit();

    Material mat2("pathtracer", "principled");
    mat2.setParam("diffuse", 0.0f);
    mat2.setParam("specular", 1.0f);
    mat2.setParam("metallic", 1.0f);
    mat2.setParam("roughness", 0.0f);
    mat2.commit();

    GeometricModel gmodel2(sphere2);
    gmodel2.setParam("material", mat2);
    gmodel2.commit();

    // Scene    

    std::vector<GeometricModel> gmodels;
    gmodels.push_back(gmodel);
    gmodels.push_back(gmodel2);

    Group group;
    group.setParam("geometry", Data(gmodels));
    group.commit();

    Instance instance(group);
    instance.commit();

    World world;
    world.setParam("light", Data(light1));
    world.setParam("instance", Data(instance));
    world.commit();    

    Camera camera("perspective");
    camera.setParam("aspect", 1.0f*W/H);
    camera.setParam("position", vec3f(1, 2, 2));
    camera.setParam("direction", vec3f(0, -1, -1));
    camera.setParam("up", vec3f(0, 1, 0));
    camera.commit();

    FrameBuffer fb(vec2i(W,H), OSP_FB_SRGBA, OSP_FB_COLOR);
    fb.clear();

    fb.renderFrame(renderer, camera, world).wait();

    uint32_t *pixels = (uint32_t *)fb.map(OSP_FB_COLOR);
    writePPM("ambientlight.ppm", vec2i(W,H), pixels);

    fb.unmap(pixels);    
}

The resulting image shows yellow background pixels, which appears to be the green background color mixed with the red ambient light.

[screenshot: ambientlight]

When kd of the obj material on the first sphere is set to pure blue, both spheres are rendered black:

[screenshot: ambientlight2]

Here, the shiny sphere should at least pass on the ambient light (and/or background color), as the default principled baseColor is white.

Tradias commented 4 years ago

The alpha channel in the yellow area is transparent. Taking that into account produces the desired image for me.
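For reference, "taking the alpha channel into account" amounts to compositing the framebuffer's RGBA output over an opaque background. A minimal self-contained sketch (the helper name is mine, not part of OSPRay, and it assumes non-premultiplied alpha):

```cpp
#include <cstdint>

// Composite a non-premultiplied RGBA8 pixel over an opaque RGB background:
// out = src * a + bg * (1 - a), per channel, with rounding.
inline void compositeOver(const uint8_t src[4], const uint8_t bg[3], uint8_t out[3])
{
  const unsigned a = src[3];
  for (int c = 0; c < 3; c++)
    out[c] = (uint8_t)((src[c] * a + bg[c] * (255u - a) + 127u) / 255u);
}
```

With this, fully transparent "yellow" background pixels resolve to whatever backdrop color the application composites over.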

jeffamstutz commented 4 years ago

This is a bug in geometries that use Embree's user-defined geometry (UDG) functionality: we have the geometry indexing wrong there. The issue was introduced when we started taking advantage of Embree allowing geometries to be added to more than one scene, which broke our indexing for UDGs. Looking into a solution...

jeffamstutz commented 4 years ago

This was missed because our geometric model material tests are done with native Embree geometry types, where geometry indexing is done on Embree's side alone.

jeffamstutz commented 4 years ago

It looks like we will need to update Embree for a proper solution, so a workaround can be implemented in the meantime for v2.0.1.

paulmelis commented 4 years ago

The alpha channel in the yellow area is transparent. Taking that into account produces the desired image for me.

Sorry, I don't understand what you mean. Even if I set the background color's alpha to 1, the image still shows (fully opaque) yellow pixels. Plus it would still be the wrong color: the ambient light color should not be visible in the background.

johguenther commented 4 years ago

Actually the mixing of colors in the background is intended behavior: by default all lights are directly visible (to the camera), and the ambient light is at infinity and covers the whole field of view. In v2.0 the background / backplate no longer hides/blocks lights at infinity (maybe this was a bit hidden in the CHANGELOG). If the old behavior is preferred you can make the ambient light invisible (setting its visible parameter to false).
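For anyone landing here, restoring the old behavior is a one-liner on the light. A sketch against the v2.0 C++ wrapper used in the report above:

```cpp
ospray::cpp::Light light1("ambient");
light1.setParam("color", ospcommon::math::vec3f(1.0f, 0.0f, 0.0f));
light1.setParam("intensity", 1.0f);
light1.setParam("visible", false); // hide the light from camera rays
light1.commit();
```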

paulmelis commented 4 years ago

Actually the mixing of colors in the background is intended behavior: by default all lights are directly visible (to the camera), and the ambient light is at infinity and covers the whole field of view. In v2.0 the background / backplate no longer hides/blocks lights at infinity (maybe this was a bit hidden in the CHANGELOG). If the old behavior is preferred you can make the ambient light invisible (setting its visible parameter to false).

Ah, good to know! Reading the docs, it seems the visible flag only controls direct visibility (camera rays)? So hiding the ambient color from being seen via a mirror-like surface is currently not possible? Just to be clear, as I'm comparing with the ray visibility controls that Blender offers (camera, diffuse, glossy, transmission, volume scatter).

johguenther commented 4 years ago

Correct. There are already some feature requests to extend visibility as you describe (e.g. split with respect to camera rays, reflections, transmission, lights/shadows; see #220) – how desperately do you need that (the implementation can quickly become a mess...)?

jeffamstutz commented 4 years ago

FYI: the material problem was addressed in 00ed075, which will be in v2.0.1.

paulmelis commented 4 years ago

Correct. There are already some feature requests to extend visibility as you describe (e.g. split with respect to camera rays, reflections, transmission, lights/shadows; see #220) – how desperately do you need that (the implementation can quickly become a mess...)?

Personally, it would be nice to have a bit more control, but there are other improvements that I would consider more important. These kinds of features lean towards more advanced shading and lighting controls (i.e. more production rendering than sciviz), and for that, having more flexible shading options (like a shader network) would be of more value I think.

carsonbrownlee commented 4 years ago

having more flexible shading options (like a shader network) would be of more value I think.

We did have some preliminary work on OSL integration a while back, but there's no ETA on when we might have that in OSPRay. Would OSL meet your requirements? It's good to have feedback on feature requests.

paulmelis commented 4 years ago

We did have some preliminary work on OSL integration a while back, but there's no ETA on when we might have that in OSPRay. Would OSL meet your requirements? It's good to have feedback on feature requests.

For me, having to use OSL would be a bit too low-level. Working with the current materials as building blocks for shading is quite convenient, but being able to combine them in interesting ways would be a nice extension in flexibility. To compare with Blender again: for example, it allows mixing two shaders in a chosen amount (e.g. 30% metal, 70% glass), or using a color ramp node to generate a color from a colorscale based on an input float value. The latter could come from, say, the amount of AO coverage, or the inner product of the shading normal with the view direction (to get Fresnel-like shading that highlights edges). These kinds of options allow a lot of creative flexibility while limiting the expertise needed from the user (e.g. no OSL coding). But this is my personal preference from working with Blender a lot.
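As a concrete illustration of the color-ramp idea (plain C++, not OSPRay API): map a scalar in [0, 1] through a piecewise-linear list of color stops, which is essentially what a transfer-function lookup does.

```cpp
#include <algorithm>
#include <array>
#include <vector>

using Color = std::array<float, 3>;

// Piecewise-linear color ramp over equally spaced stops on [0, 1].
// Requires at least two stops. (C++17 for std::clamp.)
Color colorRamp(const std::vector<Color> &stops, float t)
{
  t = std::clamp(t, 0.0f, 1.0f);
  const float pos = t * (stops.size() - 1);
  const size_t i = std::min((size_t)pos, stops.size() - 2);
  const float f = pos - i; // fractional position between stop i and i+1
  Color out;
  for (int c = 0; c < 3; c++)
    out[c] = stops[i][c] * (1.0f - f) + stops[i + 1][c] * f;
  return out;
}
```

Feeding this with, say, an AO value or a Fresnel-like term per shading point gives the kind of control described above.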

If the material system gets overhauled at some point, then being able to update a material without having to re-assign it to GeometricModels would be awesome. Although it's currently easy to update a material parameter, changing the type (say, obj to metal) is a pain, as you need to keep track of all the scene objects using that particular material in order to update them. If a Material becomes a container for some shader definition, then being able to update the definition just like a parameter would be very nice.

jeffamstutz commented 4 years ago

To compare with Blender again, for example, it allows mixing two shaders in a chosen amount (e.g. 30% metal, 70% glass)

We do have a yet-to-be-officially-documented mix material that does this: https://github.com/ospray/ospray/blob/master/ospray/render/pathtracer/materials/Mix.cpp. Maybe this would help?

In general we need to add all of our materials to the public documentation (with screenshots!).

or to use a color ramp node to generate a color from a colorscale based on an input float value.

We could easily use transfer functions as something to modulate materials and/or values from a texture. The API design surrounding it, however, would need some thought.

If the material system gets overhauled at some point then being able to update a material without having to re-assign them to GeometricModels would be awesome.

We have already implemented the notion of using a material index into a master material list that lives on the renderer. You can assign material indices on the GeometricModel that index the material data parameter on the renderer, and update materials to your heart's content in between frames. While we need more examples of this, it sounds like this is exactly what you are looking for.
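A sketch of that pattern, assuming the renderer-side array parameter is named material and the model accepts a uint32 index (as described above), reusing the objects from the reproducer:

```cpp
// Master material list lives on the renderer.
std::vector<ospray::cpp::Material> materials = {mat, mat2};
renderer.setParam("material", ospray::cpp::Data(materials));
renderer.commit();

// Models reference materials by index instead of by handle.
gmodel.setParam("material", (uint32_t)0);
gmodel.commit();
gmodel2.setParam("material", (uint32_t)1);
gmodel2.commit();

// Later: swap a material (even its type) by replacing the renderer's list;
// the models and world are untouched.
materials[0] = ospray::cpp::Material("pathtracer", "metal");
materials[0].commit();
renderer.setParam("material", ospray::cpp::Data(materials));
renderer.commit();
```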

paulmelis commented 4 years ago

We do have a yet-to-be-officially-documented mix material that does this: https://github.com/ospray/ospray/blob/master/ospray/render/pathtracer/materials/Mix.cpp. Maybe this would help?

Hah! And there are also hidden Plastic and Velvet materials in that directory, it seems :) That's indeed very useful to know.

We could easily use transfer functions as something to modulate materials and/or values from a texture. The API design surrounding it, however, would need some thought.

Yes, that sounds like a good reuse of the transfer function.

We have already implemented the notion of using a material index into a master material list that lives on the renderer. You can assign material indices on the GeometricModel that index the material data parameter on the renderer, and update materials to your heart's content in between frames. While we need more examples of this, it sounds like this is exactly what you are looking for.

Hmm, that sounds fine as well. So instead of storing materials per model, you'd store them all on the renderer and index them from the models. Updating a material, even changing its type, would mean updating/replacing the material at a certain index on the renderer. With lots of materials that could mean reassigning a large list on the renderer to change a single material (though that use case might not be common anyway). Again, good to know!

update materials to your heart's content in between frames

That actually opens up interesting rendering options, by changing materials over the set of samples that get rendered. Not even Blender can do that :)

jeffamstutz commented 4 years ago

With lots of materials that could mean reassigning a large list on the renderer to change a single material (though that use case might not be common anyway).

Remember that if you use shared data arrays, creating (and assigning) an OSPData is mostly just a pointer and a size, and is therefore cheap. Re-creating shared data arrays, especially for objects that have no acceleration structure to build, should be viewed as very cheap.

The original motivation for the feature is that the world then contains no objects holding renderer-dependent material handles. If each renderer has its own "master view" of all the candidate materials, one can trivially switch renderers without incurring any cost to modify objects in the scene.
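For illustration, a sketch of the shared-array variant using the C API helpers from ospray_util.h (assuming a filled std::vector<OSPMaterial> and an OSPRenderer handle): the data object just wraps the application's memory, so re-creating it after editing the vector is essentially free.

```cpp
// Wrap the application-owned material list without copying.
OSPData matList =
    ospNewSharedData1D(materials.data(), OSP_MATERIAL, materials.size());
ospCommit(matList);

// Hand the master list to the renderer; models index into it.
ospSetObject(renderer, "material", matList);
ospCommit(renderer);
```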

jeffamstutz commented 4 years ago

Even if you use copied data arrays, you'd need a LOT of materials for an array re-creation to be "expensive": material parameter updates/commits don't require the material list to be re-created, so that only happens when a material's type is changed.

paulmelis commented 4 years ago

Even if you use copied data arrays, you'd need a LOT of materials for an array re-creation to be "expensive": material parameter updates/commits don't require the material list to be re-created, so that only happens when a material's type is changed.

Okay, good to know. Although passing lists of objects through the API is pretty convenient and quick, it's a bit opaque how much processing OSPRay actually does on those items after a commit() (the exception being the Embree acceleration structures that get built).

paulmelis commented 4 years ago

I guess we can close this.