OpenTechEngine / Discussions

The issues of this project are abused as a discussion forum for Open Tech

Lighting and physics #20

ghost opened this issue 8 years ago

ghost commented 8 years ago

I'm watching this lecture from John Carmack. It's quite old, but I was wondering what your take is on it.

https://www.youtube.com/watch?v=MG4QuTe8aUw

I'm currently experimenting with blackbodies etc. in Blender with Cycles, using scientific data to simulate lighting and materials based on wavelengths. The results, when done correctly, are quite impressive when using energy-based models.

The advantages I can think of are that it would create realistic rendering and cut down on texture use, since wavelength/blackbody values can be hardcoded and mixed with stencil/alpha textures (possibly procedurally generated). This helps add randomness to the textures and to how they are shaded according to the light values from the blackbodies.

Here is an example of blackbody lighting; it also affects the translucency of the candle wax.

http://wiki.blender.org/uploads/thumb/5/5a/CyclesRelease269ColorTempCandles.png/800px-CyclesRelease269ColorTempCandles.png
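For reference, a minimal sketch of how a blackbody colour temperature could be turned into an RGB light colour on the CPU side, using a common curve-fit approximation; the function names are illustrative and the coefficients are approximate fit values, not anything taken from OTE:

```cpp
#include <algorithm>
#include <cmath>

// Approximate conversion of a colour temperature in Kelvin (~1000K-40000K)
// to a normalised RGB triple, based on a widely used curve fit.
// Good enough for tinting lights; not an exact Planckian locus evaluation.
struct RGB { float r, g, b; };

static float clamp01(float v) { return std::min(1.0f, std::max(0.0f, v)); }

RGB BlackbodyToRGB(float kelvin) {
    const float t = kelvin / 100.0f;
    float r, g, b;

    // Red channel
    if (t <= 66.0f) {
        r = 255.0f;
    } else {
        r = 329.698727446f * std::pow(t - 60.0f, -0.1332047592f);
    }

    // Green channel
    if (t <= 66.0f) {
        g = 99.4708025861f * std::log(t) - 161.1195681661f;
    } else {
        g = 288.1221695283f * std::pow(t - 60.0f, -0.0755148492f);
    }

    // Blue channel
    if (t >= 66.0f) {
        b = 255.0f;
    } else if (t <= 19.0f) {
        b = 0.0f;
    } else {
        b = 138.5177312231f * std::log(t - 10.0f) - 305.0447927307f;
    }

    return { clamp01(r / 255.0f), clamp01(g / 255.0f), clamp01(b / 255.0f) };
}

// Example: a candle flame at ~1850K comes out strongly orange,
// while daylight at ~6500K comes out close to white.
```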

BielBdeLuna commented 8 years ago

I think the closest approximation we could get to this is using a real-time reflections method, alongside a way to affect how Doom 3 calculates the size of the specular highlight, which is already in the shader but lacks the code to control it.
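To make concrete what controlling the highlight size would mean, here is a minimal Blinn-Phong style specular term with the exponent exposed as a parameter; this is a generic illustration with made-up names, not Doom 3's or OTE's actual shader code:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  Normalize(const Vec3& v) {
    const float len = std::sqrt(Dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// Blinn-Phong specular term. A higher exponent gives a smaller, tighter
// highlight; a lower exponent gives a broad, dull one. Exposing this value
// per material is the kind of control being discussed here.
float SpecularTerm(Vec3 normal, Vec3 toLight, Vec3 toEye, float specularExponent) {
    const Vec3 half = Normalize({ toLight.x + toEye.x,
                                  toLight.y + toEye.y,
                                  toLight.z + toEye.z });
    const float nDotH = std::max(0.0f, Dot(Normalize(normal), half));
    return std::pow(nDotH, specularExponent);
}
```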

ghost commented 8 years ago

There is an interview with John Carmack where he speaks of his experiments with raytracing and OpenCL: https://www.youtube.com/watch?v=hapCuhAs1nA

I think for now I'll look into faking certain effects with material scripts. I've been experimenting with layering stages using blend and add to mix the outcome; however, in most cases the engine returns the result as a black render.

It would be useful if someone could write a material editor compatible with OTE, with a real-time preview; it would make things easier for artists. Currently I'm learning a new open-source application called AwesomeBump, which is a tool for creating PBR and traditional materials. Something like that for OTE, with script generation, would be perfect.

Another thing I'm looking into is using translucency to fake subsurface scattering; so far my results just look more transparent. I need to figure out how to get it to render with the depth buffer so the lighting comes from the back. I'm still learning as I go along.
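A common cheap way to get that light-from-the-back look is sketched below, assuming you can estimate a per-pixel thickness (for example, back-face depth minus front-face depth from a depth pass); the names and parameters are hypothetical, not existing OTE code:

```cpp
#include <algorithm>
#include <cmath>

// Cheap translucency / fake subsurface scattering term.
// thickness  : estimated distance light travels through the object
//              (e.g. back-face depth minus front-face depth).
// absorption : how quickly light is extinguished inside the material.
// backLight  : how strongly the light points towards the viewer through the surface.
float FakeTranslucency(float thickness, float absorption, float backLight) {
    // Beer-Lambert style falloff: thin parts (candle edges, ears) glow,
    // thick parts stay dark.
    const float transmittance = std::exp(-absorption * std::max(0.0f, thickness));
    return transmittance * std::max(0.0f, backLight);
}

// backLight is typically something like max(0, dot(viewDir, -lightDir + normal * distortion)),
// so the effect is strongest when looking roughly towards the light through the object.
```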

BielBdeLuna commented 8 years ago

Keep in mind that Doom 3 isn't a PBR renderer; it only does some of the lighting in real time and some gamma correction in the textures. Besides this, it still lacks a lot of stuff.

ghost commented 8 years ago

Agreed. However, I've found a workaround by testing black & white values in specular maps based on PBR charts; the results seem good, and it's now just a question of gloss/reflection. SSS and other shaders can be done through custom renderprogs. I've started a small documentation project that records the steps I'm taking to create different types of materials; most of it is currently rough notes, but it will be compiled at a later stage and uploaded to GitHub for other users of OTE.
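For anyone reproducing those tests, this is roughly what mapping PBR chart values onto a black & white specular map boils down to; the helper and the ballpark reflectance numbers below are only illustrative approximations of commonly published values, not OTE code:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Convert a linear reflectance value (F0, 0..1) into an 8-bit grey value
// for a hand-authored specular map. Assumes the map is sampled as linear data.
uint8_t F0ToGrey(float f0) {
    const float clamped = std::min(1.0f, std::max(0.0f, f0));
    return static_cast<uint8_t>(std::lround(clamped * 255.0f));
}

// Ballpark values commonly listed in PBR reference charts (approximate):
//   water / most plastics / wood : F0 ~= 0.02 - 0.05  ->  grey ~5 - 13
//   gemstones, crystal           : F0 ~= 0.05 - 0.08  ->  grey ~13 - 20
//   metals (iron, gold, silver)  : F0 ~= 0.5 - 1.0    ->  ~128 - 255 (really coloured, not grey)
```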

BielBdeLuna commented 8 years ago

great.

Gloss and reflection are the big thing now in all engines. The paradigm change is the idea that there is no diffuse versus specular split, that everything is just variation in specular, but the problem with this is that specular is expensive. Hell, even the portal sky in the old Doom 3 was expensive the way it was done, and that was just rendering a second point of view; imagine rendering as many points of view as there are surfaces in view, plus re-reflections. It goes insane quite quickly. Other engines are settling for abusing environment maps, and then either using real-time environment maps (which is quite expensive; this option is possible in Doom 3), or forgetting about the entities that reside in the map (so no reflections of people in the specular, quite the Dracula effect), or using stand-in shapes for people in the reflection (in a separate render, then added to the specular render target, just like in Remember Me), or some exotic render feature like raycast specular reflections in Crysis 3.
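The environment-map shortcut described above usually amounts to something like the following sketch: reflect the view direction and sample a blurrier pre-filtered cubemap mip as the surface gets rougher. This is a generic illustration under those assumptions, not existing Doom 3 or OTE code:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Reflect the incident view direction about the surface normal.
Vec3 Reflect(const Vec3& incident, const Vec3& normal) {
    const float d = 2.0f * Dot(incident, normal);
    return { incident.x - d * normal.x,
             incident.y - d * normal.y,
             incident.z - d * normal.z };
}

// Map a roughness value in [0,1] to a mip level of a pre-filtered environment
// cubemap: mip 0 is the sharp mirror-like reflection, higher mips are
// pre-blurred versions used for rough surfaces.
float RoughnessToMip(float roughness, int mipCount) {
    return roughness * static_cast<float>(mipCount - 1);
}

// The shader side would then do something along the lines of
//   textureLod(envCube, Reflect(viewDir, normal), RoughnessToMip(roughness, mips))
// which is the cheap stand-in for true per-surface reflections discussed above.
```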

ghost commented 8 years ago

reflectionprobe

I did something like this in the Blender game engine by using an extra camera to render what is behind the player, then sending the result to the gloss map texture. It caused some framerate drop depending on the number of objects in the background, since there was no occlusion culling and it rendered everything as far as the camera could see. We could maybe do something like that with a lower-resolution result, possibly also making use of OpenCL to handle the task.

BielBdeLuna commented 8 years ago

That would be a heavy-handed approach, since you're rendering the scene several times. I bet we could use screen-space reflections or some other method that doesn't involve rendering novel points of view, and leave OpenCL for smoke and other particles.
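For comparison with re-rendering the scene, here is a very rough sketch of the screen-space reflection idea: march the reflected ray through the depth buffer that is already there and reuse the colour already on screen when a hit is found. The buffer interface, stepping scheme, and names below are all simplifications, not engine code:

```cpp
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };

// Minimal interface over already-rendered data: the depth and colour of the
// current frame, addressed in screen space. How these are fetched is engine-specific.
struct ScreenBuffers {
    float (*depthAt)(float u, float v);   // linear depth stored at a screen position
    Vec3  (*colorAt)(float u, float v);   // shaded colour already on screen
};

// Screen-space reflection by fixed-step ray marching.
// u, v        : screen position of the reflecting pixel (0..1)
// originDepth : its linear depth
// stepU/V     : reflected ray direction projected into screen space, per step
// stepDepth   : how much the ray's depth changes per step
std::optional<Vec3> ScreenSpaceReflect(const ScreenBuffers& buf,
                                       float u, float v, float originDepth,
                                       float stepU, float stepV, float stepDepth,
                                       int maxSteps, float thickness) {
    float rayDepth = originDepth;
    for (int i = 0; i < maxSteps; ++i) {
        u += stepU;
        v += stepV;
        rayDepth += stepDepth;

        // Ray left the screen: no reflection information is available;
        // a real implementation would fall back to an environment map here.
        if (u < 0.0f || u > 1.0f || v < 0.0f || v > 1.0f) {
            return std::nullopt;
        }

        // The ray has gone just behind the surface stored in the depth buffer:
        // treat that as a hit and reuse the colour already rendered there.
        const float sceneDepth = buf.depthAt(u, v);
        if (rayDepth > sceneDepth && rayDepth < sceneDepth + thickness) {
            return buf.colorAt(u, v);
        }
    }
    return std::nullopt;  // no hit within the step budget
}
```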

BielBdeLuna commented 8 years ago

I wonder how fast voxels render. Maybe a voxelised version of the map could be rendered containing all the diffuse lighting information and its corresponding colour; then you could LOD those voxels, and maybe invert them so they could represent the reflections?
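A minimal sketch of what voxelising the diffuse lighting and LODing it could mean: store a colour per voxel and build coarser levels by averaging 2x2x2 blocks, much like building a mipmap chain. The layout and names are hypothetical:

```cpp
#include <cstddef>
#include <vector>

struct Color { float r = 0, g = 0, b = 0; };

// Dense voxel grid holding baked diffuse lighting. size must be a power of two.
struct VoxelGrid {
    int size;
    std::vector<Color> cells;   // size*size*size entries, x-major layout

    explicit VoxelGrid(int s) : size(s), cells(static_cast<size_t>(s) * s * s) {}

    Color& at(int x, int y, int z) {
        return cells[(static_cast<size_t>(z) * size + y) * size + x];
    }
    const Color& at(int x, int y, int z) const {
        return cells[(static_cast<size_t>(z) * size + y) * size + x];
    }
};

// Build the next-coarser LOD by averaging each 2x2x2 block of voxels.
// Repeating this gives a chain of grids that can stand in for increasingly
// blurry "reflections" of the scene's diffuse lighting.
VoxelGrid Downsample(const VoxelGrid& fine) {
    VoxelGrid coarse(fine.size / 2);
    for (int z = 0; z < coarse.size; ++z)
        for (int y = 0; y < coarse.size; ++y)
            for (int x = 0; x < coarse.size; ++x) {
                Color sum;
                for (int dz = 0; dz < 2; ++dz)
                    for (int dy = 0; dy < 2; ++dy)
                        for (int dx = 0; dx < 2; ++dx) {
                            const Color& c = fine.at(2*x + dx, 2*y + dy, 2*z + dz);
                            sum.r += c.r; sum.g += c.g; sum.b += c.b;
                        }
                coarse.at(x, y, z) = { sum.r / 8.0f, sum.g / 8.0f, sum.b / 8.0f };
            }
    return coarse;
}
```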

ghost commented 8 years ago

Voxels could work well, I think; maybe a combo of voxels and screen-space reflections. I've been playing around with Ken Silverman's Voxlap, which renders on the CPU. Here is a link to the source code: http://advsys.net/ken/voxlap.htm

I've played some voxel games recently; some engines had terrible performance on the GPU, but I've seen some per-pixel voxel demos running on CUDA at good speeds. A low-resolution output of the render should be sufficient for most surfaces with a higher roughness value, to cut down on the compute.