elm-community / webgl

Moved to elm-explorations/webgl
https://package.elm-lang.org/packages/elm-explorations/webgl/latest
BSD 3-Clause "New" or "Revised" License

[Feature request] Frame buffers #35

Closed Zinggi closed 5 years ago

Zinggi commented 7 years ago

It's currently impossible to perform certain rendering operations that are common in many 3D games. Many techniques require rendering to a texture (or to other kinds of framebuffers).

Examples include:

For the API, I propose a counterpart to toHtml and toHtmlWith, but instead of rendering to Html, it should render to a Texture. E.g.

renderToTextureWithSettings : RenderToTextureOptions -> List Renderable -> Texture

Where RenderToTextureOptions would be something like:

type alias RenderToTextureOptions =
  { width : Float
  , height : Float
  , settings : List Setting
  }

I didn't fully think that API through; it's just a suggestion.
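
For illustration, a hypothetical usage of the proposed function (sceneEntities is a made-up name for whatever entities should end up in the texture):

reflectionTexture : Texture
reflectionTexture =
    renderToTextureWithSettings
        { width = 256
        , height = 256
        , settings = []
        }
        sceneEntities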

w0rm commented 7 years ago

I guess it is quite possible to do it like this. From the runtime side, renderToTextureWithSettings can return a thunk that will be evaluated with the WebGLContext when the uniform is initialized.

w0rm commented 7 years ago

I have a (kinda) working POC here: https://github.com/elm-community/webgl/tree/render-to-texture

koenusz commented 7 years ago

I think a nice addition to this feature would be to add a function to the Texture to get the color at a position. This way the texture can be used for picking functionality.

w0rm commented 7 years ago

@koenusz render to texture won't create an image that can be used for picking. It will only encapsulate the instructions for how to render to the texture. We need the gl context, and we can only access it when the virtual dom node is rendered.

shamansir commented 6 years ago

It would also be nice to be able to do things like in this article https://webglfundamentals.org/webgl/lessons/webgl-render-to-texture.html, for making different kinds of reflections and so on. Just saying.

guydunton commented 6 years ago

Has there been any progress on this feature? I wanted to port some C++ code (demo here) to Elm, but it requires render to texture to implement deferred shading. Is there any way to help this along?

w0rm commented 6 years ago

@gdunton no progress so far apart from https://github.com/elm-community/webgl/tree/render-to-texture branch.

The API there looks nice and simple, because it is similar to the toHtml function.

fromEntities : Options -> ( Int, Int ) -> List WebGL.Entity -> Result Error Texture
fromEntities { magnify, minify, horizontalWrap, verticalWrap, flipY } ( width, height ) entities =

However there are problems with this solution:

  1. In many cases we need to render to a texture on every frame. The current draft implementation would create a new texture and a new frame buffer each time, which is really slow to do on every frame. Ideally we need a way to reuse the frame buffers, but this is hard with the declarative API.
  2. The data is cached forever, which means that dynamically creating textures keeps growing the memory.

What we need is to invent some kind of cache control for the declarative API.

guydunton commented 6 years ago

@w0rm That's unfortunate but makes sense. Having thought about it for a bit, I can't think of a place to cache the frame buffers that doesn't limit what can be done with the API.

It seems that the programmer needs more control over buffers and could be required to store the texture in their model to achieve this, although that isn't particularly clean. Sorry if this isn't very helpful.

w0rm commented 6 years ago

@gdunton There is no way to cache buffers in either textures or entities, because a buffer is bound to the gl context which is stored in the virtual dom node. The same texture may be sent to different canvases.

The way it currently works is each texture gets a new unique id that is used to store the texture in the cache that is attached to the virtual dom node. The texture is cached once it is accessed from the uniforms and it is cached forever.

With rendering to textures we need to reuse the frame buffer between multiple textures. But we need to do it in a clever way. Some of the buffers may need to be reused on each render, and others may need to be cached forever.

What would be the way to control the cache? How can we tell if a particular frame buffer may be reused? It should probably have the same dimensions and texture options. What should we do with buffers from multiple recursive textures? E.g. when a scene and a texture require another rendered texture. In this case we would need two buffers.

guydunton commented 6 years ago

@w0rm I'm not sure whether this would suit all needs, but as the simplest, hacky solution you could alter Options to

type alias Options =
    { magnify : Resize Bigger
    , minify : Resize Smaller
    , horizontalWrap : Wrap
    , verticalWrap : Wrap
    , flipY : Bool
    , bufferName : Maybe String
    }

defaultOptions could provide Nothing, while a user could name the buffer. This would let you cache the buffer in a map in the virtual dom node; that way the user could choose whether or not to render to a specific buffer. I've been trying to think of a slightly more elegant solution than strings, but I'm drawing a blank.
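
As a rough, hypothetical sketch of the idea, combined with the fromEntities draft from the branch (blurEntities is a made-up name):

blurTexture : Result Error Texture
blurTexture =
    fromEntities
        { defaultOptions | bufferName = Just "blur-pass" }
        ( 256, 256 )
        blurEntities

Calling this on every frame with the same name would then reuse the "blur-pass" buffer instead of allocating a new one.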

w0rm commented 6 years ago

@gdunton I thought about providing keys for the cache. But how would this work if the same buffer name is used for two textures that have different sizes?

w0rm commented 6 years ago

I had another idea: what if we could do something like this:

frameBuffer : Options -> ( Int, Int ) -> Result Error FrameBuffer

fromEntities : FrameBuffer -> List Entity -> Texture

Each frame buffer gets its own id and is cached. So it would be possible to create it separately outside the view function, or store it in the model, and reuse it for rendering, as sketched below.
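
A minimal usage sketch under these assumed signatures (quadEntity, cubeEntities, the sizes, and the usual Html/Html.Attributes imports are illustrative; defaultOptions refers to the usual texture options):

sceneBuffer : Result Error FrameBuffer
sceneBuffer =
    frameBuffer defaultOptions ( 512, 512 )

view : Model -> Html msg
view model =
    case sceneBuffer of
        Err _ ->
            Html.text "could not create the frame buffer"

        Ok buffer ->
            WebGL.toHtml
                [ width 400, height 400 ]
                [ quadEntity (fromEntities buffer (cubeEntities model)) ]

Because sceneBuffer is defined outside the view, the same cached frame buffer is reused on every frame.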

One problem I see in this solution is when the same frame buffer is used multiple times in the rendering tree. I guess we should store several of them under the same cache id and then reuse them.

Another problem is supporting only-render-once cases, when we just need to render to a texture once and cache it forever without re-evaluating. I guess in this case we should give an id to the texture and store it for the frame buffer, so that the frame buffer can check whether the id of the last rendered texture matches the one it is given; if it does, it should not do anything.

guydunton commented 6 years ago

I like the idea of a new type for the FrameBuffer and following from that I have been toying with this:

type alias BufferedTexture =
    { buffer : FrameBuffer
    , texture : Texture
    }

-- API sketch only: the function bodies below are placeholders.

renderIntoBuffer : FrameBuffer -> List Entity -> BufferedTexture
renderIntoBuffer buffer entities =
    BufferedTexture buffer 3

overwriteBufferedTexture : BufferedTexture -> List Entity -> BufferedTexture
overwriteBufferedTexture bufferedTexture entities =
    bufferedTexture

deconstructBufferedTexture : BufferedTexture -> ( FrameBuffer, Texture )
deconstructBufferedTexture bufferedTexture =
    ( bufferedTexture.buffer, bufferedTexture.texture )

The names aren't very good but this would allow the programmer to decide how they wanted to render into the buffers. The BufferedTexture could be deconstructed to allow the texture to be stored (perhaps in the model) while the buffer could be reused. If the program required the buffer to be rendered into each frame and the old texture wasn't required it could be overwritten.

This still doesn't solve the buffer being used multiple times in the rendering tree however.

w0rm commented 6 years ago

@gdunton I'm not sure why these things are needed.

fromEntities produces a Texture with a certain id (under the hood it just stores the instructions needed to evaluate the texture). This texture is rendered onto the corresponding frame buffer when evaluated. That frame buffer should be defined outside the view function and cached, just like a mesh. If a user calls fromEntities with the same buffer multiple times, then the texture inside this buffer will be overwritten on every frame and the buffer will be reused, which is just what we want!

Now, if we want to render only once, we remember the id of the last texture that was rendered into a given frame buffer. If that id is the same, we should not replace the contents of the frame buffer. So, in order to preserve a texture, a user has to store it somewhere in the model or define it as a constant at the top level.
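
For example, a texture that should only be rendered once could be a top-level constant (a hypothetical sketch; backgroundBuffer stands for an already unwrapped FrameBuffer and backgroundEntities for its content):

backgroundTexture : Texture
backgroundTexture =
    fromEntities backgroundBuffer backgroundEntities

Since the constant keeps the same texture id between frames, the frame buffer contents are not replaced.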

guydunton commented 6 years ago

@w0rm That all makes sense to me. I was suggesting those new functions to give the user more explicit control over buffers and textures, but it makes less sense to add them if they aren't needed.

w0rm commented 6 years ago

@gdunton I think I managed to make it work in the https://github.com/elm-community/webgl/tree/render-to-texture branch. It would be good if more people checked that this actually works. You can clone this branch and write code in the examples folder. The current example in render-to-texture.elm just renders a simple gradient onto the texture, and then renders it again on the rotating rectangle.

w0rm commented 6 years ago

I modified the example to render the crate scene onto the texture and then render the result texture onto the rotating square. Seems to be working fine! And the frame buffer is properly cached.

[screencast]

ajlende commented 6 years ago

I've been waiting for the ability to do multi-pass stuff for a while now—thanks for working on this! I did a quick gaussian blur example. It's mostly working—I haven't tried to get the reflection fixed yet.

[gaussian blur example animation]

w0rm commented 6 years ago

@ajlende there is a problem: gl.clear doesn't seem to clear the stencil buffer in the frame buffer, even though it should here: https://github.com/ajlende/webgl/blob/render-to-texture/src/Native/WebGL.js#L380

guydunton commented 6 years ago

@w0rm I've been playing with this in an example and it's exciting. However, I think I've discovered a bug regarding passing multiple textures into a shader.

Here: https://github.com/gdunton/webgl/blob/7d58ce9d42f54b3fcd05b0c39ab3b2a4404e9dba/examples/deferred-shading.elm#L307 & here: https://github.com/gdunton/webgl/blob/7d58ce9d42f54b3fcd05b0c39ab3b2a4404e9dba/examples/deferred-shading.elm#L470

I'm not exactly sure what's happening but swapping around the textures in the record causes the other texture to be all white.

w0rm commented 6 years ago

@gdunton thanks for trying this! Could you reduce the example to a minimum?

In any case please keep the code, I will try to fix the issue.

guydunton commented 6 years ago

@w0rm I've created an example showing the error with multiple textures being passed into a shader to combine them. The color on screen should be yellow because the combined shader adds together a red texture and a green texture, but the output is green.

If the order of the parameters into the combine shader is swapped, then the output is always the second texture.

https://github.com/gdunton/webgl/blob/render-to-texture/examples/combine-buffers.elm

I've also tried passing multiple textures into a single shader and this works correctly.
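
For context, the combine step just adds the two input textures. A hedged sketch of what such a fragment shader could look like (uniform and varying names are illustrative, not the exact code from the example; assumes the WebGL, WebGL.Texture, and Math.Vector2 imports):

combineFragment : Shader {} { u | textureA : Texture, textureB : Texture } { vcoord : Vec2 }
combineFragment =
    [glsl|
        precision mediump float;
        uniform sampler2D textureA;
        uniform sampler2D textureB;
        varying vec2 vcoord;

        void main () {
            // red + green should give yellow
            gl_FragColor = texture2D(textureA, vcoord) + texture2D(textureB, vcoord);
        }
    |]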

w0rm commented 6 years ago

@gdunton thanks! I pushed the fix. I made a stupid JavaScript mistake.

Unfortunately programming in JavaScript after work is not as easy as Elm!

w0rm commented 6 years ago

@gdunton I wonder if this new API looks intuitive and allows you to implement your demo.

guydunton commented 6 years ago

@w0rm I've been working on it and I'm enjoying the API, especially because the Maybe FrameBuffer can be separated from its use; not having to deal with Maybes in the pipeline is helpful. I'm just having some trouble porting to WebGL because I can't use signed image data; that's more of a WebGL 2 thing though, I think. I'm pretty sure I'll be able to do it with the current API, but I'm having to remember a lot of maths that I'd forgotten!

w0rm commented 6 years ago

@gdunton the reason frameBuffer returns a Result is that we're using the same Options as for loading a Texture from a file. It fails when the dimensions are not a power of two and mipmapping is enabled.
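
As a hedged example, options without mipmapping (like nonPowerOfTwoOptions from the Texture module, assuming they apply to frame buffers as well) should allow arbitrary dimensions:

hudBuffer : Result Error FrameBuffer
hudBuffer =
    frameBuffer nonPowerOfTwoOptions ( 300, 200 )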

w0rm commented 6 years ago

@ajlende I had the same issue with the stencil buffer when rendering this scene with a headless gl. The issue can be mitigated by clearing the stencil buffer. You just render the same floor with settings that don't put it on the screen, but clear the stencil bits.

Appending this at the end of the entities list does the trick: it always fails the stencil test and zeroes the stencil buffer on failure:

    , WebGL.entityWith
        [ StencilTest.test
            { ref = 1
            , mask = 1
            , test = StencilTest.never -- the stencil test never passes
            , fail = StencilTest.zero -- on failure, write zero to the stencil buffer
            , zfail = StencilTest.keep
            , zpass = StencilTest.keep
            , writeMask = 0xFF -- allow writes to all stencil bits
            }
        ]
        floorVertex
        floorFragment
        floorMesh
        { texture = texture
        , perspective = camera
        }

The screencast below renders the crate scene on the sides of the rotating cube:

[screencast]

cc @shamansir this looks like the demo from your link ^

ajlende commented 6 years ago

@w0rm I was also able to get it to work by adding gl.stencilMask(0xFF) right before the gl.clear call that you pointed out, because apparently the stencil mask needs to be enabled in order to clear the stencil buffer.

References: "How to clear the stencil buffer", "Depth test & stencils - Planar reflections"

w0rm commented 6 years ago

@ajlende oh, I see, I didn't know that it depends on the mask. We probably need to restore the mask value when we remove the stencil setting: https://github.com/elm-community/webgl/blob/master/src/Native/WebGL.js#L126

w0rm commented 6 years ago

@ajlende I pushed a proper fix to the same branch. I figured out how to get the default value for the stencil mask, because different platforms may have a different number of stencil bits. https://github.com/elm-community/webgl/commit/245d2fd557cadd40ae01da5819d487dfdc12ba05

Apparently there was the same issue with the depth mask. I fixed it too.

w0rm commented 6 years ago

To test that multiple steps work, I rendered the crate scene onto a texture, then rendered a cube with that texture onto another texture, and then rendered a cube with the final texture. Works fine!

[screenshot]

guydunton commented 6 years ago

@w0rm I've worked some more on deferred rendering and I've managed to get something working. The code is a mess but it seems to work well.

[screencast]

There are some hoops that need to be jumped through, but these are limitations of WebGL 1.0 rather than of the Elm implementation.

w0rm commented 5 years ago

This feature was re-requested in the new repo: https://github.com/elm-explorations/webgl/issues/6