floe opened this issue 1 year ago
Hi, rendering to a canvas would require a second WebGL context (i.e. a 2nd `THREE.WebGLRenderer`).
One of the things I deliberately tried to do with this set of components was to avoid the need for multiple WebGL contexts, as described here: https://diarmidmackenzie.github.io/aframe-multi-camera/#single-vs-multiple-webgl-contexts
In your case, it sounds as though you actively want an additional canvas, and hence you'll need an additional WebGL context, since each `THREE.WebGLRenderer` targets exactly one `canvas` element.
In that case, I think you would be better off using the code from jgbarah's camrender.js, rather than trying to adapt these components?
Is there a reason that doesn't work for you?
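For reference, here's a rough sketch of that separate-renderer approach. It's illustrative only, in the spirit of camrender.js rather than the actual camrender.js code; the component name, canvas selector, and sizes are assumptions.

```js
// Illustrative sketch of the separate-renderer approach, in the spirit of
// jgbarah's camrender.js (NOT the actual camrender.js code).
// The component name, canvas selector, and size handling are assumptions.
AFRAME.registerComponent('canvas-render-sketch', {
  schema: {
    canvas: { type: 'selector', default: '#capture-canvas' }
  },

  init: function () {
    // Each THREE.WebGLRenderer targets exactly one canvas element, so this
    // creates a second WebGL context alongside the one A-Frame already owns.
    this.renderer = new THREE.WebGLRenderer({
      canvas: this.data.canvas,
      antialias: true
    });
    this.renderer.setSize(this.data.canvas.width, this.data.canvas.height, false);
  },

  tick: function () {
    // Render the scene from this entity's camera into the second canvas.
    const camera = this.el.getObject3D('camera');
    if (camera) {
      this.renderer.render(this.el.sceneEl.object3D, camera);
    }
  }
});
```

You'd attach something like that to an entity that also has a (non-active) `camera` component, and point `canvas` at a `<canvas>` element that lives outside `a-assets`.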
I've fiddled around a bit more and found that the canvas element actually gets rendered to (if I move it out of `a-assets` and make it visible as a standalone element, it shows the second camera view). But the `MediaStream` I get from `canvas.captureStream()` still shows a blank element. So I'll try the approach from `camrender.js` next. Thanks for your quick response!
Update: yes, it works with `camrender.js`, with the caveat that the canvas needs to be initialized before `captureStream()` works (either through `THREE.WebGLRenderer`, or through `canvas.getContext("webgl")`).
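To make that caveat concrete, a minimal sketch of the order of operations (the canvas id and frame rate here are assumptions):

```js
// Minimal sketch of the captureStream() caveat above.
// The canvas id and capture frame rate are assumptions.
const canvas = document.querySelector('#capture-canvas');

// Until the canvas has a rendering context, captureStream() produces blank
// frames -- so create the context first, either directly...
canvas.getContext('webgl');
// ...or implicitly, by constructing a renderer for this canvas:
// new THREE.WebGLRenderer({ canvas });

// Now the stream picks up whatever subsequently gets rendered into the canvas.
const stream = canvas.captureStream(30); // 30 fps
const video = document.createElement('video');
video.muted = true;
video.srcObject = stream;
video.play();
```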
> (if I move it out of `a-assets` and make it visible as a standalone element, it shows the second camera view). But the `MediaStream` I get from `canvas.captureStream()` still shows a blank element
When you do this, I think it is not actually rendering to the 2nd canvas. Rather, it is rendering to a section of the original canvas that is defined by the boundary of the 2nd canvas.
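For anyone following along, this is roughly what single-context rendering into a region of the main canvas looks like in three.js. It's illustrative only, not necessarily the exact code these components use; the entity id and region coordinates are assumptions.

```js
// Illustrative only: drawing a second camera view into a sub-region of the
// *same* canvas, using the single renderer / WebGL context that A-Frame
// already owns. Not necessarily the exact code these components use.
// The entity id and region coordinates are assumptions.
const sceneEl = document.querySelector('a-scene');
const renderer = sceneEl.renderer;
const secondCamera = document.querySelector('#second-cam').getObject3D('camera');

// Restrict rendering to a 320x240 region of the main canvas, then draw the
// second camera's view into it.
renderer.setScissorTest(true);
renderer.setViewport(10, 10, 320, 240);
renderer.setScissor(10, 10, 320, 240);
renderer.render(sceneEl.object3D, secondCamera);
renderer.setScissorTest(false);
```

In practice something like this has to run every frame, alongside the main render; nothing is ever drawn into the detached canvas's own context, which is why its captured stream stays blank.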
Just tested your component with my a-frame 1.4.1 scene and it works like a charm, kudos!
However, for my somewhat esoteric use case, I'd like to render the output of the second camera to a `canvas` element (just like in https://jgbarah.github.io/aframe-playground/camrender-01/ ). Unfortunately, that doesn't seem to do anything. I verified that aframe-multi-camera itself works, using an extra plane, but I need something that I can use to create a `MediaStream` object from, and that has to be a `canvas`.

My setup:
When I change the second camera options to `output:plane; outputElement:#testplane; sequence:before`, I get the expected result rendered to the plane, but with the code above, the canvas stays unchanged. Any ideas about how to fix this? Thanks!