weepy closed this issue 4 years ago
Technically it is possible, but you need to implement some native code for it. There is a "TextureLoader" system in place, so you just need to implement a TextureLoader that supports video. It can be done in an external library (it's not to be solved in react-native-webgl itself, but react-native-webgl should already provide the tools to allow it). For instance, I was able to write https://github.com/gre/react-native-webgl-view-shot this way. I was planning to experiment around this, but I can't give you an ETA, so feel free to try on your own :)
I'm not yet sure how this could work, but in my mind the best approach would be to interop with react-native-video (pass a ref of a Video element to the loadTexture mechanism), because you would be able to control the video with react-native-video while injecting it as a texture. Maybe there is a simpler approach, but usually you need more advanced control.
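To make that concrete, here is a rough sketch of what the JS side could look like, assuming an entry point shaped like react-native-webgl's existing `loadTexture`; the `type: "video"` loader and its option names are hypothetical, and a native TextureLoader would have to be written to back them:

```javascript
// Hypothetical JS-side usage of a video TextureLoader.
// NOTE: a "video" loader is not implemented in react-native-webgl today;
// the option names here are guesses, not the library's actual API.
function loadVideoTexture(gl, source) {
  const rngl = gl.getExtension("RN"); // react-native-webgl's extension object
  return rngl
    .loadTexture({ type: "video", source }) // hypothetical loader options
    .then(({ texture, width, height }) => ({ texture, width, height }));
}
```

The promise shape mirrors how the existing loaders resolve with a texture plus its dimensions.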
That's good to hear. I'm building a video mashup app, so right now I'm focusing simply on synchronisation and playback, but I would love to add some filters in the future. I'll probably need a bit more control than react-native-video provides (e.g. I need to be able to switch sources as fast as possible).
If you need to be able to switch sources, maybe it would still be possible to have them all loaded in many <Video/> elements, just paused, and then play and inject only the one you need?
I mean, on the web that's what you would do: create HTMLVideoElement instances (offscreen) and inject them via texImage2D.
It kinda feels a bit hacky, but in the React paradigm you could render them all (hidden somewhere ^^) and access their refs both to control them AND to inject them into a react-native-webgl lib extension (assuming we can then figure out how to get the video source from the ref).
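On the web side, the texImage2D pattern is roughly this (a minimal sketch; `gl` is an existing WebGL context and `video` is a playing, possibly hidden, HTMLVideoElement):

```javascript
// Upload the current frame of an (offscreen) video element into a GL texture.
// Call once per animation frame while the video is playing.
function uploadVideoFrame(gl, texture, video) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  // texImage2D accepts an HTMLVideoElement directly as the pixel source,
  // sampling whatever frame the element is currently displaying.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, video);
}
```

You would drive this from requestAnimationFrame so the texture tracks playback.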
Heya - so we're making our own native player with a React library. We found that layering multiple AVPlayers and using prerolling gave us the synchronisation we need. The best approach was to seek ASAP (i.e. on touch), then preroll before calling .play().
We'd like to be able to render these AVPlayers through a GL filter to combine them.
If you're interested to see what we're up to, here's a demo vid => https://www.youtube.com/watch?v=awzIIU49OjE
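The touch-seek / preroll / play sequencing above could be sketched like this (purely illustrative; `player` stands for a hypothetical JS bridge whose `seek` and `preroll` mirror AVPlayer's `seekToTime:` and `prerollAtRate:` and resolve when their completion handlers fire):

```javascript
// Hypothetical player bridge: seek as early as possible (on touch),
// preroll while the UI settles, then start playback without a stall.
async function prepareAndPlay(player, timeSeconds) {
  await player.seek(timeSeconds); // kick off decoding at the target time ASAP
  await player.preroll(1.0);      // buffer enough media for playback at rate 1.0
  player.play();                  // can now start immediately and stay in sync
}
```

Running several of these in parallel, one per layered player, is what keeps the layers aligned.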
That's neat! Yeah, it makes sense to have them prerolled. We'll need to figure out how to have an AVPlayer as an input in react-native-webgl.
That would be awesome. Just so I know: is it possible to have more than one video texture, so we can combine them?
Yes, multiple textures are totally possible, at least from the GL side. The limit depends on the hardware AFAIK, but it's usually around 16 (MAX_TEXTURE_IMAGE_UNITS).
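For reference, that limit can be queried at runtime (a small sketch; the OpenGL ES 2.0 spec guarantees at least 8 fragment-shader texture units, and most mobile GPUs report 16 or more):

```javascript
// How many textures a single fragment shader can sample at once.
function maxTextureUnits(gl) {
  return gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS);
}
```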
16 is plenty for our purposes.
Just looking at the snapshot code: I see you're using a timer in https://github.com/gre/react-native-webgl-view-shot/blob/master/ios/RNWebGLTextureView.m
Do you know if that's linked to a real frame refresh (i.e. requestAnimationFrame)?
It's just for the view-shot case that I had to use a timer, because I didn't find a way to "observe" that a view has changed (in continuous snapshotting mode). In the Android implementation, I was able to do this: https://github.com/gre/react-native-webgl-view-shot/blob/master/android/src/main/java/fr/greweb/rnwebglviewshot/RNWebGLTextureView.java#L58-L59
For video, there are ways to hook into frame events.
@weepy Did you manage to implement this? I'm trying to do similar work but can't get it working.
We created our own video component. It was just easier.
Is it possible to use an AVPlayer to provide a dynamic texture with this?