Closed axistudio closed 7 months ago
yeah, that would be cool.
But I don't know how to make this work in a performant way. It might be possible to get a Spout/Syphon texture into an image texture, but it would be very slow because the data has to be transferred from the GPU to the CPU and then back to the GPU again.
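As a rough back-of-the-envelope sketch of why that round trip is costly (assuming an illustrative 1920×1080 RGBA8 frame at 60 fps — these numbers are not measurements from any specific setup):

```python
# Estimate the data volume of the GPU -> CPU -> GPU round trip
# for a shared texture. Illustrative assumption: 1080p RGBA8 at 60 fps.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4   # RGBA, 8 bits per channel
FPS = 60

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL  # one frame
per_second = frame_bytes * FPS                  # one direction, per second
round_trip = per_second * 2                     # GPU -> CPU and back

print(f"frame:      {frame_bytes / 2**20:.1f} MiB")     # ~7.9 MiB
print(f"one-way:    {per_second / 2**30:.2f} GiB/s")    # ~0.46 GiB/s
print(f"round trip: {round_trip / 2**30:.2f} GiB/s")    # ~0.93 GiB/s
```

Roughly a gigabyte per second of copying before the GPU can even use the pixels — and every frame also stalls on the synchronous readback, which is why a GPU-side texture-sharing input node is the better path.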
I think it would require a new kind of input node that is beyond my capabilities.
This is now implemented for Syphon and NDI.
Hi, I am working on a project where I want to share my TouchDesigner visual content as a texture rendered back into Blender's shader editor, so that my generative content can be used as an image texture in a material for objects in Blender.