johnboiles / coremediaio-dal-minimal-example

Intended to be the most minimalistic example of a macOS CoreMediaIO DAL plugin.
MIT License

Show animated pixelbuffer #9

Closed seanchas116 closed 4 years ago

seanchas116 commented 4 years ago

I replaced the static image pixel buffer with a dynamically animated pixel buffer, so you can check whether frames are presented without flicker. (I wonder if the static image and the dynamic animation should be togglable.)
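For anyone reading along, the general idea can be sketched like this in Swift: lock the pixel buffer, wrap its base address in a CGContext, and draw something that depends on the frame time. This is a hedged illustration, not code from this PR; the function name and the moving-circle drawing are made up (the plugin itself is Objective-C++):

```swift
import CoreGraphics
import CoreVideo
import Foundation

// Illustrative sketch: produce one animated frame as a CVPixelBuffer.
// `makeFrame` and the circle animation are hypothetical, not this repo's code.
func makeFrame(width: Int, height: Int, time: CFTimeInterval) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true,
    ]
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32ARGB,
                        attrs as CFDictionary, &pixelBuffer)
    guard let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    guard let context = CGContext(
        data: CVPixelBufferGetBaseAddress(buffer),
        width: width, height: height,
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue
    ) else { return nil }

    // Black background, then a white circle whose x position follows sin(time),
    // so consecutive frames differ and flicker/tearing becomes visible.
    context.setFillColor(CGColor(red: 0, green: 0, blue: 0, alpha: 1))
    context.fill(CGRect(x: 0, y: 0, width: width, height: height))
    let x = (sin(time) * 0.5 + 0.5) * Double(width)
    context.setFillColor(CGColor(red: 1, green: 1, blue: 1, alpha: 1))
    context.fillEllipse(in: CGRect(x: x - 20, y: Double(height) / 2 - 20,
                                   width: 40, height: 40))
    return buffer
}
```

Redrawing into a fresh buffer each tick keeps the sketch simple; a real plugin would likely reuse buffers from a CVPixelBufferPool.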

It looks like this (the GIF is from SimpleDALPlugin, but the animation is the same as in this pull request):

[GIF of the animation]

johnboiles commented 4 years ago

You're the best @seanchas116 this is also something I wanted to do but hadn't gotten time yet. Thank you!!

For completeness' sake, could you also remove the PNG?

johnboiles commented 4 years ago

Thank you again!

Since you seem like a skilled macOS dev, I wonder if it would be possible to render an NSView or CALayer to the CGContext, to be able to draw more complex things from code.

Then I wonder if you could maybe use CAMediaTimingFunction to drive CALayer animations with the timing of the frames.

I might look into this at some point. There's some good conversation going on in https://github.com/johnboiles/obs-mac-virtualcam/issues/34 about what is shown from OBS when the virtual camera is off. It would be super cool to animate some part of that.

Do you see any reason this wouldn't work?

seanchas116 commented 4 years ago

I haven't used it yet but maybe you could use CARenderer for that purpose?

According to this gist, CARenderer can render CALayers into an MTLTexture with a custom frame time. An MTLTexture can be converted to a CVPixelBuffer, so you could probably render CALayers into a CVPixelBuffer.
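As a rough illustration of that approach: `CARenderer(mtlTexture:options:)` and the begin/render/end frame calls are real Core Animation API on macOS 10.13+, but the wiring below is an untested assumption, not code from the gist:

```swift
import Metal
import QuartzCore

// Untested sketch: render a CALayer tree into an MTLTexture at an
// explicit frame time, rather than on Core Animation's own clock.
func render(layer: CALayer, into texture: MTLTexture, at time: CFTimeInterval) {
    let renderer = CARenderer(mtlTexture: texture, options: nil)
    renderer.layer = layer
    renderer.bounds = CGRect(x: 0, y: 0,
                             width: texture.width, height: texture.height)
    renderer.beginFrame(atTime: time, timeStamp: nil)
    renderer.addUpdate(renderer.bounds)   // mark the whole bounds dirty
    renderer.render()
    renderer.endFrame()
}
```

To get from there to a CVPixelBuffer, one common route is to create the MTLTexture from an IOSurface-backed pixel buffer via CVMetalTextureCache, so the rendered pixels land in the buffer without a copy.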

(I usually develop iOS apps and I know little about macOS specific things so the reply might be inaccurate 😓)

johnboiles commented 4 years ago

Neat, thanks for the pointers! I also have more experience with iOS than macOS, but I think the Core Animation stuff is mostly the same, so I bet what you're saying will work. It's not high on my priority list right now, but I look forward to trying this out!

gerwindehaan commented 4 years ago

Thanks for this!

I would be inclined to keep any more elaborate drawing code (including config like frame rate / size) out of this plugin itself, and instead expose a set of protocols for feeding the plugin with pixel buffers. Ideally this interface would be consumed by external helper code, or even external apps via macOS XPC or websockets. I'm not sure yet about performance or XPC accessibility from these plugins, but I'd imagine the external apps could use anything that produces pixel buffers, like SwiftUI, CARenderer, or Metal framebuffers. I am mostly interested in using SceneKit and Core Image.
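A minimal version of that protocol idea might look like this in Swift (the names are hypothetical, just to make the shape concrete):

```swift
import CoreGraphics
import CoreVideo

// Hypothetical interface: the plugin pulls frames from any source that can
// produce pixel buffers (CARenderer, SceneKit, an XPC client, ...), without
// knowing how they are drawn.
protocol FrameSource {
    var frameSize: CGSize { get }    // pixel dimensions of produced frames
    var frameRate: Double { get }    // frames per second the source targets
    func copyNextFrame(at time: CFTimeInterval) -> CVPixelBuffer?
}
```

The plugin would then only need a timer and a `FrameSource`, and everything elaborate stays on the other side of the interface.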

We are using XPC and WebSockets in our macOS server app today. If I find the time, I will continue such experiments in Swift, using https://github.com/seanchas116/SimpleDALPlugin as a basis.

johnboiles commented 4 years ago

Take a look at https://github.com/johnboiles/obs-mac-virtualcam or the original Apple sample code it is based on. Those use Mach messaging to pass frame buffers from a separate app, and it works pretty well, though the code is pretty messy.
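For a flavor of what the receiving end of such IPC looks like, here is a CFMessagePort listener. Note this is a simplified stand-in, not the actual Mach messaging code from obs-mac-virtualcam, and the port name and frame encoding are made up:

```swift
import Foundation

// Simplified stand-in for the plugin side of frame IPC: a named local
// message port that another process can send encoded frames to.
// Port name and payload format are hypothetical.
let callback: CFMessagePortCallBack = { _, _, data, _ in
    if let data = data {
        // `data` would carry one encoded frame
        // (e.g. width + height + pixel bytes).
        _ = CFDataGetLength(data)
    }
    return nil  // no reply expected
}

let port = CFMessagePortCreateLocal(
    kCFAllocatorDefault,
    "com.example.virtualcam.frames" as CFString,  // hypothetical name
    callback, nil, nil)

if let port = port {
    let source = CFMessagePortCreateRunLoopSource(kCFAllocatorDefault, port, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .defaultMode)
}
```

Copying whole frames through a message port is not the fastest path; the Mach approach can pass IOSurface or shared-memory handles instead, which is presumably why those projects use it.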

Agreed, we should keep anything elaborate out of this plugin. It's the minimal example! Though it would be nice to have more complex examples in other repos.

I'm going to try to do some more complex animation potentially once I port this code back into https://github.com/johnboiles/obs-mac-virtualcam so that when OBS is closed you have some animated sample frame that indicates the plugin stream is working and ready to receive frames.