[Open] maxencedouet opened this issue 4 years ago
I have a similar question. I know we have to use some IPC to do it, but my question is: how will XPC deliver the frames received from the app to the plugin?
What I ended up doing was capturing an NSWindow from inside the plugin, at an interval. That's because my application needed to send the contents of an NSWindow opened on the desktop by an accompanying macOS app. Note that the window being captured by the plugin need not be visible, so you could do all your composition in this "hidden" NSWindow and the plugin would still be able to capture its bitmap.
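For anyone curious, here is a rough sketch of that kind of interval capture. The 30 fps timer and the use of `CGWindowListCreateImage` with the window's `windowNumber` are my assumptions about one way to do it, not necessarily what the plugin above actually does:

```swift
import AppKit
import CoreGraphics

/// Minimal sketch: grab a window's bitmap at a fixed interval.
/// `windowNumber` would come from the NSWindow you want to mirror
/// (window.windowNumber); the 30 fps interval is just an example.
final class WindowGrabber {
    private var timer: Timer?
    private let windowID: CGWindowID

    init(windowNumber: Int) {
        self.windowID = CGWindowID(windowNumber)
    }

    func start(onFrame: @escaping (CGImage) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            // Ask the window server for the window's current contents.
            if let image = CGWindowListCreateImage(.null,
                                                   .optionIncludingWindow,
                                                   self.windowID,
                                                   [.boundsIgnoreFraming]) {
                onFrame(image)
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```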
@sivrish: Maybe this will be helpful: https://github.com/PhilippMatthes/neural-greenscreen
@Raj123456788 Thanks, but I want to implement something like Snapchat did with their Snap Camera. They used XPC to communicate between the app and the plugin, but I'm confused about how to create a connection that way.
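For reference, a minimal sketch of what such a connection could look like on the plugin side, assuming the companion app exposes a Mach XPC service. The service name "com.example.vcam.frames", the `FrameProviding` protocol, and the pull-based design are placeholders of mine, not Snap Camera's actual API:

```swift
import Foundation

// Hypothetical protocol shared between the companion app and the plugin.
@objc protocol FrameProviding {
    func requestFrame(reply: @escaping (Data?, Int, Int) -> Void)
}

final class FrameClient {
    private let connection: NSXPCConnection

    init() {
        // The companion app (or a launch agent it installs) must expose this
        // Mach service; a plugin loaded into a sandboxed host may also need
        // the right entitlements to reach it.
        connection = NSXPCConnection(machServiceName: "com.example.vcam.frames", options: [])
        connection.remoteObjectInterface = NSXPCInterface(with: FrameProviding.self)
        connection.resume()
    }

    func fetchFrame(_ handler: @escaping (Data?, Int, Int) -> Void) {
        let proxy = connection.remoteObjectProxyWithErrorHandler { _ in
            handler(nil, 0, 0) // connection dropped or service unavailable
        } as? FrameProviding
        proxy?.requestFrame(reply: handler)
    }
}
```

A push model (the app sending frames to the plugin) is also possible, but the pull model above keeps the plugin side simple: it just asks for the latest frame whenever the host requests one.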
@sivrish: Is there a repository where I can look at your code? I was trying the same thing but never made any headway. See if this helps: https://github.com/zakk4223/CSVirtualCamera/issues/3
@Raj123456788 Have you implemented the vcam successfully?
Hi @sivrish : I followed https://github.com/knightbenax/Cobalt and it works.
That's great news! @Raj123456788 Can you DM me on Twitter (@sivrish)? I have a few details to ask you about the vcam instead of spamming the issue section here.
How do I add a microphone?
Hello Ryohei,
First, thank you very much for this simple, easily customizable example.
I would love to use this plugin from another application, so that I can send a stream into it (Node.js would be best, but a Swift app is fine too).
Do you know how I could do that?
Thank you very much for your help,
Maxence
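To illustrate one possible answer, here is a sketch of what the companion-app side of the XPC approach mentioned above could look like, reusing the hypothetical `FrameProviding` protocol and Mach service name from the earlier sketch. These names are placeholders, and a plain app would normally also need to register the service (e.g. as a launch agent):

```swift
import Foundation

// Companion-app side of the same hypothetical setup: expose a Mach service
// (the name must match what the plugin connects to) and serve the latest frame.
final class FrameService: NSObject, NSXPCListenerDelegate, FrameProviding {
    private let listener: NSXPCListener
    var latestFrame: (data: Data, width: Int, height: Int)?

    override init() {
        // NSXPCListener(machServiceName:) assumes the service is registered
        // with launchd, e.g. via a launch agent plist installed by the app.
        listener = NSXPCListener(machServiceName: "com.example.vcam.frames")
        super.init()
        listener.delegate = self
        listener.resume()
    }

    func listener(_ listener: NSXPCListener,
                  shouldAcceptNewConnection newConnection: NSXPCConnection) -> Bool {
        newConnection.exportedInterface = NSXPCInterface(with: FrameProviding.self)
        newConnection.exportedObject = self
        newConnection.resume()
        return true
    }

    func requestFrame(reply: @escaping (Data?, Int, Int) -> Void) {
        if let frame = latestFrame {
            reply(frame.data, frame.width, frame.height)
        } else {
            reply(nil, 0, 0)
        }
    }
}
```

The app would keep `latestFrame` updated with whatever it wants to stream (for example, the captured window bitmap from earlier), and the plugin would pull frames over the connection.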