jitsi / jitsi-meet

Jitsi Meet - Secure, Simple and Scalable Video Conferences that you use as a standalone app or embed in your web application.
https://jitsi.org/meet
Apache License 2.0

Background blur on Mobile #7819

Open jinixx opened 4 years ago

jinixx commented 4 years ago

Describe the solution you'd like: Have the same background blur feature that is available on web in iOS/Android.

The feature is especially useful for users on the go who want to participate in business meetings without distracting everyone with their surroundings.

I understand that the background blur feature on web is currently built with the BodyPix model for TensorFlow. When this feature was built last year, TensorFlow did not officially support React Native, but it does now since February. So, would the Jitsi team mind sharing whether this is on the roadmap? Any status update? Or are there any background details / potential roadblocks / challenges in proceeding with this?

Willing to contribute if some background can be shared.

Thanks!

saghul commented 4 years ago

Hey there! We don't currently have this in our roadmap.

As for the implementation, it would need to happen at a very low level, integrating directly with WebRTC in order to modify the video source before it goes out. I think Apple has some APIs for face tracking, so those could perhaps be used instead of TF. Not sure about Android.

Feel free to experiment and share your findings!

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

nielsezeka commented 3 years ago

@jinixx cc: @saghul I tried to add this feature with a CIFilter on iOS. My steps:

For the custom WebRTC build (https://github.com/react-native-webrtc/react-native-webrtc):

Step 1: Build a custom WebRTC version (because I want to modify the stream before it goes out) and add this function, which we override in react-native-webrtc later:

```objectivec
// in files: RTCCameraVideoCapturer.h/.m
- (CVPixelBufferRef)getNewPixelRef:(CMSampleBufferRef)sampleBuffer {
    // ---> override in custom filter
    return CMSampleBufferGetImageBuffer(sampleBuffer);
}
```

Step 2: Change how the pixel buffer is obtained, so it goes through the hook instead of being read directly from the sample buffer:

```objectivec
// in files: RTCCameraVideoCapturer.h/.m
CVPixelBufferRef pixelBuffer = nil;
pixelBuffer = [self getNewPixelRef:sampleBuffer];
```

Step 3: In the Jitsi project, modify it to use the custom filter (MyCustomFilter is my CIFilter-based capturer):

```objectivec
#if !TARGET_IPHONE_SIMULATOR
    MyCustomFilter *videoCapturer =
        [[MyCustomFilter alloc] initWithDelegate:videoSource];
    VideoCaptureController *videoCaptureController =
        [[VideoCaptureController alloc] initWithCapturer:videoCapturer
                                          andConstraints:constraints[@"video"]];
    videoTrack.videoCaptureController = videoCaptureController;
    [videoCaptureController startCapture];
#endif
```

Step 4: Implement the custom filter:

```objectivec
@interface MyCustomFilter : RTCCameraVideoCapturer
@property (retain, nonatomic) CIContext *context;
@end

@implementation MyCustomFilter
// ... filter implementation ...
@end
```

And now your outgoing stream has CISepiaTone. (I think you can modify the function `- (CIImage *)sepiaFilterImage...` for your own implementation; the best option on iOS is probably DeepLabV3 from https://developer.apple.com/machine-learning/models/.) Hope it can help!
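For reference, the elided filter body in step 4 might look roughly like this. This is a sketch, not the author's exact code: it assumes the `getNewPixelRef:` hook from step 1, and the rendering details (CIContext setup, in-place render) are my guesses.

```objectivec
// Sketch of MyCustomFilter (assumption: RTCCameraVideoCapturer exposes the
// getNewPixelRef: hook added in step 1; details here are illustrative).
#import <CoreImage/CoreImage.h>
#import <WebRTC/RTCCameraVideoCapturer.h>

@interface MyCustomFilter : RTCCameraVideoCapturer
@property (retain, nonatomic) CIContext *context;
@end

@implementation MyCustomFilter

- (CIImage *)sepiaFilterImage:(CIImage *)inputImage {
    // Replace this CIFilter with a blur / segmentation step for the
    // actual background-blur feature.
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
    [filter setValue:inputImage forKey:kCIInputImageKey];
    [filter setValue:@0.8 forKey:kCIInputIntensityKey];
    return filter.outputImage;
}

- (CVPixelBufferRef)getNewPixelRef:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (self.context == nil) {
        self.context = [CIContext contextWithOptions:nil];
    }
    CIImage *input = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIImage *filtered = [self sepiaFilterImage:input];
    // Render the filtered image back into the original buffer so the
    // capturer hands the modified frame to WebRTC unchanged downstream.
    [self.context render:filtered toCVPixelBuffer:pixelBuffer];
    return pixelBuffer;
}

@end
```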

saghul commented 3 years ago

This looks very nice! If this can be implemented with a custom capturer, similar to how we do it for screen-sharing, it would be nice to explore the idea of adding the ability to inject it from outside of the RN WebRTC module and then applications could implement their own filters on the native side.

If you feel like experimenting feel free to get some PRs going and we can look over the details with some code in front.

jaswant0605 commented 3 years ago

This can be done by extending the VideoCapturer class and passing custom frames to it. I integrated ARCore with WebRTC this way successfully. All you need to do is use TFLite/OpenCV to create the "bokeh" on your local view and then stream it via the CustomVideoCapturer by creating frames from it.

This is very interesting and can be used quite generically: you can stream almost anything if you can create bitmaps for it! I also integrated CameraX with WebRTC for different kinds of camera video effects such as zoom in/out, filters, and flashlight (you can't use the flashlight with the default VideoCapturer, as the camera is occupied).

You can refer to this answer to get an idea: https://stackoverflow.com/questions/58409931/use-webrtc-camera-in-ar-session-android
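The Android approach described above — implementing `VideoCapturer` and pushing self-generated frames into WebRTC — can be sketched as follows. This is a hedged sketch against the org.webrtc Android API; the TFLite/OpenCV bokeh step itself is elided, and the class name and `pushFrame` helper are illustrative, not from the original comment.

```java
// Minimal custom capturer that pushes pre-rendered frames (e.g. a
// background-blurred bitmap converted to an NV21 byte buffer) into WebRTC.
// Assumes the org.webrtc Android library is on the classpath.
import android.content.Context;

import org.webrtc.CapturerObserver;
import org.webrtc.NV21Buffer;
import org.webrtc.SurfaceTextureHelper;
import org.webrtc.VideoCapturer;
import org.webrtc.VideoFrame;

public class CustomVideoCapturer implements VideoCapturer {
    private CapturerObserver observer;

    @Override
    public void initialize(SurfaceTextureHelper helper, Context context,
                           CapturerObserver observer) {
        this.observer = observer;
    }

    @Override
    public void startCapture(int width, int height, int framerate) {
        observer.onCapturerStarted(true);
    }

    @Override
    public void stopCapture() throws InterruptedException {
        observer.onCapturerStopped();
    }

    @Override public void changeCaptureFormat(int width, int height, int fps) {}
    @Override public void dispose() {}
    @Override public boolean isScreencast() { return false; }

    /** Call this with each processed (e.g. background-blurred) NV21 frame. */
    public void pushFrame(byte[] nv21, int width, int height, long timestampNs) {
        VideoFrame.Buffer buffer =
                new NV21Buffer(nv21, width, height, /* releaseCallback= */ null);
        VideoFrame frame = new VideoFrame(buffer, /* rotation= */ 0, timestampNs);
        observer.onFrameCaptured(frame);
        frame.release();
    }
}
```

The capturer is wired in like any other: pass it to `VideoSource` creation via `PeerConnectionFactory`, then call `pushFrame` from your processing loop.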

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.