aws-samples / amazon-chime-react-native-demo

A React Native demo application for Android and iOS using the Amazon Chime SDK.
MIT No Attribution

How to implement video preview before starting the meeting session #198

Closed ashirkhan94 closed 8 months ago

ashirkhan94 commented 9 months ago

How can we implement a custom video source for the camera preview before starting the meeting? Can we reuse the existing RNVideoViewManager for that, or does it require changes? We referred to https://github.com/aws/amazon-chime-sdk-android/blob/master/guides/custom_video.md for Android and iOS, but we don't know how to implement it successfully in the existing bridge file of the React Native demo. Can anyone help us? Thanks.

ashirkhan94 commented 8 months ago

Hi Team, we implemented a custom video source with the existing RNVideoViewManager.kt file. The torch turns on when we call the callStartLocalVideo() function, and we receive the onCaptureStarted observer callback, but the video view on the React Native side stays black (blank). What is the issue in our code? Please help us.

RNVideoViewManager.kt

import android.content.Context
import android.hardware.camera2.CameraManager
import com.amazonaws.services.chime.sdk.meetings.audiovideo.video.DefaultVideoRenderView
import com.amazonaws.services.chime.sdk.meetings.audiovideo.video.capture.DefaultCameraCaptureSource
import com.amazonaws.services.chime.sdk.meetings.audiovideo.video.capture.DefaultSurfaceTextureCaptureSourceFactory
import com.amazonaws.services.chime.sdk.meetings.audiovideo.video.gl.DefaultEglCoreFactory
import com.amazonaws.services.chime.sdk.meetings.device.MediaDevice
import com.amazonaws.services.chime.sdk.meetings.device.MediaDeviceType
import com.amazonaws.services.chime.sdk.meetings.utils.logger.ConsoleLogger
import com.amazonaws.services.chime.sdk.meetings.utils.logger.LogLevel
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.uimanager.SimpleViewManager
import com.facebook.react.uimanager.ThemedReactContext
import com.facebook.react.uimanager.annotations.ReactProp

class RNVideoViewManager(private val reactContext2: ReactApplicationContext) : SimpleViewManager<DefaultVideoRenderView>() {
    private val logger = ConsoleLogger(LogLevel.DEBUG)

    var eglCoreFactory = DefaultEglCoreFactory()
    var surfaceTextureCaptureSourceFactory = DefaultSurfaceTextureCaptureSourceFactory(logger, eglCoreFactory)
    val cameraCaptureSource = DefaultCameraCaptureSource(reactContext2, logger, surfaceTextureCaptureSourceFactory)

    // Get the current device and format
    val currentDevice = cameraCaptureSource.device
    val currentFormat = cameraCaptureSource.format

    // Pick a new device explicitly (requires application context)
    val desiredDeviceType = MediaDeviceType.VIDEO_BACK_CAMERA
    val cameraManager: CameraManager = reactContext2.getSystemService(Context.CAMERA_SERVICE) as CameraManager

    companion object {
        // The React Native side is expecting a component named RNVideoView
        private const val REACT_CLASS = "RNVideoView"
        private const val TAG = "RNVideoViewManager"
    }

    override fun createViewInstance(reactContext: ThemedReactContext): DefaultVideoRenderView {
        logger.info(TAG, "Creating view instance")
        return DefaultVideoRenderView(reactContext)
    }

    override fun getName(): String {
        return REACT_CLASS
    }

    fun callStartLocalVideo() {
        // Select the back camera, if present
        val newDevice = MediaDevice.listVideoDevices(cameraManager)
            .firstOrNull { it.type == desiredDeviceType } ?: return
        cameraCaptureSource.device = newDevice

        // Pick the first supported format no taller than 800 pixels
        val newFormat = MediaDevice.listSupportedVideoCaptureFormats(cameraManager, newDevice)
            .firstOrNull { it.height <= 800 } ?: return
        cameraCaptureSource.format = newFormat

        cameraCaptureSource.torchEnabled = true
        cameraCaptureSource.start()
    }

    fun callStopLocalVideo() {
        cameraCaptureSource.stop()
        cameraCaptureSource.release()
    }

    @ReactProp(name = "mirror")
    fun setMirror(renderView: DefaultVideoRenderView, mirror: Boolean) {
        logger.info(TAG, "setMirror: ${mirror}")
        renderView.init(eglCoreFactory)
        cameraCaptureSource.addVideoSink(renderView)
    }
}
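For anyone following along: methods on a SimpleViewManager are not directly callable from JavaScript, so the two preview functions above still need to be exposed through a module registered with @ReactMethod (the thread below mentions this is routed through NativeMobileSDKBridge.kt in their project). A minimal sketch of that wiring, assuming the bridge holds a reference to the RNVideoViewManager instance; the module name and constructor here are assumptions, not code from the demo:

```kotlin
import com.facebook.react.bridge.ReactApplicationContext
import com.facebook.react.bridge.ReactContextBaseJavaModule
import com.facebook.react.bridge.ReactMethod

// Hypothetical native module forwarding the preview controls to the view manager.
// Assumes the same RNVideoViewManager instance is shared with the ReactPackage.
class RNVideoPreviewModule(
    reactContext: ReactApplicationContext,
    private val videoViewManager: RNVideoViewManager
) : ReactContextBaseJavaModule(reactContext) {

    override fun getName(): String = "RNVideoPreviewModule"

    @ReactMethod
    fun startLocalVideo() {
        videoViewManager.callStartLocalVideo()
    }

    @ReactMethod
    fun stopLocalVideo() {
        videoViewManager.callStopLocalVideo()
    }
}
```

On the JS side this module would then be reachable via NativeModules.RNVideoPreviewModule (or wrapped in the demo's Bridge.js), matching how the demo exposes its other native functions.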

Observer functions in MeetingObservers.kt file

override fun onCaptureFailed(error: CaptureSourceError) {
    logger.info(TAG, "OnCapture status failed, error = ${error}")
    eventEmitter.sendReactNativeEvent(RNEventEmitter.RN_LOCAL_VIDEO_PREVIEW_STATUS, "failed")
}

override fun onCaptureStarted() {
    logger.info(TAG, "OnCapture status started")
    eventEmitter.sendReactNativeEvent(RNEventEmitter.RN_LOCAL_VIDEO_PREVIEW_STATUS, "started")
}

override fun onCaptureStopped() {
    logger.info(TAG, "OnCapture status stopped")
    eventEmitter.sendReactNativeEvent(RNEventEmitter.RN_LOCAL_VIDEO_PREVIEW_STATUS, "stopped")
}
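Note that these onCaptureStarted/onCaptureStopped/onCaptureFailed callbacks only fire if the observer object is registered on the capture source. A one-line sketch, assuming `meetingObservers` is the instance containing the overrides above:

```kotlin
// Register the CaptureSourceObserver on the camera capture source
// (see the Chime SDK custom video guide linked in the first comment).
// `meetingObservers` is assumed to implement CaptureSourceObserver.
cameraCaptureSource.addCaptureSourceObserver(meetingObservers)
```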

RNVideoRenderView.js

/*
 * Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
 * SPDX-License-Identifier: MIT-0
 */

import PropTypes from 'prop-types';
import React from 'react';
import { requireNativeComponent, findNodeHandle } from 'react-native';
import { NativeFunction } from '../../utils/Bridge';

export class RNVideoRenderView extends React.Component {

    componentDidMount() {
        // We need to delay binding the video, because "componentDidMount" is called
        // "immediately after the initial rendering occurs". That is *before*
        // RCTUIManager has registered this view (so that viewForReactTag() can return it),
        // so we dispatch bindVideoView after this function completes.
        setTimeout(() => {
            try {
                // NativeFunction.bindVideoView(findNodeHandle(this), this.props.tileId);
            } catch (error) {

            }
        });
    }

    componentWillUnmount() {
        if (!this.props.bindStatus) {
            // NativeFunction.unbindVideoView(0);
        }
    }
    render() {
        return <RNVideoRenderViewNative {...this.props} />;
    }
}

RNVideoRenderView.propTypes = {
    /**
     * An int value identifying the video view; it will be used to bind the video stream later.
     */
    tileId: PropTypes.number,

};

var RNVideoRenderViewNative = requireNativeComponent('RNVideoView', RNVideoRenderView);

JoinScreen.js File

const JoinScreen = ({ navigation }: any) => {
        const cameraPreviewStatus = useSelector((state: any) => state.Main.cameraPreviewStatus);

    const startHandler = () => {
        NativeFunction.startLocalVideo();
    }
    const stophandler = () => {
        NativeFunction.stopLocalVideo();
    }

    return (
        <>
            <View style={{ flex: 1, backgroundColor: "yellow" }}>
                <View style={{ width: 300, height: 300, overflow: "hidden", backgroundColor: "red" }}>
                    {cameraPreviewStatus == "started" ? <RNVideoRenderView mirror={true} style={{ aspectRatio: 9 / 12, height: "100%", width: "100%", overflow: 'hidden' }} /> : null}
                </View>
                <TouchableOpacity onPress={startHandler} style={{ width: 100, height: 40, margin: 20, backgroundColor: "gray" }}>
                    <Text >start</Text>
                </TouchableOpacity>
                <TouchableOpacity onPress={stophandler} style={{ width: 100, height: 40, margin: 20, backgroundColor: "gray" }}>
                    <Text>stop</Text>
                </TouchableOpacity>
            </View>
        </>
    )
}

export default JoinScreen

@georgezy-amzn are any changes needed in the above code for the local video preview before starting the meeting?


ashirkhan94 commented 8 months ago

Hi Team, any solution?

ashirkhan94 commented 8 months ago

Hi Team, we closed this issue because we found a solution for it.

Thanks

shivam-kakkar27 commented 7 months ago

> Hi Team we closed this issue because we found a solution for this.
>
> Thanks

Hi,

I'm also trying to implement the same. What's the solution?

shivam-kakkar27 commented 6 months ago

> callStartLocalVideo

Hi, I'm also trying to implement the Video preview functionality before joining the meeting.

You are calling NativeFunction.startLocalVideo. In which file is it defined?

I think it must be in NativeMobileSDKBridge.kt file as all other functions are exported from there.

I'm confused as to how to use the callStartLocalVideo() and callStopLocalVideo() functions defined in the RNVideoViewManager.kt file.

Can you send the rest of the code, as a lot of it is missing from your question?

ashirkhan94 commented 6 months ago

I think, as per the Amazon Chime Android documentation, there is no need to call NativeFunction.startLocalVideo for the preview. To solve the Android black-screen issue, we are using SurfaceRenderView instead of DefaultVideoRenderView.
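For reference, the swap described above would look roughly like this in RNVideoViewManager.kt. This is a sketch under the assumption that the SDK's SurfaceRenderView can be used directly as the managed view; verify the import path against your SDK version:

```kotlin
import com.amazonaws.services.chime.sdk.meetings.audiovideo.video.gl.SurfaceRenderView

// Sketch: render the preview into a SurfaceRenderView instead of DefaultVideoRenderView.
// The class would then extend SimpleViewManager<SurfaceRenderView>, and setMirror's
// renderView parameter type would change accordingly.
override fun createViewInstance(reactContext: ThemedReactContext): SurfaceRenderView {
    return SurfaceRenderView(reactContext)
}
```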

shivam-kakkar27 commented 6 months ago

But in the code you have posted here, you are calling NativeFunction.startLocalVideo in startHandler function in JoinScreen.js file. But in your code there is no such function. Instead there is another function called callStartLocalVideo() which you have defined in RNVideoViewManager.kt file.

ashirkhan94 commented 6 months ago

> But in the code you have posted here, you are calling NativeFunction.startLocalVideo in startHandler function in JoinScreen.js file. But in your code there is no such function. Instead there is another function called callStartLocalVideo() which you have defined in RNVideoViewManager.kt file.

Yes, we were calling NativeFunction.startLocalVideo(), which is written in the NativeMobileSDKBridge.kt file, and from the startLocalVideo() function we called callStartLocalVideo(), because we were not very familiar with Kotlin at the time.

But I think you can call callStartLocalVideo() directly from JoinScreen if the function is annotated with @ReactMethod in RNVideoViewManager.kt.

In our case, we made a copy of the existing RNVideoViewManager.kt file for the preview.