@ruggero-balteri We currently don't have a plan to support React Native anytime soon. But you can definitely write your own bridge between JS and native code, since we support native Android and iOS.
KVS WebRTC SDK for Android: https://github.com/awslabs/amazon-kinesis-video-streams-webrtc-sdk-android
KVS WebRTC SDK for iOS: https://github.com/awslabs/amazon-kinesis-video-streams-webrtc-sdk-ios
How to write an Android bridge: https://reactnative.dev/docs/native-modules-android
How to write an iOS bridge: https://reactnative.dev/docs/native-modules-ios
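For illustration, the JS side of such a bridge could look roughly like the sketch below. The module name KvsWebRtcViewer and its methods (connectAsViewer, disconnect) and the remoteTrackAdded event are all hypothetical; you would implement them yourself in the native module on top of the Android/iOS SDKs linked above.

import {NativeModules, NativeEventEmitter} from 'react-native';

// 'KvsWebRtcViewer' is a hypothetical native module wrapping the KVS WebRTC SDKs.
const {KvsWebRtcViewer} = NativeModules;
const emitter = new NativeEventEmitter(KvsWebRtcViewer);

export async function startViewer(channelArn) {
  // Forward remote-track events surfaced by the native side to JS.
  const subscription = emitter.addListener('remoteTrackAdded', event => {
    console.log('Remote track added:', event);
  });
  await KvsWebRtcViewer.connectAsViewer(channelArn);
  return () => {
    subscription.remove();
    KvsWebRtcViewer.disconnect();
  };
}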
Sounds good! Thank you.
Of course, a dedicated bridge would be best, but FYI I was able to implement a working solution with react-native-webrtc. I can confirm that it works quite well together with amazon-kinesis-video-streams-webrtc-sdk.
@swiety85 That's really awesome! Thanks for letting us know.
Since we currently don't have a plan to support a dedicated bridge to React Native, and the question has been answered (you can write your own bridge), I'll mark this issue as resolved. Please feel free to create a new issue if you need further assistance.
Sure, thank you for your help!
@swiety85 Oh, that is great news! I used react-native-webrtc in the past, but I was not sure it could be used successfully with Kinesis WebRTC. May I ask what the biggest challenges were (if any)? I am sure many people could benefit from your experience! Should we move the discussion to the react-native-webrtc section?
For now the functionality works only on iOS, but it will be implemented for Android in the near future. I think my biggest issue was that iOS starts the pending call immediately after delivering the VoIP push notification, so you get no time for call initialization, and establishing the connection through Kinesis takes time. When the user answers the call, you first need to ask Kinesis to create a signaling channel (through the server), and only then can you establish the peer connection using this library's SignalingClient. With an incoming call on a locked screen, after answering, the user sees the call timer running but cannot speak yet because the initialization takes a few seconds. I have an idea of creating the channel before the answer, but I'm not sure about the potential costs of such a move.
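A minimal server-side sketch of that "create the channel before the answer" idea, assuming the AWS SDK for JavaScript and a hypothetical onIncomingCall hook that fires when the call is placed (before the callee answers):

const AWS = require('aws-sdk');
const kinesisVideo = new AWS.KinesisVideo({region: 'us-east-1'});

// Hypothetical hook: runs when the call is placed, before the callee answers,
// so the signaling channel already exists by the time the call is accepted.
async function onIncomingCall(callId) {
  const {ChannelARN} = await kinesisVideo
    .createSignalingChannel({
      ChannelName: `call-${callId}`,
      ChannelType: 'SINGLE_MASTER',
    })
    .promise();
  return ChannelARN; // hand this to both peers along with the VoIP push
}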
Nevertheless, I'm very happy with the Kinesis solution. 👍
@swiety85 It's amazing that you used react-native-webrtc, but I would like to know: how did you integrate amazon-kinesis-video-streams-webrtc-sdk with it?
@swiety85 Awesome! It would be really great if you could provide a quick snippet of how you connected to KVS. Could you? Thanks!!
@pdias94 I couldn't get KVS working, so I used Twilio instead. It's really amazing, check it out; you can also find an answer to every question there.
@swiety85 That sounds exciting. Any chance you have a code snippet available somewhere?
Hi guys, I know this topic is closed, but I figured out a way to get Kinesis working fine with React Native on Android. It took me some time to figure out, so I'd better share it in case other people would like to play around.
import React, {useState, useEffect} from 'react';
import {RTCView} from 'react-native-webrtc';

const viewer = {};

const App = () => {
  const [stream, setStream] = useState(null);
  useEffect(() => {
    if (!stream) {
      (async () => {
        ... // SAME CODE AS https://github.com/awslabs/amazon-kinesis-video-streams-webrtc-sdk-js/blob/master/examples/viewer.js
        viewer.signalingClient.on('sdpAnswer', async answer => {
          // CHANGE 0 to audio and 1 to video: remap the mids ("0"/"1") in the
          // master's answer to the "audio"/"video" mids that react-native-webrtc
          // (Plan B SDP) expects.
          answer['sdp'] = answer['sdp'].replace('a=group:BUNDLE 0 1', 'a=group:BUNDLE audio video');
          answer['sdp'] = answer['sdp'].replace('a=mid:0', 'a=mid:audio');
          answer['sdp'] = answer['sdp'].replace('a=mid:1', 'a=mid:video');
          await viewer.peerConnection.setRemoteDescription(answer);
        });
        viewer.signalingClient.on('iceCandidate', candidate => {
          // CHANGE 0 to audio and 1 to video: same mid remapping for incoming candidates.
          candidate['sdpMid'] = candidate['sdpMid'].replace('0', 'audio');
          candidate['sdpMid'] = candidate['sdpMid'].replace('1', 'video');
          viewer.peerConnection.addIceCandidate(candidate);
        });
        ... // SAME CODE AS https://github.com/awslabs/amazon-kinesis-video-streams-webrtc-sdk-js/blob/master/examples/viewer.js
        // REPLACE THE addTrack EVENT WITH onaddstream (the react-native-webrtc API):
        viewer.peerConnection.onaddstream = event => {
          console.log('[VIEWER] Received remote track');
          if (viewer.remoteStream) {
            return;
          }
          viewer.remoteStream = event.stream;
          setStream(viewer.remoteStream);
        };
        viewer.signalingClient.open();
      })();
    }
  }, [stream]);
  return <RTCView streamURL={stream?.toURL()} style={styles.viewer} />;
};
I went with bridging in React Native and also tried the SDKs you linked above. My question now is: how do I integrate the SDK into an existing React Native app via bridging? Do I have to place all of the SDK files in my existing React Native project?
Thanks in advance.
My apologies for not following up on this thread. Also, thank you very much for your suggestions, @wolfviking0!
I am testing a component based on your snippets and was able to successfully connect to Kinesis using react-native-webrtc. Unfortunately, I do experience a problem with the RTCView component.
The callback functions seem to be triggered correctly and I do get the remote track from Kinesis (that is, onaddstream works):
[VIEWER] Connected to signaling service
[VIEWER] Creating SDP offer
[VIEWER] Sending SDP offer
[VIEWER] Generated ICE candidate
[VIEWER] Sending ICE candidate
... sending and receiving
[VIEWER] Generated ICE candidate
[VIEWER] Sending ICE candidate
[VIEWER] Received SDP answer
Adapter created for track {58e55cf0-dc14-004d-9abe-d14e78132d74}
[VIEWER] Received remote track
Stream 98EA333C-DB19-4D57-9D01-34C16137D086
[VideoTrackAdapter] Mute event for 0 98EA333C-DB19-4D57-9D01-34C16137D086 {58e55cf0-dc14-004d-9abe-d14e78132d74}
[VIEWER] All ICE candidates have been generated
Nonetheless, the RTCView component stays black and does not show any video.
I am not sure how to debug this problem. Off the top of my head, I could either use Wireshark and compare the packets (but that seems very time-consuming), or use Xcode and try to debug the RTCView component (but I am not familiar with it). Either way seems very tough.
Do you have any suggestions?
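One lighter-weight option than Wireshark might be to poll the stats the peer connection already exposes and check whether video bytes and frames are actually arriving. A minimal sketch, assuming a spec-style getStats() (newer react-native-webrtc versions return a standard RTCStatsReport):

// Log inbound video stats every 2 s. If bytesReceived grows while
// framesDecoded stays at 0, packets arrive but are never decoded/rendered.
function logInboundVideoStats(peerConnection) {
  return setInterval(async () => {
    const report = await peerConnection.getStats();
    report.forEach(stat => {
      if (stat.type === 'inbound-rtp' && stat.kind === 'video') {
        console.log('[VIEWER] inbound video:', stat.bytesReceived, 'bytes /', stat.framesDecoded, 'frames decoded');
      }
    });
  }, 2000);
}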
Note: to get the values for configData, you can either use the AWS CLI (get-ice-server-config) or, more simply, the dev tools of Chrome/Firefox while running amazon-kinesis-video-streams-webrtc-sdk-js (after you connect, you will find all the data fetched via the AWS SDK in the Network tab). Unfortunately, React Native does not support that AWS SDK directly, which is not a big problem going forward, as you can easily pass these parameters via a custom Lambda function + API.
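A sketch of that Lambda/server side, assuming the AWS SDK for JavaScript (these are the same calls the browser example makes, visible in Dev Tools):

const AWS = require('aws-sdk');

// Fetch the signaling endpoints and TURN servers for a channel, e.g. inside
// a Lambda behind API Gateway, and return them to the React Native app.
async function getViewerConfig(channelARN, region) {
  const kinesisVideo = new AWS.KinesisVideo({region});
  const {ResourceEndpointList} = await kinesisVideo
    .getSignalingChannelEndpoint({
      ChannelARN: channelARN,
      SingleMasterChannelEndpointConfiguration: {Protocols: ['WSS', 'HTTPS'], Role: 'VIEWER'},
    })
    .promise();
  const endpoints = {};
  ResourceEndpointList.forEach(({Protocol, ResourceEndpoint}) => {
    endpoints[Protocol] = ResourceEndpoint;
  });

  const signalingChannels = new AWS.KinesisVideoSignalingChannels({
    region,
    endpoint: endpoints.HTTPS,
  });
  const {IceServerList} = await signalingChannels
    .getIceServerConfig({ChannelARN: channelARN})
    .promise();
  return {endpoints, IceServerList};
}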
import React, {useState, useEffect} from 'react';
import {View, SafeAreaView, StyleSheet} from 'react-native';
import {RTCView, RTCPeerConnection} from 'react-native-webrtc';
import {SignalingClient} from 'amazon-kinesis-video-streams-webrtc';
export default function Webrtc() {
const [stream, setStream] = useState(null);
// NOTE: better kept in a useRef (or at module scope) so it survives re-renders
const viewer = {};
function onStatsReport(report) {
// console.log('Report :', report);
}
useEffect(() => {
async function StartViewer() {
const configData = {
channelARN:
'arn:aws:kinesisvideo:us-east-1:XXXX:channel/XXXX/XXXX',
channelEndpoint:
'wss://v-XXXX.kinesisvideo.us-east-1.amazonaws.com',
clientId: 'OBQ9L95XXXX',
role: 'VIEWER',
region: 'us-east-1',
credentials: {
accessKeyId: 'XXXX',
secretAccessKey: 'XXXX',
sessionToken: null,
},
systemClockOffset: 0,
};
const iceServers = [
{
urls: 'stun:stun.kinesisvideo.us-east-1.amazonaws.com:443',
},
{
urls: [
'turn:XXXX.kinesisvideo.us-east-1.amazonaws.com:443?transport=udp',
'turns:XXXX.kinesisvideo.us-east-1.amazonaws.com:443?transport=udp',
'turns:XXXX.t-e58da546.kinesisvideo.us-east-1.amazonaws.com:443?transport=tcp',
],
username:
'XXXX',
credential: 'XXXX',
},
{
urls: [
'turn:XXXX.t-e58da546.kinesisvideo.us-east-1.amazonaws.com:443?transport=udp',
'turns:XXXX.t-e58da546.kinesisvideo.us-east-1.amazonaws.com:443?transport=udp',
'turns:XXXX.t-e58da546.kinesisvideo.us-east-1.amazonaws.com:443?transport=tcp',
],
username:
'XXXX',
credential: 'XXXX',
},
];
const formValues = {
region: 'us-east-1',
channelName: 'XXXX',
clientId: 'XXXX',
sendVideo: true,
sendAudio: false,
openDataChannel: false,
widescreen: true,
fullscreen: false,
useTrickleICE: true,
natTraversalDisabled: false,
forceTURN: false,
accessKeyId: 'XXXX',
endpoint: null,
secretAccessKey: 'XXXX',
sessionToken: null,
};
// Create Signaling Client
viewer.signalingClient = new SignalingClient(configData);
const configuration = {
iceServers,
iceTransportPolicy: formValues.forceTURN ? 'relay' : 'all',
};
viewer.peerConnection = new RTCPeerConnection(configuration);
// Poll for connection stats
viewer.peerConnectionStatsInterval = setInterval(
() => viewer.peerConnection.getStats().then(onStatsReport),
1000,
);
viewer.signalingClient.on('open', async () => {
console.log('[VIEWER] Connected to signaling service');
// Create an SDP offer to send to the master
console.log('[VIEWER] Creating SDP offer');
await viewer.peerConnection.setLocalDescription(
await viewer.peerConnection.createOffer({
offerToReceiveAudio: true,
offerToReceiveVideo: true,
}),
);
// When trickle ICE is enabled, send the offer now and then send ICE candidates as they are generated. Otherwise wait on the ICE candidates.
if (formValues.useTrickleICE) {
console.log('[VIEWER] Sending SDP offer');
viewer.signalingClient.sendSdpOffer(
viewer.peerConnection.localDescription,
);
}
console.log('[VIEWER] Generating ICE candidates');
});
viewer.signalingClient.on('sdpAnswer', async (answer) => {
// Add the SDP answer to the peer connection
console.log('[VIEWER] Received SDP answer');
// CHANGE 0 to audio and 1 to video
answer['sdp'] = answer['sdp'].replace(
'a=group:BUNDLE 0 1',
'a=group:BUNDLE audio video',
);
answer['sdp'] = answer['sdp'].replace('a=mid:0', 'a=mid:audio');
answer['sdp'] = answer['sdp'].replace('a=mid:1', 'a=mid:video');
await viewer.peerConnection.setRemoteDescription(answer);
});
viewer.signalingClient.on('iceCandidate', (candidate) => {
  console.log('[VIEWER] Received ICE candidate');
  try {
    // CHANGE 0 to audio and 1 to video (same mid remapping as in the answer)
    candidate['sdpMid'] = candidate['sdpMid'].replace('0', 'audio');
    candidate['sdpMid'] = candidate['sdpMid'].replace('1', 'video');
    viewer.peerConnection.addIceCandidate(candidate);
  } catch (err) {
    console.error(`Error adding remote iceCandidate: ${err}`);
  }
});
viewer.signalingClient.on('close', () => {
console.log('[VIEWER] Disconnected from signaling channel');
});
viewer.signalingClient.on('error', (error) => {
console.error('[VIEWER] Signaling client error: ', error);
});
// Send any ICE candidates to the other peer
viewer.peerConnection.addEventListener('icecandidate', ({candidate}) => {
if (candidate) {
console.log('[VIEWER] Generated ICE candidate');
// When trickle ICE is enabled, send the ICE candidates as they are generated.
if (formValues.useTrickleICE) {
console.log('[VIEWER] Sending ICE candidate');
viewer.signalingClient.sendIceCandidate(candidate);
}
} else {
console.log('[VIEWER] All ICE candidates have been generated');
// When trickle ICE is disabled, send the offer now that all the ICE candidates have been generated.
if (!formValues.useTrickleICE) {
console.log('[VIEWER] Sending SDP offer');
viewer.signalingClient.sendSdpOffer(
viewer.peerConnection.localDescription,
);
}
}
});
// As remote tracks are received, add them to the remote view
viewer.peerConnection.onaddstream = (event) => {
console.log('[VIEWER] Received remote track');
if (viewer.remoteStream) {
return;
}
viewer.remoteStream = event.stream;
setStream(event.stream);
console.log('This is the event', event);
console.log('This is the stream', event.stream);
};
console.log('[VIEWER] Starting viewer connection');
viewer.signalingClient.open();
}
if (!stream) {
StartViewer();
}
}, [stream]); // NOTE: viewer is a plain object recreated on every render, so listing its fields as dependencies has no effect
console.log('Stream ', stream?.toURL());
console.log(stream);
return (
<SafeAreaView style={styles.container}>
<View style={styles.rtcview}>
<RTCView streamURL={stream?.toURL()} />
</View>
</SafeAreaView>
);
}
const styles = StyleSheet.create({
container: {
backgroundColor: '#313131',
justifyContent: 'space-between',
alignItems: 'center',
height: '100%',
},
rtcview: {
justifyContent: 'center',
alignItems: 'center',
height: '40%',
width: '80%',
backgroundColor: 'black',
},
});
Yeah, it's OK. This is my code:
import {SignalingClient, Role} from 'amazon-kinesis-video-streams-webrtc';
import {RTCPeerConnection, mediaDevices} from 'react-native-webrtc';

// (viewer, setStream, setLocalStream, and onStatsReport come from the
// surrounding component, as in the previous snippet.)
async function startViewer() {
const formValues = {
region: 'xxxxx',
channelName: 'xxxx',
clientId: getRandomClientId(),
sendVideo: true,
sendAudio: true,
openDataChannel: true,
widescreen: true, // 16:9
fullscreen: false, // 4:3
useTrickleICE: true, // Use trickle ICE (not supported by Alexa devices)
natTraversalDisabled: false,
forceTURN: false,
accessKeyId: 'xxxxx',
endpoint: null,
secretAccessKey: 'xxxxx',
sessionToken: null,
};
const channelARN = 'xxxxx';
const endpointsByProtocol = {
'HTTPS': 'xxxxx',
'WSS': 'xxxxx',
};
viewer.signalingClient = new SignalingClient({
channelARN: channelARN,
channelEndpoint: endpointsByProtocol.WSS,
clientId: formValues.clientId,
role: Role.VIEWER,
region: formValues.region,
credentials: {
accessKeyId: formValues.accessKeyId,
secretAccessKey: formValues.secretAccessKey,
sessionToken: formValues.sessionToken,
},
systemClockOffset: 0,
});
console.log('[VIEWER] Endpoints: ', endpointsByProtocol);
const configuration = {
iceServers: [
{
'urls': 'xxxxx',
},
{
'urls': [
'turn:xxxxx',
'turns:xxxxx',
'turns:xxxxx',
],
'username': 'xxxxx',
'credential': 'xxxxx',
},
{
'urls': [
'xxxxx',
'xxxxx',
'xxxxx',
],
'username': 'xxxxx',
'credential': 'xxxxx',
},
],
iceTransportPolicy: formValues.forceTURN ? 'relay' : 'all',
};
viewer.peerConnection = new RTCPeerConnection(configuration);
// Poll for connection stats
viewer.peerConnection.getStats().then(onStatsReport);
// viewer.peerConnectionStatsInterval = setInterval(() => {
// if (viewer && viewer.peerConnection && typeof viewer.peerConnection.getStats === 'function') {
// viewer.peerConnection.getStats().then(onStatsReport);
// }
// }, 3000);
viewer.signalingClient.on('open', async () => {
console.log('[VIEWER] Connected to signaling service');
let isFront = true;
mediaDevices.enumerateDevices().then(async (sourceInfos) => {
console.log(sourceInfos);
let videoSourceId;
for (let i = 0; i < sourceInfos.length; i++) {
const sourceInfo = sourceInfos[i];
if (sourceInfo.kind === 'videoinput' && sourceInfo.facing === (isFront ? 'front' : 'environment')) {
videoSourceId = sourceInfo.deviceId;
}
}
console.info('videoSourceId:', videoSourceId);
const localStream = await mediaDevices.getUserMedia({
audio: true,
video: {width: 640, height: 480, frameRate: 30, facingMode: (isFront ? 'user' : 'environment'), deviceId: videoSourceId},
});
console.info('user media stream url:', localStream);
viewer.localStream = localStream;
viewer.peerConnection.addStream(localStream);
setLocalStream(localStream);
// Create an SDP offer to send to the master
console.log('[VIEWER] Creating SDP offer');
await viewer.peerConnection.setLocalDescription(
await viewer.peerConnection.createOffer({
offerToReceiveAudio: true,
offerToReceiveVideo: true,
}),
);
// When trickle ICE is enabled, send the offer now and then send ICE candidates as they are generated. Otherwise wait on the ICE candidates.
if (formValues.useTrickleICE) {
console.log('[VIEWER] Sending SDP offer');
viewer.signalingClient.sendSdpOffer(viewer.peerConnection.localDescription);
}
console.log('[VIEWER] Generating ICE candidates');
});
});
viewer.signalingClient.on('sdpAnswer', async (answer) => {
// console.log('[VIEWER] Received SDP answer', answer);
// CHANGE 0 to audio and 1 to video
answer['sdp'] = answer['sdp'].replace('a=group:BUNDLE 0 1', 'a=group:BUNDLE audio video');
answer['sdp'] = answer['sdp'].replace('a=mid:0', 'a=mid:audio');
answer['sdp'] = answer['sdp'].replace('a=mid:1', 'a=mid:video');
await viewer.peerConnection.setRemoteDescription(answer);
});
viewer.signalingClient.on('iceCandidate', (candidate) => {
console.log('[VIEWER] Received iceCandidate', candidate);
// CHANGE 0 to audio and 1 to video
candidate['sdpMid'] = candidate['sdpMid'].replace('0', 'audio');
candidate['sdpMid'] = candidate['sdpMid'].replace('1', 'video');
viewer.peerConnection.addIceCandidate(candidate);
});
viewer.signalingClient.on('close', () => {
console.log('[VIEWER] Disconnected from signaling channel');
});
viewer.signalingClient.on('error', (error) => {
console.error('[VIEWER] Signaling client error: ', error);
});
// Send any ICE candidates to the other peer
viewer.peerConnection.addEventListener('icecandidate', ({candidate}) => {
if (candidate) {
console.log('[VIEWER] Generated ICE candidate', candidate);
// When trickle ICE is enabled, send the ICE candidates as they are generated.
if (formValues.useTrickleICE) {
console.log('[VIEWER] Sending ICE candidate');
viewer.signalingClient.sendIceCandidate(candidate);
}
} else {
console.log('[VIEWER] All ICE candidates have been generated');
// When trickle ICE is disabled, send the offer now that all the ICE candidates have been generated.
if (!formValues.useTrickleICE) {
console.log('[VIEWER] Sending SDP offer');
viewer.signalingClient.sendSdpOffer(viewer.peerConnection.localDescription);
}
}
});
viewer.peerConnection.addEventListener('addstream', ({stream}) => {
console.log('[VIEWER] Received remote track');
if (viewer.remoteStream) {
return;
}
console.log('This is the stream', stream);
viewer.remoteStream = stream;
setStream(viewer.remoteStream);
});
console.log('[VIEWER] Starting viewer connection');
viewer.signalingClient.open();
}
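Note that getRandomClientId() isn't defined in the snippet above; the KVS JS SDK examples implement it roughly like this:

function getRandomClientId() {
  // Random alphanumeric client ID, as in the KVS WebRTC JS SDK examples.
  return Math.random().toString(36).substring(2).toUpperCase();
}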
Thank you!
Hi,
I have tried your example @ruggero-balteri, but I got an error in RN because the library is trying to access window and document, which are browser variables.
Did you encounter the same problem? Thanks in advance :)
Hi @guillaume-g, interestingly enough I saw the same error, calling out that global.document.addEventListener is not a function. I traced it back to this issue: https://github.com/kevlened/isomorphic-webcrypto/issues/36. To solve it, you can remove isomorphic-webcrypto from this package and use react-native-webview-crypto to polyfill crypto.subtle, albeit not a perfect solution.
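For reference, wiring that polyfill up is mostly a matter of mounting it once at the app root. A sketch, assuming react-native-webview-crypto's default export (commonly imported as PolyfillCrypto) and your existing root component App:

import React from 'react';
import PolyfillCrypto from 'react-native-webview-crypto';

export default function Root() {
  return (
    <>
      {/* Runs a hidden WebView that backs crypto.subtle for the whole app */}
      <PolyfillCrypto />
      <App />
    </>
  );
}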
@swiety85 Hey, great! If possible, can you share the bridge at yashukla47@gmail.com?
@swiety85 Were you able to successfully implement the bridge on Android as well?
Hi! I am interested in connecting to Kinesis via WebRTC from a React Native application, but I am unsure about the best way to proceed.
It seems that a React Native WebRTC Kinesis SDK has not been released (yet), and I was wondering about the next steps.
What would you recommend?