Open · kjhyun824 opened this issue 3 years ago
I'm not 100% sure what solved it for me, but what I did was get it working with both video and audio on. I included onicecandidate handlers to handle those messages, as I was missing those before. Then, once it was all working, I removed the video parameter and it is still working great.
So I take back what I said, this only works if I am on the same WiFi for some reason.
@tgreco What did you handle in the onicecandidate function? (i.e. which one was missing?)
I have also tested on the same WiFi, but it doesn't work for me... T.T
@kjhyun824
When the ICE candidates are received, I use websockets to send them over to the other user.
```js
onIceCandidate(evt) {
  console.log('On Ice Candidate', evt.candidate);
  if (evt.candidate !== null) {
    WebsocketClient.GetInstance().sendIceCandidate(
      evt.candidate,
      WebRtcClient.#instance.#partnerUserId.toString()
    );
  }
}
```
Then, when the websocket message is received on the other client's side:
```js
peerConnection.addIceCandidate(new RTCIceCandidate(candidate));
```
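For completeness, here is a minimal sketch of how that receiving side can be wired up. The socket name (`signalingSocket`) and the JSON message shape (`type` and `candidate` fields) are just assumptions for the example, not the exact code I'm using:

```js
// RTCIceCandidate comes from 'react-native-webrtc'.
// `signalingSocket` is assumed to be an already-open WebSocket, and the sender
// is assumed to serialize messages as JSON with a `type` field.
signalingSocket.onmessage = (event) => {
  const message = JSON.parse(event.data);

  if (message.type === 'ice-candidate' && message.candidate) {
    // addIceCandidate returns a promise, so surface failures instead of dropping them.
    peerConnection
      .addIceCandidate(new RTCIceCandidate(message.candidate))
      .catch((err) => console.warn('Failed to add ICE candidate', err));
  }
};
```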
I have the same issue. When I start a second call after closing the first, I don't have sound...
Same here, but only on iOS; Android works perfectly. If you have a solution, please share it here :)
@tgreco Sorry for my late reply. In my case, I use the Open WebRTC Toolkit (a.k.a. OWT), which offers a WebRTC gateway and SDKs for implementing client-side applications. In the client-side SDK, I found the implementation that you mentioned above. Moreover, if the ICE connection were the problem, Android should not work either, but it does... :(
I think it's a known issue for them, because they mention it in their TODO as "Fix iOS audio shared instance singleton conflict with internal webrtc." We should wait until they handle it TT
@jsellam we have exactly this problem as well. Does anyone have an update on solutions/workarounds?
> I have the same issue. When I start a second call after closing the first, I don't have sound...

Hi, were you able to solve it?
@zxcpoiu any help on this? I'm also facing the same issue, and for the same reason I cannot call the stop method at the moment.
I solved it by manually managing RTCAudioSession. You can see a solution here: https://stackoverflow.com/a/55781328
Hi @kiprijonas, could you give us more information on how to apply this solution, please? Thank you!
Same problem. In my case, it does not work in any call: the microphone and video work, but the audio does not. On Android, it works perfectly. Any news? @saghul
> I solved it by manually managing RTCAudioSession. You can see a solution here: https://stackoverflow.com/a/55781328

Hi @kiprijonas, I saw your comment and your solution, but I cannot seem to link the two situations together. Can you share a snippet please? For example, which files did you change?
Thanks in advance!
Hi.
I'm building an app and have an issue. Here is my scenario.
Here is the problem: when I re-establish the connection and start InCallManager again, the audio doesn't work at all, i.e. no mic input and no speaker output.
My assumption is that this is due to the iOS AVAudioSession, because WebRTC assumes it is the only one controlling AVAudioSession, which is implemented as a singleton.
Am I right? And how can I fix it? (Or do you have any plan for fixing it?)
React-Native-IncallManager version: v3.3.0, libWebRTC: M84
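For reference, here is a minimal sketch of the call flow I am describing, using react-native-webrtc and react-native-incall-manager. The offer/answer and ICE exchange over signaling is omitted, and the helper names (startCall, endCall) are just for illustration:

```js
import { mediaDevices, RTCPeerConnection } from 'react-native-webrtc';
import InCallManager from 'react-native-incall-manager';

let peerConnection = null;
let localStream = null;

async function startCall() {
  // Route call audio (earpiece/speaker, proximity sensor) for the duration of the call.
  InCallManager.start({ media: 'audio' });

  localStream = await mediaDevices.getUserMedia({ audio: true });
  peerConnection = new RTCPeerConnection({ iceServers: [] });
  peerConnection.addStream(localStream); // newer react-native-webrtc versions use addTrack instead

  // ... offer/answer and ICE candidate exchange over your own signaling channel ...
}

function endCall() {
  if (peerConnection) {
    peerConnection.close();
    peerConnection = null;
  }
  if (localStream) {
    localStream.getTracks().forEach((track) => track.stop());
    localStream = null;
  }
  InCallManager.stop();
}

// Symptom: calling endCall() and then startCall() a second time gives no mic
// input or speaker output on iOS, while the same flow keeps working on Android.
```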