Mohit-3196 opened this issue 2 years ago
Hi @Mohit-3196,
Thank you for the issue. This is expected behavior.
I think I can provide a quick solution for this. Let's schedule the issue.
Yes, let me share the solution.
The broadcast extension captures the system audio and/or the mic audio. Here are solutions for the different scenarios.
If one would just like to send the mic audio, but not the system audio:
Set showsMicrophoneButton to true in WelcomeVideoController:
self.screenRecord.showsMicrophoneButton = true;
Then go to processSampleBuffer, comment out the delivery in the RPSampleBufferType.audioApp case, and deliver the audio in the RPSampleBufferType.audioMic case. As a result, it should look like something below:
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case RPSampleBufferType.video:
        // Handle video sample buffer
        //NSLog("processSamplebuffer video");
        if videoEnabled {
            self.client.deliverExternalVideo(sampleBuffer: sampleBuffer);
        }
        break
    case RPSampleBufferType.audioApp:
        // Handle audio sample buffer for app (system) audio.
        // Delivery is commented out so the system audio is not sent.
        //NSLog("processSamplebuffer audio");
        // if audioEnabled {
        //     self.client.deliverExternalAudio(sampleBuffer: sampleBuffer);
        // }
        break
    case RPSampleBufferType.audioMic:
        // Handle audio sample buffer for mic audio.
        // Deliver the mic audio to the server.
        //NSLog("processSamplebuffer audio mic");
        if audioEnabled {
            self.client.deliverExternalAudio(sampleBuffer: sampleBuffer);
        }
        break
    @unknown default:
        // Handle other sample buffer types
        fatalError("Unknown type of sample buffer")
    }
}
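For reference, here is a minimal sketch of the picker setup on the app side. This assumes screenRecord is an RPSystemBroadcastPickerView; the preferredExtension bundle id below is a placeholder, not taken from the sample project.

import ReplayKit

// A minimal sketch; screenRecord is assumed to be an RPSystemBroadcastPickerView.
let screenRecord = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
// Placeholder bundle id; use your broadcast extension's bundle id here.
screenRecord.preferredExtension = "com.example.app.ScreenShare"
// Shows the microphone toggle in the picker so the user can enable mic capture.
screenRecord.showsMicrophoneButton = true
view.addSubview(screenRecord)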
If one would like to send both the system audio and the mic audio, we have a workaround solution: the user can send an additional, audio-only stream to the server. To do that, follow these instructions:
Open SampleHandler.swift in the iOS SDK and create a new field as below:
let micAudioClient: AntMediaClient = AntMediaClient.init()
Go to the end of broadcastStarted and add the following lines:
let micAudioStreamId = "micAudio"; // give any stream id you want to have
self.micAudioClient.delegate = self
self.micAudioClient.setDebug(true)
self.micAudioClient.setOptions(url: url as! String, streamId: micAudioStreamId, token: token as? String ?? "", mode: AntMediaClientMode.publish, enableDataChannel: true, captureScreenEnabled: true);
// Disable video so this client publishes audio only.
self.micAudioClient.setVideoEnable(enable: false);
self.micAudioClient.setExternalVideoCapture(externalVideoCapture: false);
self.micAudioClient.setExternalAudio(externalAudioEnabled: true)
self.micAudioClient.initPeerConnection();
self.micAudioClient.start();
Go back to processSampleBuffer and edit the RPSampleBufferType.audioMic case; it should look like something below:
...
case RPSampleBufferType.audioMic:
    // Handle audio sample buffer for mic audio.
    // Deliver the mic audio through the audio-only micAudioClient.
    //NSLog("processSamplebuffer audio mic");
    if audioEnabled {
        self.micAudioClient.deliverExternalAudio(sampleBuffer: sampleBuffer);
    }
    break
...
Go to the broadcastFinished method and stop the micAudioClient as shown below:
override func broadcastFinished() {
    self.client.stop();
    self.micAudioClient.stop();
}
I've tested these changes and they worked for me. I'm attaching the edited SampleHandler.swift for your convenience. I hope it helps, and let me know if I can help further.
Thanks for the suggested solutions. However, this doesn't seem to solve our problem of receiving/playing audio when using the broadcast extension.
Here is our test setup: a peer-to-peer connection between a web client <--> iOS client. The audio issue only occurs in the direction of receiving audio on the iOS client (web --> iOS). The other direction works as expected: audio/mic from the iOS client is received on the web client. And of course web <--> web also works.
A comment on the suggested third solution (by the way, should there be a second solution?): this workaround won't work in our peer-to-peer scenario, because the Ant Media Server + SDKs won't allow more than 2 clients.
Hi @doggomir, I see.
The audio issue only occurs in the direction of receiving audio on the iOS client (web --> iOS).
What about mixing the audio on the web side and sending the mixed audio? There is support for mixing audio on the web side; for instance, desktop audio + mic audio are mixed in this line, so the receiver gets a single mixed audio track: https://github.com/ant-media/StreamApp/blob/master/src/main/webapp/js/media_manager.js#L373
A comment on the suggested third solution (by the way, should there be a second solution?): this workaround won't work in our peer-to-peer scenario, because the Ant Media Server + SDKs won't allow more than 2 clients.
Yes, you're right. As you already know, the quick workaround for this scenario is to use the Ant Media Server (AMS) to relay the audio/video. Sorry about that.
Please let me know if we can help you with anything.
Regards, A. Oguz
@mekya I'm also facing this issue, and maybe the problem is that some APIs are not allowed to be used from an extension?
From Apple's App Extension Programming Guide, an app extension cannot: "Use any API marked in header files with the NS_EXTENSION_UNAVAILABLE macro, or similar unavailability macro, or any API in an unavailable framework. For example, in iOS 8.0, the HealthKit framework and EventKit UI framework are unavailable to app extensions."
I've tried to play sound using AVSpeechSynthesisVoice and AVSpeechUtterance, and have had no luck. So it looks like, from the extension, we can only stream video and audio FROM the device, but not play audio ON the device...
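Roughly what I tried, as a paraphrased sketch (not the exact code):

import AVFoundation

// Paraphrased sketch of the playback attempt inside the broadcast extension.
let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "test")
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
// Inside the extension this produces no audible output on the device.
synthesizer.speak(utterance)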
We are using the Broadcast Extension example with a p2p connection. We want to share the screen of an iPhone with another participant while maintaining an audio connection. The participant receives both video and audio without issue, but the iOS user does not seem to play the audio of the other participant. This behaviour only happens with the broadcast extension; regular screen capturing (foreground) does not suffer from the same issue.
ClientMode: .join (for p2p)
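For reference, a minimal sketch of our client setup, assuming the same setOptions signature as in the snippets above and that .join maps to AntMediaClientMode.join; the url and streamId values are placeholders:

// A minimal sketch; url and streamId are placeholders.
let client: AntMediaClient = AntMediaClient.init()
client.delegate = self
client.setOptions(url: "wss://your-server:5443/WebRTCAppEE/websocket", streamId: "p2pStreamId", token: "", mode: AntMediaClientMode.join, enableDataChannel: true, captureScreenEnabled: true);
client.initPeerConnection();
client.start();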