ant-media / Ant-Media-Server

Ant Media Server is a live streaming engine that provides adaptive, ultra-low-latency streaming using WebRTC, with ~0.5 seconds of latency. Ant Media Server is auto-scalable and can run on-premises or in the cloud.
https://antmedia.io

While using Broadcast Extension it is not possible to hear the audio of the other participant. #4293

Open Mohit-3196 opened 2 years ago

Mohit-3196 commented 2 years ago

We are using the Broadcast Extension example with a p2p connection. We want to share the screen of an iPhone with another participant while maintaining an audio connection. The participant receives both video and audio without issue, but the iOS user does not hear the audio of the other participant. This behaviour only happens with the broadcast extension; regular (foreground) screen capturing does not suffer from the same issue.

ClientMode: .join (for p2p)
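For context, the reporter's join-mode setup would look roughly like the sketch below. This is only an illustration: the URL and stream id are placeholders, and the `setOptions` parameter names are taken from the call shown later in this thread.

```swift
// Hypothetical p2p setup; the URL and "room1" are placeholders.
let client: AntMediaClient = AntMediaClient.init()
client.delegate = self
client.setOptions(url: "wss://example.com/WebRTCAppEE/websocket",
                  streamId: "room1", token: "",
                  mode: AntMediaClientMode.join, enableDataChannel: true)
client.initPeerConnection()
client.start()
```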

mekya commented 2 years ago

Hi @Mohit-3196 ,

Thank you for the issue. This is expected behavior.

I think I can provide a quick solution for this. Let's schedule the issue.

mekya commented 2 years ago

Yes, let me explain the solution.

Broadcast extension captures the system audio and/or mic audio. Here are some solutions for the different scenarios.

  1. If you would like to send the mic audio but not the system audio:

    • Set showsMicrophoneButton to true in WelcomeVideoController
      self.screenRecord.showsMicrophoneButton = true;
    • Just open the SampleHandler.swift and go to processSampleBuffer
    • Comment out the line that sends the system audio under the RPSampleBufferType.audioApp case
    • Enable the line that sends the mic audio under the RPSampleBufferType.audioMic case

    As a result, it should look something like this:

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case RPSampleBufferType.video:
            // Handle video sample buffer
            //NSLog("processSamplebuffer video");
            if videoEnabled {
                self.client.deliverExternalVideo(sampleBuffer: sampleBuffer);
            }
            break
        case RPSampleBufferType.audioApp:
            // Handle audio sample buffer for app audio
            //NSLog("processSamplebuffer audio");
            // if audioEnabled {
            //     self.client.deliverExternalAudio(sampleBuffer: sampleBuffer);
            // }
            break
        case RPSampleBufferType.audioMic:
            // Handle audio sample buffer for mic audio
            //NSLog("processSamplebuffer audio mic");
            if audioEnabled {
                self.client.deliverExternalAudio(sampleBuffer: sampleBuffer);
            }
            break
        @unknown default:
            // Handle other sample buffer types
            fatalError("Unknown type of sample buffer")
        }
    }
    • Please pay attention: don't forget to tap the Mic button to enable it when starting the screen recording.
  2. If you would like to send both system audio and mic audio, we have a workaround: publish an additional, audio-only stream to the server. To do that, follow these instructions:

    • Open SampleHandler.swift in iOS SDK and create a new field as below
      let micAudioClient: AntMediaClient = AntMediaClient.init()
    • Go to the end of broadcastStarted and add the following lines

          let micAudioStreamId = "micAudio"; // give any stream id you want to have
          self.micAudioClient.delegate = self
          self.micAudioClient.setDebug(true)
      self.micAudioClient.setOptions(url: url as! String, streamId: micAudioStreamId, token: token as? String ?? "", mode: AntMediaClientMode.publish, enableDataChannel: true, captureScreenEnabled: true);
      
          //disable video
          self.micAudioClient.setVideoEnable(enable: false);
          self.micAudioClient.setExternalVideoCapture(externalVideoCapture: false);
      
          self.micAudioClient.setExternalAudio(externalAudioEnabled: true)
      
          self.micAudioClient.initPeerConnection();
      
          self.micAudioClient.start();
    • Go to processSampleBuffer and edit the RPSampleBufferType.audioMic case so that it looks something like this:
      ...
      case RPSampleBufferType.audioMic:
          // Handle audio sample buffer for mic audio
          //NSLog("processSamplebuffer audio mic");
          if audioEnabled {
              self.micAudioClient.deliverExternalAudio(sampleBuffer: sampleBuffer);
          }
          break
      ...
    • Go to the broadcastFinished method and stop the micAudioClient as shown below
      override func broadcastFinished() {
        self.client.stop();
        self.micAudioClient.stop();
      }
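Putting workaround 2 together, the relevant parts of SampleHandler.swift would look roughly like the sketch below. This is only an outline assembled from the steps above, not the attached file: the stream id is a placeholder, `url` and `token` are assumed to be configured as in the SDK sample, and the AntMediaClientDelegate conformance details are omitted.

```swift
import ReplayKit

class SampleHandler: RPBroadcastSampleHandler, AntMediaClientDelegate {

    let client: AntMediaClient = AntMediaClient.init()          // screen share (video + app audio)
    let micAudioClient: AntMediaClient = AntMediaClient.init()  // audio-only stream for the mic

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // ... existing setup of self.client for the screen-share stream ...

        let micAudioStreamId = "micAudio"  // placeholder; use any unique stream id
        self.micAudioClient.delegate = self
        // `url` and `token` come from the SDK sample's configuration
        self.micAudioClient.setOptions(url: url as! String, streamId: micAudioStreamId,
                                       token: token as? String ?? "",
                                       mode: AntMediaClientMode.publish,
                                       enableDataChannel: true, captureScreenEnabled: true)
        self.micAudioClient.setVideoEnable(enable: false)
        self.micAudioClient.setExternalVideoCapture(externalVideoCapture: false)
        self.micAudioClient.setExternalAudio(externalAudioEnabled: true)
        self.micAudioClient.initPeerConnection()
        self.micAudioClient.start()
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            self.client.deliverExternalVideo(sampleBuffer: sampleBuffer)   // screen video
        case .audioApp:
            self.client.deliverExternalAudio(sampleBuffer: sampleBuffer)   // system audio
        case .audioMic:
            self.micAudioClient.deliverExternalAudio(sampleBuffer: sampleBuffer)  // mic audio
        @unknown default:
            fatalError("Unknown type of sample buffer")
        }
    }

    override func broadcastFinished() {
        self.client.stop()
        self.micAudioClient.stop()
    }
}
```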

I've tested these steps and they worked for me. I'm attaching the edited SampleHandler.swift for your convenience. I hope it helps; let me know if I can assist further.

SampleHandler.swift.txt

doggomir commented 2 years ago

Thanks for the suggested solutions. However, this doesn't solve our problem of receiving/playing audio when using the broadcast extension.

This is our test setup: a peer-to-peer connection between a web client and an iOS client. The audio issue only occurs in the direction of receiving audio on the iOS client (web --> iOS). The other direction works as expected: audio/mic from the iOS client is received on the web client. And of course web <--> web also works.

A comment on the suggested 3. solution (by the way, should there be a 2. solution?): this workaround won't work in our scenario of a peer-to-peer connection, because the Ant Media Server + SDKs won't allow more than 2 clients.

mekya commented 2 years ago

Hi @doggomir , I see.

> The audio issue only occurs in the direction of receiving audio on the iOS client (web --> iOS).

What about mixing the audio on the web side and sending the mixed audio? There is support for mixing audio on the web side. For instance, desktop audio + mic audio are mixed in this line: https://github.com/ant-media/StreamApp/blob/master/src/main/webapp/js/media_manager.js#L373
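For reference, mixing two audio sources conceptually amounts to summing their samples. The linked media_manager.js does this with a Web Audio graph rather than by hand; the function below is only a minimal, self-contained illustration of the idea, not SDK code.

```javascript
// Mix two mono PCM buffers (Float32Array, samples in [-1, 1]) into one.
// Conceptual sketch only; in the browser the SDK mixes MediaStream tracks
// through the Web Audio API instead of touching raw samples.
function mixSamples(a, b) {
  const length = Math.max(a.length, b.length);
  const out = new Float32Array(length);
  for (let i = 0; i < length; i++) {
    const sum = (a[i] || 0) + (b[i] || 0);
    // Clamp to the valid PCM range to avoid wrap-around distortion.
    out[i] = Math.min(1, Math.max(-1, sum));
  }
  return out;
}
```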

> A comment on the suggested 3. solution (by the way, should there be a 2. solution?): this workaround won't work in our scenario of a peer-to-peer connection, because the Ant Media Server + SDKs won't allow more than 2 clients.

Yes, you're right. As you already know, the quick workaround for this scenario is using AMS to relay the audio/video. Sorry about that.

Please let me know if we can help you with anything.

Regards, A. Oguz

Serg-Pogrebnyak commented 2 months ago

@mekya I'm also facing this issue, and maybe the problem is that some APIs are not allowed to be used from an extension?

> Use any API marked in header files with the NS_EXTENSION_UNAVAILABLE macro, or similar unavailability macro, or any API in an unavailable framework. For example, in iOS 8.0, the HealthKit framework and EventKit UI framework are unavailable to app extensions.

https://developer.apple.com/library/archive/documentation/General/Conceptual/ExtensibilityPG/ExtensionOverview.html#//apple_ref/doc/uid/TP40014214-CH2-SW2

I've tried to play sound using AVSpeechSynthesisVoice and AVSpeechUtterance, with no luck. So it looks like, from the extension, we can only stream video and audio FROM the device but not play it ON the device...