webrtc-sdk / Specs

WebRTC SDK for iOS/macOS (CocoaPods Specs)
MIT License

Audio not working properly #3

Closed dmichelutti closed 1 year ago

dmichelutti commented 1 year ago

After adopting release 92.4515.05, audio is no longer working properly, and none of the workarounds we have tried in our project seem to help.

To establish the connection I tried setting the previous category and mode (AVAudioSessionCategoryPlayAndRecord and AVAudioSessionModeVoiceChat), but the microphone does not work. After a few seconds we receive an error callback: "Did Fail To Gather Ice Candidate code 710".

Is there anything we need to change in our project in order to adopt the latest version of the pod?
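For concreteness, the category/mode restore described above would look roughly like this. This is a hedged sketch, not the reporter's actual code: `RTCAudioSession` is the Objective-C session wrapper shipped with WebRTC iOS builds, and configuration changes made through it are supposed to go between `lockForConfiguration()`/`unlockForConfiguration()` calls.

```swift
import AVFoundation
// RTCAudioSession is provided by the WebRTC framework.

// Sketch: restore the pre-92.x session parameters through WebRTC's wrapper
// so the SDK's internal session bookkeeping stays consistent.
func restoreVoiceChatSession() {
    let session = RTCAudioSession.sharedInstance()
    session.lockForConfiguration()
    defer { session.unlockForConfiguration() }
    do {
        try session.setCategory(AVAudioSession.Category.playAndRecord.rawValue)
        try session.setMode(AVAudioSession.Mode.voiceChat.rawValue)
        try session.setActive(true)
    } catch {
        print("audio session restore failed: \(error)")
    }
}
```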

davidzhao commented 1 year ago

The latest version is 104.5112.10, could you give that a try? 92.xx is quite old. We won't be able to support old builds.

dmichelutti commented 1 year ago

Sorry for the late response. We have tried the latest versions, but audio still does not work well: we are unable to join a call with both audio and video working properly. As I mentioned in my previous post, since the changes made in that specific version (92.4515.05), no audio comes through, even if we try to restore the correct parameters on the AVAudioSession.

Do you have a quick fix to make audio work with the latest versions, or can you tell me how to get audio working correctly?

rafaelnobrekz commented 6 months ago

I'm having similar issues when I add a route change notification handler (to have the sound come through the speaker instead of the earpiece). How can I achieve that with this fork? It works on official WebRTC M114 (and all other releases), but there is absolutely no sound on the M114 release from this fork.

davidzhao commented 6 months ago

@rafaelnobrekz can you share a bit about how you are using it? We've made a few changes to the AVSession handling to prevent it from acquiring the microphone unnecessarily. We use this library in production with LiveKit's mobile SDKs and have not had any issues with audio.
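When an app wants full control of the AVAudioSession, upstream WebRTC's Objective-C wrapper exposes a "manual audio" mode that tells the SDK to stop configuring and activating the session on its own. A hedged sketch follows; the property names (`useManualAudio`, `isAudioEnabled`) come from upstream `RTCAudioSession.h` and should be verified against the version of this fork you build:

```swift
import AVFoundation
// RTCAudioSession is provided by the WebRTC framework.

// Sketch: take over session management before creating the peer connection,
// so the SDK does not reconfigure or deactivate the session behind the app.
func enableManualAudioMode() {
    let session = RTCAudioSession.sharedInstance()
    session.useManualAudio = true   // SDK stops driving setActive/category itself
    session.isAudioEnabled = false  // keep audio I/O off until the app is ready
}

// Later, once the app has configured and activated AVAudioSession itself:
func startAudio() {
    RTCAudioSession.sharedInstance().isAudioEnabled = true
}
```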

rafaelnobrekz commented 6 months ago

@davidzhao Hi, sure, thanks for following up. The code is basically the following (it has been in production in another part of this app for a few years now). We set up the audio track, then customize some audio session properties and register a route change notification so we can force output to the speaker unless headphones are plugged in; we don't want the earpiece to be used. With this setup I get absolutely no sound output (statistics report audioLevel 0.0). If I don't set any of it, audio works as expected.

func setupLocalAudioTrack() {
    let audioSource = peerConnectionFactory.audioSource(with: nil)
    let audioTrack = peerConnectionFactory.audioTrack(with: audioSource, trackId: "localAudio")
    // audioTrack is added to the peer connection elsewhere
    setupAudioSession()
}

private func setupAudioSession() {
    let rtcAudioSession = RTCAudioSession.sharedInstance()
    rtcAudioSession.lockForConfiguration()
    do {
        var categoryOptions: AVAudioSession.CategoryOptions = [.defaultToSpeaker, .allowBluetoothA2DP, .allowAirPlay, .allowBluetooth]
        if #available(iOS 14.5, *) {
            try rtcAudioSession.session.setPrefersNoInterruptionsFromSystemAlerts(true)
            categoryOptions.insert(.overrideMutedMicrophoneInterruption)
        }
        try rtcAudioSession.setCategory(AVAudioSession.Category.playAndRecord.rawValue, with: categoryOptions)
        try rtcAudioSession.setMode(AVAudioSession.Mode.voiceChat.rawValue)
        try rtcAudioSession.setActive(true)
    } catch {
        Logger.log(error.localizedDescription)
    }
    rtcAudioSession.unlockForConfiguration()

    NotificationCenter.default.addObserver(self, selector: #selector(routeChange), name: AVAudioSession.routeChangeNotification, object: nil)
}

@objc private func routeChange(_ n: Foundation.Notification) {
    func hasHeadphonesOrHeadset(in routeDescription: AVAudioSessionRouteDescription) -> Bool {
        return routeDescription.outputs.contains { $0.portType == .headphones || $0.portType == .bluetoothHFP || $0.portType == .bluetoothA2DP }
    }

    let session = AVAudioSession.sharedInstance()
    // These were read while debugging route state; they are not otherwise used.
    let inputs = session.currentRoute.inputs
    let availableInputs = session.availableInputs
    let inputGain = session.inputGain
    let currentRoute = session.currentRoute
    let headphonesOrHeadsetConnected = hasHeadphonesOrHeadset(in: session.currentRoute)

    // Force speaker output unless a headset/headphone route is active.
    try? session.overrideOutputAudioPort(headphonesOrHeadsetConnected ? .none : .speaker)
    if session.isInputGainSettable {
        try? session.setInputGain(1)
    }
}
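One thing worth noting about the snippet above: `.defaultToSpeaker` already routes output to the speaker for the play-and-record category, and re-forcing the route from a notification handler can fight the SDK's own session handling. An alternative, sketched below, is to express the same intent through `RTCAudioSessionConfiguration`, so every internal (re)configuration keeps the desired options. The class, its `webRTC()` accessor, and `setConfiguration(_:)` come from upstream WebRTC's Objective-C API; this is an assumption to verify against this fork, not a confirmed fix:

```swift
import AVFoundation
// RTCAudioSession / RTCAudioSessionConfiguration are provided by the WebRTC framework.

// Sketch: apply the desired category, mode, and options as one configuration
// instead of overriding the output route from a route-change handler.
func applySpeakerConfiguration() throws {
    let config = RTCAudioSessionConfiguration.webRTC()
    config.category = AVAudioSession.Category.playAndRecord.rawValue
    config.mode = AVAudioSession.Mode.voiceChat.rawValue
    config.categoryOptions = [.defaultToSpeaker, .allowBluetooth, .allowBluetoothA2DP]

    let session = RTCAudioSession.sharedInstance()
    session.lockForConfiguration()
    defer { session.unlockForConfiguration() }
    try session.setConfiguration(config)
}
```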