shogo4405 / HaishinKit.swift

Camera and Microphone streaming library via RTMP and SRT for iOS, macOS, tvOS and visionOS.
BSD 3-Clause "New" or "Revised" License

App crash getting converter in AudioCodec #1049

Closed CedricEugeni closed 2 years ago

CedricEugeni commented 2 years ago

Describe the bug

My app crashes because the converter is nil when the code tries to get it in a specific configuration.

For my app, I need to use the playAndRecord category and the videoChat mode. I need this because, alongside my RTMP audio & video stream, I have WebRTC connections for an audio call.

So I use the videoChat mode to improve voice isolation.

Now I get different results depending on whether I change something in HaishinKit's code.
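
For reference, a minimal sketch of this kind of session setup (the exact call I use appears later in this thread):

    import AVFoundation

    // Sketch of the session configuration described above: playAndRecord for
    // simultaneous capture and playback, videoChat for voice isolation.
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                mode: .videoChat,
                                options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker])
        try session.setActive(true)
    } catch {
        print("audio session configuration failed: \(error)")
    }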

To Reproduce

I just launch my RTMP connection & stream and the app crashes.

Expected behavior

It should work without crashing.

Version

latest

Smartphone info.

iPhone 13 mini. Latest version of iOS

Additional context

It seems this code works fine in production on iOS 15, but that version of my app used a HaishinKit version from December 2021. I updated it last week to be ready for iOS 16, and now I have this crash.

Screenshots

No response

Relevant log output

No response

shogo4405 commented 2 years ago

Can I see an Xcode screenshot? Like this one: https://github.com/shogo4405/HaishinKit.swift/issues/889

CedricEugeni commented 2 years ago

@shogo4405 yes. At first glance, it seems to be the exact same error code in the console for the converter > xxx line and the same stack trace as in the screenshot in the issue you mentioned. I'll send you the screenshot in another message in a few minutes šŸ˜Š

CedricEugeni commented 2 years ago

@shogo4405 here's the screenshot as you asked šŸ˜Š

(Screenshot: SCR-20220817-lxz)

shogo4405 commented 2 years ago

mChannelsPerFrame: 4

HaishinKit doesn't support 4 channels. I can't reproduce the 4-channel case with an iPhone 13 Pro + iOS 15.6. How can I reproduce it?

2022-22-08 22:25:44.251 [Info] [com.haishinkit.HaishinKit] [AudioCodec.swift:99] formatDescription > Optional(<CMAudioFormatDescription 0x282890460 [0x1e861c1b8]> {
    mediaType:'soun' 
    mediaSubType:'aac ' 
    mediaSpecific: {
        ASBD: {
            mSampleRate: 48000.000000 
            mFormatID: 'aac ' 
            mFormatFlags: 0x2 
            mBytesPerPacket: 0 
            mFramesPerPacket: 1024 
            mBytesPerFrame: 0 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 0  } 
        cookie: {(null)} 
        ACL: {(null)}
        FormatList Array: {
            Index: 0 
            ChannelLayoutTag: 0x640001 
            ASBD: {
            mSampleRate: 48000.000000 
            mFormatID: 'aac ' 
            mFormatFlags: 0x2 
            mBytesPerPacket: 0 
            mFramesPerPacket: 1024 
            mBytesPerFrame: 0 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 0  }} 
    } 
    extensions: {(null)}
})
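
One thing that might help: log what the session reports right before the stream starts (a quick sketch, independent of HaishinKit):

    import AVFoundation

    // Quick check, outside HaishinKit: what does the session actually report
    // once the category and mode are applied?
    let session = AVAudioSession.sharedInstance()
    print("input channels:", session.inputNumberOfChannels)
    print("output channels:", session.outputNumberOfChannels)
    print("current route:", session.currentRoute)
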
CedricEugeni commented 2 years ago

mChannelsPerFrame: 4

HaishinKit doesn't support 4 channels. I can't reproduce the 4-channel case with an iPhone 13 Pro + iOS 15.6. How can I reproduce it?

What I did was configure AVAudioSession.sharedInstance() with the .playAndRecord category and the .videoChat mode. I use an iPhone 13 mini on the latest iOS version.

Are you able to reproduce it with these same parameters?

Let me know if I can be of any help to debug / inspect this issue šŸ˜Š

shogo4405 commented 2 years ago

Yes, I know... but I can't reproduce it on an iPhone 13 Pro.

try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .videoChat, options: [])

Please give me more information.

CedricEugeni commented 2 years ago

I used the following:

try session.setCategory(.playAndRecord, mode: .videoChat, options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker])

Maybe it also depends on whether you use the front or the back camera? šŸ¤”

The other thing I can think of that could make this tricky to reproduce is that, as I mentioned in the issue, I use the WebRTC framework at the same time to enable an audio chat. The WebRTC framework tends to automatically override the AVAudioSession mode to .voiceChat, and I have to change it back manually when an event occurs. Maybe this can introduce an issue in the audio session? šŸ¤”
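
For reference, this is roughly how I put the session back when that happens (just a sketch; listening for route changes is only one possible trigger, not necessarily what WebRTC requires):

    import AVFoundation

    // Sketch: re-apply my category/mode whenever the route changes, since
    // WebRTC may have reconfigured the session in the meantime.
    let routeChangeObserver = NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.playAndRecord,
                                 mode: .videoChat,
                                 options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker])
    }
    // Keep routeChangeObserver around and remove it when the stream stops.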

shogo4405 commented 2 years ago

I tested with the back camera, and also with the front camera. I think so. See https://developer.apple.com/documentation/avfaudio/avaudiosession/1616481-setpreferredoutputnumberofchanne

How about AVAudioSession.sharedInstance().setPreferredOutputNumberOfChannels(1)?
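
For example, something like this after the session is configured (just a sketch of the workaround; the clamp against maximumOutputNumberOfChannels is only a suggestion):

    import AVFoundation

    // Sketch of the suggested workaround: after the category/mode are set and
    // the session is active, ask for at most 2 output channels (or 1 for mono).
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setActive(true)
        let preferred = min(session.maximumOutputNumberOfChannels, 2)
        try session.setPreferredOutputNumberOfChannels(preferred)
    } catch {
        print("could not set preferred output channels: \(error)")
    }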

CedricEugeni commented 2 years ago

I'll try it in a few moments, but thanks for pointing this API out! šŸ˜Š Should I set it to 1 or to 2, in your opinion? šŸ¤”

shogo4405 commented 2 years ago

Yes. In the first place, I believe the AudioConverter can't support more than 2 channels, and neither can HaishinKit. So, as a workaround, set it to 1 or 2 channels.

CedricEugeni commented 2 years ago

If I leave defaultChannels in the AudioCodec file at 0, it still crashes when I use setPreferredOutputNumberOfChannels to set it to 1 or to 2; both give the same result.

I just thought that maybe one thing causing the crash is that I also use try session.overrideOutputAudioPort(.speaker) to route audio to the iPhone speaker.

I'm going to test what happens if I use setPreferredOutputNumberOfChannels to set it to 2 (or 1) and also change defaultChannels to 2, just to try and see what happens šŸ¤”

CedricEugeni commented 2 years ago

It seems that if I set defaultChannels to 2, I only get sound on the left side, and it doesn't matter whether I use setPreferredOutputNumberOfChannels with a value of 2 or 1.

I was thinking: is it possible that the nonInterleaved value is wrong and that it picks the wrong channel for the right side? šŸ¤”

Also, would it be "easily" possible to tell the converter that, when we have a configuration with 4 channels (as in my case) or more, it should only take some of them to produce stereo audio? šŸ¤”
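
For example, something along these lines might remap the first two capture channels to stereo before encoding (just a sketch with AVAudioConverter, not HaishinKit's internal converter; the quadraphonic input layout is my assumption about what the session delivers):

    import AVFoundation

    // Sketch only, not HaishinKit's internal code: remap a 4-channel capture
    // format down to stereo before encoding. The quadraphonic input layout is
    // an assumption about what the session delivers in this configuration.
    guard let quadLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Quadraphonic) else {
        fatalError("unsupported channel layout")
    }
    let inputFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                    sampleRate: 48_000,
                                    interleaved: false,
                                    channelLayout: quadLayout)
    guard let outputFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2),
          let converter = AVAudioConverter(from: inputFormat, to: outputFormat) else {
        fatalError("could not create converter")
    }
    // Keep only the first two input channels for the stereo output.
    converter.channelMap = [0, 1]
    // Buffers would then go through converter.convert(to:error:withInputFrom:).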