Closed — CedricEugeni closed this issue 2 years ago
Can I see an Xcode screenshot? Like this: https://github.com/shogo4405/HaishinKit.swift/issues/889
@shogo4405 yes. At first glance, it seems to be the exact same error code in the console for the `converter > xxx` line, and the same stack trace as the screenshot in the issue you mentioned.
I'll send you the screenshot in a few minutes in another message 🙂
@shogo4405 here's the screenshot you asked for 🙂
mChannelsPerFrame: 4
HaishinKit doesn't support 4 channels. I can't reproduce 4 channels with an iPhone 13 Pro + iOS 15.6. How can I reproduce it?
```
2022-22-08 22:25:44.251 [Info] [com.haishinkit.HaishinKit] [AudioCodec.swift:99] formatDescription > Optional(<CMAudioFormatDescription 0x282890460 [0x1e861c1b8]> {
    mediaType:'soun'
    mediaSubType:'aac '
    mediaSpecific: {
        ASBD: {
            mSampleRate: 48000.000000
            mFormatID: 'aac '
            mFormatFlags: 0x2
            mBytesPerPacket: 0
            mFramesPerPacket: 1024
            mBytesPerFrame: 0
            mChannelsPerFrame: 1
            mBitsPerChannel: 0 }
        cookie: {(null)}
        ACL: {(null)}
        FormatList Array: {
            Index: 0
            ChannelLayoutTag: 0x640001
            ASBD: {
                mSampleRate: 48000.000000
                mFormatID: 'aac '
                mFormatFlags: 0x2
                mBytesPerPacket: 0
                mFramesPerPacket: 1024
                mBytesPerFrame: 0
                mChannelsPerFrame: 1
                mBitsPerChannel: 0 }}
    }
    extensions: {(null)}
})
```
> mChannelsPerFrame: 4
> HaishinKit doesn't support 4 channels. I can't reproduce 4 channels with an iPhone 13 Pro + iOS 15.6. How can I reproduce it?
What I did was use `AVAudioSession.sharedInstance()` with the `.playAndRecord` category and the `.videoChat` mode.
I use an iPhone 13 mini on the latest iOS version.
Are you able to reproduce it using these same parameters?
Let me know if I can be of any help to debug / inspect this issue 🙂
Yes, I know... But I can't reproduce it on an iPhone 13 Pro.
`try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .videoChat, options: [])`
Please provide more information.
I used the following:
try session.setCategory(.playAndRecord, mode: .videoChat, options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker])
Maybe it also depends on whether you use the front or back camera? 🤔
The other potentially tricky thing to reproduce that I can think of is that, as I mentioned in the issue, I use the WebRTC framework at the same time to enable an audio chat. The WebRTC framework tends to automatically override the AVAudioSession to the .audioChat mode,
and I have to change it back manually when an event occurs. Maybe this can introduce an issue in the audio session? 🤔
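One way to deal with WebRTC silently reconfiguring the session is to re-apply your preferred configuration whenever the audio route changes. This is only a sketch, not something HaishinKit provides: `AudioSessionGuard` is a hypothetical name, and `routeChangeNotification` may not fire for every mode change WebRTC makes.

```swift
import AVFoundation

// Hypothetical helper: re-applies the session configuration whenever the
// audio route changes (e.g. after WebRTC reconfigures the session).
final class AudioSessionGuard {
    private var observer: NSObjectProtocol?

    func start() {
        applyPreferredConfiguration()
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.applyPreferredConfiguration()
        }
    }

    private func applyPreferredConfiguration() {
        let session = AVAudioSession.sharedInstance()
        do {
            try session.setCategory(.playAndRecord,
                                    mode: .videoChat,
                                    options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker])
            try session.setActive(true)
        } catch {
            print("Failed to restore audio session:", error)
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```

Keep a single instance alive for the lifetime of the call; re-applying the category on every route change is cheap compared to debugging a surprise 4-channel format.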
I tested with the back camera, but also the front camera. I think so. https://developer.apple.com/documentation/avfaudio/avaudiosession/1616481-setpreferredoutputnumberofchanne
How about:
`try AVAudioSession.sharedInstance().setPreferredOutputNumberOfChannels(1)`
I'll try it in a few moments, but thanks for pointing this API out! 🙂 Should I set it to 1 or to 2, in your opinion? 🤔
Yes. In the first place, I believe the AudioConverter can't support more than 2 channels, and neither does HaishinKit. So, as a workaround, set it to 1 or 2 channels.
If I leave `defaultChannels` in the AudioCodec file set to 0, it still crashes when I use `setPreferredOutputNumberOfChannels` to set it to 1 or 2; both give the same result.
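When debugging this, it can help to log what the session actually negotiated: the "preferred" value is only a request, and the effective channel count can differ. A small sketch using the real AVAudioSession properties (the logging itself is just illustrative):

```swift
import AVFoundation

// The preferred channel count is a request; check what was actually granted.
let session = AVAudioSession.sharedInstance()
try session.setPreferredOutputNumberOfChannels(1)
print("preferred output channels:", session.preferredOutputNumberOfChannels)
print("actual output channels:   ", session.outputNumberOfChannels)
print("actual input channels:    ", session.inputNumberOfChannels)
print("max output channels:      ", session.maximumOutputNumberOfChannels)
```

If `inputNumberOfChannels` still reports 4 here, the crash is consistent with the converter receiving an unsupported 4-channel input format regardless of the preferred output setting.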
I just thought that maybe one thing causing the crash is that I also use `try session.overrideOutputAudioPort(.speaker)` to have audio on the iPhone speaker.
I'm going to test what happens if I use `setPreferredOutputNumberOfChannels` to set it to 2 (or 1) and also change `defaultChannels` to 2, just to try and see what happens 🤔
It seems that if I set `defaultChannels` to 2, I only have sound on the left side, and it doesn't matter whether I use `setPreferredOutputNumberOfChannels` with a value of 2 or 1.
I was thinking: is it possible that the `nonInterleaved` value may be wrong and that it picks the wrong channel for the right side? 🤔
And also, would it be "easily" possible to tell the converter that, if we have a configuration with 4 channels (in my case, but possibly more), it only takes some of them to produce stereo audio? 🤔
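The channel selection described above can be sketched with `AVAudioConverter`'s `channelMap` property. This is not what HaishinKit does internally; it's just an illustration, and picking source channels 0 and 1 is an assumption (an external 4-channel mic may put the useful signal elsewhere).

```swift
import AVFoundation

// Sketch: pick two of the four input channels to build a stereo stream.
// channelMap has one entry per *output* channel; each value is the index of
// the input channel to copy (-1 would mean silence).
let inputFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 4)!
let outputFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

if let converter = AVAudioConverter(from: inputFormat, to: outputFormat) {
    converter.channelMap = [0, 1]  // left <- input ch 0, right <- input ch 1
    // Feed buffers through converter.convert(to:error:withInputFrom:) as usual.
}
```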
Describe the bug
My app crashes because `converter` is nil when it is accessed in a specific configuration.
For my app, I need to use the `playAndRecord` category and the `videoChat` mode. I need to do this because, alongside my RTMP audio & video stream, I have WebRTC connections to make an audio call, so I improve voice isolation & such using the `videoChat` mode. Now I get different results depending on what I change in HaishinKit's code:

- If I set the `defaultChannels` property in AudioCodec to 2, it works without crashing, but when I download the video+audio I sent to my backend, I only have sound in my left ear.
- If I set the `defaultChannels` property in AudioCodec to 1, it works without crashing, but the sound is now mono, and I'd like to be able to support true stereo since my app works with external mics.

To Reproduce
I just launch my RTMP connection & stream and the app crashes.
Expected behavior
It should work without crashing.
Version
latest
Smartphone info.
iPhone 13 mini. Latest version of iOS
Additional context
It seems this code works fine in production on iOS 15 but this version of my app used a version of HaishinKit from December 2021. I updated it last week to be prepared for iOS 16 and now I have this crash.
Screenshots
No response
Relevant log output
No response