airsdk / Adobe-Runtime-Support

Report, track and discuss issues in Adobe AIR. Monitored by Adobe - and HARMAN - and maintained by the AIR community.

Adobe Air Nellymoser Codec crashes iPhone since iOS 14.6 #1010

Open apofis1969 opened 3 years ago

apofis1969 commented 3 years ago

Problem Description

We are creating a live web stream to a web player using Wowza and Ant Media. The problem is that an iPhone on iOS 14.6 can no longer stream using Nellymoser: the app crashes completely. Wowza does not accept Speex as the sound codec, and the H264 feature of Adobe AIR is also not working, as Wowza does not recognise the H264 stream. We have to use PCMU on the iPhone to get a live stream to Ant Media via a transcoder. The iPad (iPad Pro 12.9, 2nd generation) on iOS 14.7 sends the stream using Nellymoser without any problems. No idea how newer devices react...

It is really a pity that H264VideoStreamSettings does not work properly for live streaming, and that there is no AAC codec or other modern sound codec (AAC, Vorbis or Opus) that would let us use the full power of Wowza without an FFmpeg-based transcoder.

Known Workarounds

Substitute Nellymoser with PCMU
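
A minimal sketch of this workaround, based on the microphone settings posted later in this thread (the helper name is a placeholder; the returned Microphone is attached to the publishing NetStream elsewhere in the app):

// Workaround sketch (hypothetical helper inside the app's class): request a
// plain microphone and select PCMU instead of the crashing Nellymoser codec.
// Assumes flash.media.Microphone and flash.media.SoundCodec are imported.
private function getPcmuMicrophone() : Microphone
{
    var mic : Microphone = Microphone.getMicrophone();
    mic.codec = SoundCodec.PCMU;       // substitute for SoundCodec.NELLYMOSER
    mic.gain = 70;
    mic.setLoopBack(false);
    mic.setUseEchoSuppression(true);
    mic.setSilenceLevel(0);
    return mic;                        // attach to the publishing NetStream as usual
}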

ajwfrost commented 3 years ago

Hi @apofis1969

We're looking at a Nellymoser test case running on an older iPhone on iOS 14.6 but aren't seeing the crash.. do you have any diagnostic info or a crash report, e.g. from Settings -> Privacy -> Analytics & Improvements -> Analytics Data, and then a report with your application name?

We may need to send you a custom SDK to try to debug this - possibly first we'll send over a basic SWF/XML to check whether a basic test case also fails on your device..

On the H.264/AAC encoding, that was something that had to be removed due to the codec patent license but is something we are looking at re-introducing via platform APIs, as part of our revised multimedia framework.

thanks

apofis1969 commented 3 years ago

Hi @ajwfrost, here is the TestFlight crash report. Hope this helps. crashlog.txt

ajwfrost commented 3 years ago

Thanks @apofis1969 - looks interesting. We've written into a stack guard 32 bytes before its end, which suggests that thread 17 has put too much onto the stack - maybe a recursive call is going on. The call stack there isn't complete, though (which can sometimes happen when things go wrong!), and for some reason none of the thread names have been output properly..

Would it be possible to see whether you also get a crash using the below simple test case? From what we've seen of the code, the Nellymoser compression is being applied even when you do a loopback test so if this crashes we'll be able to set up some simpler logging/debugging. We've not managed to reproduce it here (either with this test case or with a case that compresses and uploads to a media server) but we don't have an iPhone 12 with iOS 14.6 on it (are you aware of any other device/OS combinations on which it fails?)

thanks

package 
{
    import flash.display.Sprite;
    import flash.events.ActivityEvent;
    import flash.events.Event;
    import flash.events.PermissionEvent;
    import flash.events.StatusEvent;
    import flash.media.Microphone;
    import flash.media.SoundTransform;
    import flash.permissions.PermissionManager;
    import flash.permissions.PermissionStatus;

    public class nellymoser extends Sprite
    {
        private var _mic : Microphone;

        public function nellymoser()
        {
            trace("Starting Nellymoser test: checking for Microphone permission");
            var pm:PermissionManager = Microphone.permissionManager;
            if (pm.permissionStatus == PermissionStatus.UNKNOWN)
            {
                pm.addEventListener(PermissionEvent.PERMISSION_STATUS, onPermission);
                pm.requestPermission();
            }
            else if (pm.permissionStatus == PermissionStatus.GRANTED)
            {
                startMicrophoneListener();
            }
        }

        private function onPermission(e : PermissionEvent) : void
        {
            (e.target as PermissionManager).removeEventListener(PermissionEvent.PERMISSION_STATUS, onPermission);
            if (e.status == PermissionStatus.GRANTED)
            {
                startMicrophoneListener();
            }
        }

        private var _micStatus : Sprite;

        private function startMicrophoneListener() : void
        {
            trace("Listening out for activity..");
            _mic = Microphone.getMicrophone();
            _mic.setUseEchoSuppression(true);
            _mic.addEventListener(ActivityEvent.ACTIVITY, activityHandler);
            _mic.addEventListener(StatusEvent.STATUS, statusHandler);
            // Volume 0 so the loopback doesn't cause audible feedback; the
            // microphone data is still run through the Nellymoser encoder.
            var sndXF : SoundTransform = new SoundTransform(0);
            _mic.soundTransform = sndXF;
            _mic.setLoopBack();
            _micStatus = new Sprite();
            _micStatus.addEventListener(Event.ENTER_FRAME, displayMicStatus);
            addChild(_micStatus);
        }

        private function displayMicStatus(e : Event) : void
        {
            _micStatus.graphics.beginFill(0x000080);
            _micStatus.graphics.drawRect(0, 0, 120, 30);
            _micStatus.graphics.endFill();
            _micStatus.graphics.beginFill(0x00ff00);
            _micStatus.graphics.drawRect(10, 10, _mic.activityLevel, 10);
            _micStatus.graphics.endFill();
        }

        private function activityHandler(e : ActivityEvent) : void
        {
            trace("Activity -> " + _mic.activityLevel);
        }
        private function statusHandler(e : StatusEvent) : void
        {
            trace("Status -> " + e);
        }
    }
}
apofis1969 commented 3 years ago

@ajwfrost I think you forgot the sound codec. I will try this, but within the soundcontroller.as it will cause problems. I will send you an update.

ajwfrost commented 3 years ago

Thanks. The default encoder, I think, is one of the Nellymoser formats .. mostly we're just trying to get as simple a case together as possible to make it easier to debug!

It's an iPhone 11 though, you mention? I thought I'd seen it as an iPhone 12. I will just check on our hardware here to see if we have the right combination of handset and iOS version!

apofis1969 commented 3 years ago

Hi @ajwfrost, the code does not work for me this way, as my SoundsController.as sends the microphone settings to the stream. I have to adapt the code and implement it in my SoundsController.as. It is an iPhone 11. Attached is my soundscontroller.as; where I have now added the PCMU codec there used to be Nellymoser, and that is where the app crashes. It is the sound codec for a stream to Wowza. soundscontroller.txt

ajwfrost commented 3 years ago

Okay thanks - we can probably craft a test case using this, but I would be curious whether the above AS3 code would also crash on your device if it was run as a completely new/separate application, without any of the server-side connections. Under the hood these should both have an audio codec thread running to compress the raw microphone data into the Nellymoser format...

apofis1969 commented 3 years ago

Hi @ajwfrost, I use the Nellymoser codec to record a custom sound at the pet station to calm down the pet. That does not crash the app, because it is used internally without streaming. If I use the Nellymoser codec on an iPhone 11 with iOS 14.6 to send a live stream, the stream does not work and the app crashes. I will implement your code in the soundscontroller.as. Unfortunately I have no time to create a new, simple streaming app. Fact is that the simulators on Mojave with AIR 259 (all other versions do not work anymore) work with Nellymoser, as does the iPad (iPad Pro 2nd generation 12.9") on iOS 14.6 with the latest Adobe AIR. Today I will update the iPad to iOS 14.7.1 and see what happens. I have no idea how the new generation of iPads reacts and whether they crash too, because I do not have any device to test with.

ajwfrost commented 3 years ago

Hi @apofis1969 - thanks, although we're not suggesting you integrate our code into yours ... I am just wondering whether, if you compile that AS into a SWF and then into an IPA file for your handset, it also crashes. The recording/replaying that you're doing looks to be via raw sound samples, which wouldn't pass through the Nellymoser encoder. So for the encoder to run, normally that would mean you're sending buffers to the remote server, but it also looks like the encoder is active when the "loopback" function is enabled on the microphone. So our test case was purely a simple/standalone application to listen to the microphone in loopback mode..

Simulators on Mojave ... I am using later versions of the AIR SDK with simulators on Mojave, so can you raise a separate issue about that to clarify what you're trying, and we'll look into it?

thanks

apofis1969 commented 3 years ago

Hi @ajwfrost, I think we do not understand each other. Nellymoser crashes the iPhone when the codec is used for a live stream. Everything else is working. Your code does not crash the app because it is not connected to the live stream. I do not know how else to explain this. It works on the Mac and my iPad, but not on the iPhone. The encoder can not read the sound codec the iPhone is sending with the video; this is why the iPhone crashes.

ajwfrost commented 3 years ago

There are some assumptions in here ... why do you think it only crashes when using the codec for a live stream? I believe it still uses the codec when it runs the internal loopback code (we checked this by running it on a desktop with breakpoints in the microphone compression thread). So if the problem is just within the Nellymoser codec, it should be hit by a loopback test that compresses the audio just as much as by a test that uploads the compressed audio..

But it would definitely then tell us something, if the loopback test works fine but the live-stream app doesn't, so it would help us to know where to investigate..

I think we need to put in some debugging/tracing into the runtime and have you build your standard app with this, so that we can get some more information out of it!

thanks

apofis1969 commented 3 years ago

Hi @ajwfrost, I will do it this way: integrate the code into my app behind an extra button and test it. Then we will see if the app crashes. I will do that at the weekend. Hope this will help. You really believe me, then, that this is about sending the stream with Nellymoser :-)

ajwfrost commented 3 years ago

Thanks @apofis1969, that would be very useful. Just trying to work out whether the crash is with nellymoser itself or with the streaming part of it...

apofis1969 commented 3 years ago

Hi @ajwfrost, sorry I am late with the Nellymoser testing, but after the second COVID vaccination I felt very unwell. I will do it this week.

apofis1969 commented 2 years ago

Hi @ajwfrost, the Nellymoser codec now crashes the app completely with iOS 15.0.1 on an iPhone 11, even when it is not added to the video stream. The microphone does not accept the Nellymoser codec: the activityLevel of the microphone does not show any activity. With PCMU or Speex, the microphone does display the activityLevel. Please see the log file I created with TestFlight. With the iPad 12.9 2nd generation everything works. But who knows with the newer devices. It is a huge bug, because it completely breaks an important feature. crashlog.txt

ajwfrost commented 2 years ago

Hi

We'll look into this one again ... the crash report is a little odd:

Thread 17 Crashed:
0   libsystem_malloc.dylib          0x0000000192ef9e6c small_madvise_free_range_no_lock + 4 (magazine_small.c:1091)
1   libsystem_malloc.dylib          0x0000000192efb1b0 free_small + 948 (magazine_small.c:1390)
2   MyAppApp                        0x0000000104e29bc4 0x104abc000 + 3595204
3   MyAppApp                        0x0000000104e29bd0 0x104abc000 + 3595216

So it's not crashing on the main UI thread but we're also not seeing the thread_start call at the bottom of the thread...

It looks like a stack/buffer overrun or similar though:

VM Region Info: 0x16b3d3ff0 is in 0x16b3d0000-0x16b3d4000;  bytes after start: 16368  bytes before end: 15
--->  STACK GUARD              16b3d0000-16b3d4000 [   16K] ---/rwx SM=NUL  ...for thread 17
      Stack                    16b3d4000-16b45c000 [  544K] rw-/rwx SM=PRV  thread 17

so we've dropped 16 bytes down into the stack guard memory area.

I guess if you're able to crash / not crash based on the codec setting, it's fairly clearly a fault within Nellymoser! Thanks for the details, I'll try to get someone to reproduce this here...

apofis1969 commented 2 years ago

@ajwfrost thank you. The crash report does not show any more details. But the fact is that the codec crashes the microphone and does not produce any activity. I am now trying with the enhanced microphone. As soon as I have tested, I will inform you.

ajwfrost commented 2 years ago

Hi @apofis1969 - still struggling to reproduce this one with a simpler test case here .. are you able to hook up your phone to Scout to do some debugging, if we send you a custom version of the AIR SDK? Or if you've got a Mac, can you connect and use the Console app to get logs from it?

If so -> we can start adding some output into the trace or console log stream so that we can confirm what functions are at what addresses, and see if there are any memory operations going off the end of the stack..

thanks

apofis1969 commented 2 years ago

Hi @ajwfrost, I could work around the bug using this code now:

    _mic = Microphone.getEnhancedMicrophone();
    _mic.codec = SoundCodec.NELLYMOSER;
    _mic.rate = 44;
    _mic.gain = 70;
    _mic.setLoopBack(false);
    _mic.setUseEchoSuppression(true);
    _mic.setSilenceLevel(0);
    GLOB.alert("Recording","");

But it is still strange, because PCMU and SPEEX do not create this crash. If I use _mic = Microphone.getMicrophone() the app crashes. This only happens on the iPhone; tested with Android, macOS, ... no problem. I can do that (connect it to my Mac and get the Console logs), but only next week. Tomorrow I have to travel until Friday.

ajwfrost commented 2 years ago

@apofis1969 thanks for that, I would be interested to know if it's just the fact you're using the enhanced microphone, or whether any of those other settings on their own might be impacting the behaviour. There are different code paths and buffer sizes etc particularly around the sampling rate and echo suppression.

We've still not got a reproduction here though, so would you be able to download the zip from the below message, and extract this over the top of your AIR SDK in the "lib/aot/lib" folder? This will then add some debug output into the generated app that you should be able to see if you connect it to a macOS machine and open up the Console app..

thanks

https://transfer.harman.com/message/8IgUcC4mJ55Kn6gReLpVNQ

apofis1969 commented 2 years ago

Hi @ajwfrost, I downloaded it and will do it, but only at the weekend or on Monday. As I said, this only affects the iPhone; the tablet and the Mac have no problems, which is why I think it is strange. The microphone runs a reset before switching to the other sound codec, so all my log files show the correct sound codec and the correct microphone settings. I think there is a configuration within the hardware that causes this incompatibility. It would be helpful to be able to work with current streaming codecs, at least with AAC; a lot of configuration problems could be resolved with this. Wowza logs an error that Sorenson Spark and the Adobe AIR sound codecs can not be sent at the same time.

apofis1969 commented 2 years ago

Hi @ajwfrost, something went wrong with the archive you sent to me. Packaging failed! Packaging error message: Compilation failed while executing : ld64

Packaging output:

Undefined symbols for architecture armv7:
  "_sdk_aotInfo", referenced from:
      _com.distriqt.PushNotifications-18_79:0:global$init in com.distriqt.PushNotifications-18_79.o
      _com.distriqt.Dialog-9_26:0:global$init in com.distriqt.Dialog-9_26.o
      _com.distriqt.InAppBilling-32_124:0:global$init in com.distriqt.InAppBilling-32_124.o
      _com.distriqt.InAppBilling-2_94:0:global$init in com.distriqt.InAppBilling-2_94.o
      _com.distriqt.Dialog-11_28:0:global$init in com.distriqt.Dialog-11_28.o
      _com.distriqt.AudioRecorder-7_133:0:global$init in com.distriqt.AudioRecorder-7_133.o
      _com.distriqt.Core-0_90:0:global$init in com.distriqt.Core-0_90.o
      ...
  "_builtin_aotInfo", referenced from:
      _com.distriqt.AudioRecorder-1_127:0:global$init in com.distriqt.AudioRecorder-1_127.o
      _com.distriqt.playservices.Base-0_191:0:global$init in com.distriqt.playservices.Base-0_191.o
      _AOTBuildOutput-0000000001_206:13164:com.myapp.views::RemoteLogout/logoutcallback in AOTBuildOutput-0000000001_206_15.o
      _AOTBuildOutput-0000000001_206:13163:com.myapp.views::RemoteLogout/finishRemoteLogout in AOTBuildOutput-0000000001_206_15.o
      _AOTBuildOutput-0000000001_206:13162:com.myapp.views::RemoteLogout/viewRemoteLogout_creationCompleteHandler in AOTBuildOutput-0000000001_206_15.o
      _AOTBuildOutput-0000000001_206:13161:com.myapp.views::RemoteLogout/laterConnect in AOTBuildOutput-0000000001_206_15.o
      _AOTBuildOutput-0000000001_206:13157:com.myapp.views::RemoteLogout/startLogout_clickHandler in AOTBuildOutput-0000000001_206_15.o
      ...
ld: symbol(s) not found for architecture armv7

ajwfrost commented 2 years ago

This looks like it's caused by a mixture of SDKs ... the binaries I sent would need to be extracted onto the 33.1.1.633 version of the SDK, as they only contain the arm64 code; the log above shows it's trying to build for armv7, which means it must be using ADT from an earlier version of the AIR SDK...

thanks

apofis1969 commented 2 years ago

Hi @ajwfrost, I think it is my error - I forgot to update the .profile. Maybe that is it. I will check.

apofis1969 commented 2 years ago

Hi @ajwfrost, sorry, I had a mix-up with the SDKs. Here is the log file now. The app crashed on an iPhone 11 with iOS 15.0.1. Hopefully this helps. crash-nellymoser.txt

ajwfrost commented 2 years ago

@apofis1969 thanks for this .. can I check what the behaviour is on the phone? From these logs, it looks like there is a stream being captured, audio buffers are sent for around 7 seconds, and the crash happens when the stream is stopped..?

default 16:31:35.767600 AIRSDK getting a microphone 0x108978a00
default 16:31:36.099164 Adding sensor usage data for MICROPHONE
default 16:31:36.177263 AIRSDK microphone thread function 0x104ba2274
default 16:31:43.121046 AIRSDK: microphone try to open...
default 16:31:43.121124 AIRSDK setting compression format 96
default 16:31:43.121215 AIRSDK sample count from Nelly rate 44100 = 2048
default 16:31:43.121545 InitializeAudioSession
default 16:31:44.904889 AIRSDK sending a buffer for timestamp 21969
default 16:31:44.906256 AIRSDK: capturing nellymoser audio
...
default 16:31:52.098148 AIRSDK sending a buffer for timestamp 733564
default 16:31:52.098379 AIRSDK: capturing nellymoser audio
default 16:31:52.098558 AIRSDK clearing nelly state
default 16:31:52.098734 AIRSDK setting microphone codec to Speex
default 16:31:52.318487 Parsing corpse data for pid 9709

We can then look in more detail at what is happening during this "clearing state" part, i.e. it may be some clean-up and memory management that's causing the problem rather than any actual encoding issue - or it may be that one of the buffers had already overrun and the issue is only found when it starts to clean up or unwind that thread...
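
For reference, the sequence the log implies on the ActionScript side is roughly this (a sketch only - the method and variable names are placeholders, not your actual code):

// Hypothetical reduction of what the app appears to do when the stream stops:
// Nellymoser audio is captured for a few seconds, then the codec is switched,
// which is where the runtime logs "clearing nelly state" and then crashes.
private function stopWebStream() : void
{
    _stream.attachAudio(null);        // stop publishing the microphone to the NetStream
    _mic.codec = SoundCodec.SPEEX;    // triggers the Nellymoser state clean-up seen in the log
}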

thanks

apofis1969 commented 2 years ago

Hi @ajwfrost, how do you want to check? As it always worked before, I can not understand where the problem is. Please tell me how you want to check it. I just need to send an update first, because I found a bug while testing with Xcode 12.2.

apofis1969 commented 2 years ago

Hi @ajwfrost, testing with the Xcode 12.2 simulators on Catalina I realized that the problem with Nellymoser and the microphone settings starts with the iPhone 11 and iOS 14. Up to the iPhone SE the microphone settings work perfectly; beginning with the iPhone 11 the confusion starts. There is something wrong with the microphone and the enhanced microphone: if I use the enhanced microphone on an iPhone SE, the recording with Nellymoser plays back like fast forward; if I do not use the enhanced microphone with Nellymoser on an iPhone X, the app crashes. I think it is really time to add new video and audio codecs to the SDK. As I have a live streaming app, I depend on this. As I already mentioned, Wowza also logs error messages with the Spark and Speex codecs.

ajwfrost commented 2 years ago

Hi @apofis1969 - we're currently still having problems reproducing the issue here. Wondering whether it's possible for us to build/test it via a simulator - are you able to reproduce this even with "ipa-xxx-interpreter-simulator" builds?

My query earlier had been about when you observe the crash: is it the instant that your app starts streaming, or are you able to stream for a few seconds and receive the audio at your server? The log suggested that the crash happens after streaming has been working for a while, and that it's the app then stopping the stream which caused the crash..

We may need to send you another update to test out if we're not able to get it to reproduce here. But we can do a few more checks based on the log file details first..

thanks

apofis1969 commented 2 years ago

Hi @ajwfrost, this is not a stream; it is an in-app recording. The app crashes during recording or when stopping it. The microphone does not show any reaction. It only works with the enhanced microphone on iPhone X devices. It also crashed the simulator. I have no idea why this suddenly appears. I am really concerned about this.

ajwfrost commented 2 years ago

@apofis1969 if this is local in-app recording then it should make it easier for us to reproduce. The weird thing is that it's only certain devices .. but with the simulators we can try to find out what's going wrong. If we can't get a reproduction, is there any way we can get at your application to install it on devices here (which may mean it needs to be from the App Store or TestFlight)?

thanks

apofis1969 commented 2 years ago

Hi @ajwfrost, I definitely found the problem: it is the enhanced microphone. I removed those settings and there is no problem anymore with Nellymoser. But the video chat function now has huge feedback/echo, even at a distance. For the live streaming I have to use Speex. The enhanced microphone also makes the app very slow. Is there any possibility to improve the enhanced settings? I can't remove the video chat. I am desperate... I will test with the enhanced microphone on the receiver only and the normal settings on the base station. Hopefully there will be a solution for these settings. Do you still want a crashing TestFlight build?

apofis1969 commented 2 years ago

Hi @ajwfrost, I could resolve the problem with this workaround: the publisher uses only getMicrophone to have all the features, the receiver uses getEnhancedMicrophone for the video chat and to reduce echo. This is the code:

    public static function get mic():Microphone{

        if(_mic == null){   
            try{
                if(GLOB.isPublisher == true){
                    if(GLOB.recordingCodec == true){
                        _mic = Microphone.getMicrophone();
                        _mic.codec = SoundCodec.NELLYMOSER;
                        _mic.rate = 44;
                        _mic.gain = 70;
                        _mic.setLoopBack(false);
                        _mic.setUseEchoSuppression(true);
                        _mic.setSilenceLevel(0);
//                      GLOB.alert("Recording: "+_mic.codec,"");
                    }
                    if(ValidateWebStream.webStreamCodec == true){

//                      GLOB.alert("web stream","");
                        _mic = Microphone.getMicrophone();
                        _mic.codec = SoundCodec.NELLYMOSER;
                        _mic.gain = 70;
                        _mic.rate = 22;
                        _mic.setLoopBack(false);
                        _mic.setUseEchoSuppression(true);
                        _mic.setSilenceLevel(0);
//                      GLOB.alert("web stream: "+_mic.codec,"");
                    }
                    if(GLOB.recordingCodec == false && ValidateWebStream.webStreamCodec == false){
                        _mic = Microphone.getMicrophone();
                        _mic.codec = SoundCodec.SPEEX;
                        _mic.rate = 22;//was 11
                        _mic.setLoopBack(false);
                        _mic.framesPerPacket = 1;
                        _mic.encodeQuality = 6;
                        _mic.noiseSuppressionLevel = -30;
                        _mic.setUseEchoSuppression(true);
                        _mic.gain = GLOB.micGain;
                        _mic.setSilenceLevel(5);
//                      GLOB.alert("Gain = "+_mic.gain,"");
                    }
                }
                if(GLOB.isPublisher == false){
                    _mic = Microphone.getEnhancedMicrophone();
                    options = new MicrophoneEnhancedOptions();
                    options.mode = MicrophoneEnhancedMode.FULL_DUPLEX;
                    options.echoPath = 128;
                    options.autoGain = false;
                    options.nonLinearProcessing = true;
                    _mic.enhancedOptions = options;
                    _mic.rate = 22;//was 11
                    _mic.codec = SoundCodec.SPEEX;
                    _mic.gain = 50;
                    _mic.noiseSuppressionLevel = -30;
                    _mic.setSilenceLevel(0);
                    _mic.setLoopBack(false);
                    _mic.framesPerPacket = 1;
                    _mic.encodeQuality = 6;

// GLOB.alert("ReceiverMic",""); // GLOB.alert("web stream: "+_mic.codec,""); } _mic.addEventListener(ActivityEvent.ACTIVITY,micActivity); _mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData); micSpeechListener.addEventListener(TimerEvent.TIMER,onSpeechTimer); }catch(e:Error){

            }
        }
        return _mic;
    }

ajwfrost commented 2 years ago

Okay thanks .. so can I check, what change would you make to the above code in order to make it crash? I don't know which code path would be used - what are the values of GLOB.recordingCodec and ValidateWebStream.webStreamCodec?

Plus, what are you doing in the callbacks for onSampleData and within the speech timer?

Still not able to reproduce the problem here (currently trying on an iPhone 11 / iOS 15, simulator) but we're using loopback(true) so a different usage. I'll see whether we can adapt it to the above and then make it crash...!

thanks

apofis1969 commented 2 years ago

@ajwfrost, this is only a variable to tell the publisher whether the recording or the web stream is in use. This is necessary to use the correct microphone settings.

The previous code was like this:

    private static var options:MicrophoneEnhancedOptions;

    public static function get mic():Microphone {

        if(_mic==null){ 
            try{
                if(GLOB.recordingCodec == true){

//                  _mic = Microphone.getMicrophone();
                    if(GLOB.getIsPhone() == true && GLOB.getIsiPhoneX() == true){
                        _mic = Microphone.getEnhancedMicrophone();
                    }else{
                        _mic = Microphone.getMicrophone();
                    }
                    _mic.codec = SoundCodec.NELLYMOSER;
                    _mic.rate = 44;
                    _mic.gain = 70;
                    _mic.setLoopBack(false);
                    _mic.setUseEchoSuppression(true);
                    _mic.setSilenceLevel(0);
//                  GLOB.alert("Recording: "+_mic.codec,"");
                }

                if(ValidateWebStream.webStreamCodec == true){

// GLOB.alert("web stream",""); // _mic = Microphone.getMicrophone(); _mic = Microphone.getEnhancedMicrophone(); _mic.codec = SoundCodec.PCMU; _mic.gain = 70; // _mic.rate = 44; _mic.setLoopBack(false); _mic.setUseEchoSuppression(true); _mic.setSilenceLevel(0); // GLOB.alert("web stream: "+_mic.codec,""); }

                if(GLOB.recordingCodec == false && ValidateWebStream.webStreamCodec == false){

                    _mic = Microphone.getEnhancedMicrophone();
                    options = new MicrophoneEnhancedOptions();
                    options.mode = MicrophoneEnhancedMode.FULL_DUPLEX;
                    options.echoPath = 128;
                    options.autoGain = false;
                    options.nonLinearProcessing = true;
                    _mic.enhancedOptions = options;

                    _mic.rate = 22;//was 11
                    _mic.codec = SoundCodec.SPEEX;
                    if(GLOB.isPublisher == true){
                        _mic.gain = GLOB.micGain;
                        _mic.setSilenceLevel(5);

// GLOB.alert("Gain = "+_mic.gain,""); }else{ _mic.gain = 70; _mic.setSilenceLevel(0); // GLOB.alert("ReceiverMic",""); }

                    _mic.setLoopBack(false);
                    _mic.framesPerPacket = 1;
                    _mic.encodeQuality = 6;

// GLOB.alert("web stream: "+_mic.codec,""); } // GLOB.alert("Mic: "+_mic.codec,""); _mic.addEventListener(ActivityEvent.ACTIVITY,micActivity); _mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData); micSpeechListener.addEventListener(TimerEvent.TIMER,onSpeechTimer); }catch(e:Error){

            }
        }
        return _mic;
    }

Here both the publisher and the receiver are using the enhanced microphone. I have only one question: does AIR SDK 633 work with the iPhone 6s, 7, 8 and SE (2nd generation) and all the other devices if iOS 15 is NOT installed?

ajwfrost commented 2 years ago

Okay we'll take a look - presumably then it was crashing with the enhanced microphone but is working now with the normal microphone? But earlier you had said:

I could work around the bug using this code now _mic = Microphone.getEnhancedMicrophone(); ... If I use _mic = Microphone.getMicrophone() the app crashes.

We really need to create a very simple, single test case that demonstrates the crash, rather than having 3-4 different code paths with different microphones being obtained and set up! I'm also wondering whether you get any crash if you remove the event listeners for SAMPLE_DATA and whatever you're doing in the timer event..? Or would we need to check the behaviour of those callbacks in case they affect things?
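
Something along these lines would be ideal (a sketch only - the helper name is a placeholder, and getMicrophone()/getEnhancedMicrophone() can be swapped to compare the two paths):

// Minimal single-path configuration for the crash test: no SAMPLE_DATA
// handler and no timer, so the callbacks can be ruled out.
// Assumes flash.media.Microphone and flash.media.SoundCodec are imported.
private function createTestMic() : Microphone
{
    var mic : Microphone = Microphone.getMicrophone();   // or getEnhancedMicrophone()
    mic.codec = SoundCodec.NELLYMOSER;
    mic.rate = 44;
    mic.gain = 70;
    mic.setLoopBack(false);
    mic.setUseEchoSuppression(true);
    mic.setSilenceLevel(0);
    return mic;
}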

thanks

ajwfrost commented 2 years ago

.. and for this:

does AIR SDK 633 work with iPhone 6s, 7, 8 and SE (2nd generation) and all the other devices if iOS 15 is NOT installed?

Yes, it would still work on any (64-bit) iPhone; it doesn't need iOS 15 and should still run on iOS 9...

Does the crash happen regardless of the values of GLOB.recordingCodec and ValidateWebStream.webStreamCodec, then? Although there are three different codecs being used in that code -> you mentioned it being a Nellymoser problem, so presumably the crash only happens if GLOB.recordingCodec is true and ValidateWebStream.webStreamCodec is false?

We've tried setting up the enhanced microphone with those values (assuming GLOB.getIsPhone() == true), but no crashes are happening... but I'm wondering what you're doing to the data when it arrives and whether this could make a difference.

apofis1969 commented 2 years ago

The iPhone can no longer switch between the enhanced and the normal microphone; this is where the bug happens. I had to use different codecs because of this. Now that the publisher only uses getMicrophone, I can use Nellymoser for the web stream and the recording. I need Nellymoser for the web stream so it can be sent through the FFmpeg restreamer to create a stream with H264 and AAC, because no platform can convert Speex. The app-internal live stream uses Speex. Also, the recording only works with Nellymoser; all the other codecs play the recording back in fast-forward mode...

apofis1969 commented 2 years ago

Hi @ajwfrost, I tested the code without the enhanced microphone on the base station. It works, but the video chat does not; it needs the enhanced microphone on both stations. If you want a TestFlight build that crashes, please tell me.

ajwfrost commented 2 years ago

Hi @apofis1969 - when you say the video chat doesn't work, is that because it's crashing in the same way as before? or because you need to have the enhanced microphone in order to make it work?

I'm not sure a TestFlight build would be useful unless you can get an interpreter-based build up there? What we really need to do is get a self-contained, simple ActionScript test case that demonstrates the crash. Are you able to strip away the rest of your application so that all we have is the configuring of the microphone, and check that this still shows the crash?

thanks

apofis1969 commented 2 years ago

Hi @ajwfrost, I need the enhanced microphone to use the video chat on both devices. I can not switch back to the normal microphone settings on the new iOS devices. How can I do an interpreter-based build? I am trying all possibilities to get this running as it was, but until now I only get crashes. I can not strip everything away, because it is too complex and a few days of work. It is really frustrating.

ajwfrost commented 2 years ago

I was thinking of an ADT target such as ipa-debug-interpreter but I guess we'll end up with the issue of this not installing on any devices we have here. Although if you did ipa-debug-interpreter-simulator we should be able to run it on a simulator as they don't enforce the signing/provisioning...?
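
For example, the packaging command would look something like this (a sketch - the certificate, file names and the -platformsdk path are placeholders for your own values; simulator targets don't need a provisioning profile but still take a self-signed .p12):

    adt -package -target ipa-debug-interpreter-simulator -storetype pkcs12 -keystore Certificates.p12 -storepass mypassword MyApp.ipa MyApp-app.xml MyApp.swf -platformsdk /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator.sdk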

So if you can get the application built for a simulator and set up so that it's clear (a) what steps we should take to reproduce the crash, and (b) what bit of code is executing at this point, we may then be able to get a reproduction case together.

The alternative is to help clarify what single change in the AS3 code means that the crash happens, from a "working" (non-crashing) app. And to be able to clarify what microphone construction and configuration is going on, and whether the crash happens only when you are handling the sample data events or not.

thanks

apofis1969 commented 2 years ago

@ajwfrost Case again: Nellymoser works with:

Problem: the base station needs getEnhancedMicrophone, as does the receiver, to have an echo-free video chat. The switch between getEnhancedMicrophone and getMicrophone on the iPhone worked on iOS < 14.6; that configuration never created a bug and has been live for more than a year. I tried the code I sent to you, without getEnhancedMicrophone on the base station: all fine, but it can't be used for the video chat. PCMU and SPEEX DO NOT create this problem, but unfortunately they can not be used for a recording in a swc archive - the playback of the recording is in fast forward. I do not know what to do anymore... I will send you a video of the behaviour; I added a visualisation of the mic activity to the recording. The microphone does not show any activity and the app crashes completely.

apofis1969 commented 2 years ago

Hi @ajwfrost, @marchbold, sorry that I did not upload the video until now, but I was trying to find the bug. I discovered this: with iOS 15 and the newer devices (iPhone 11 and up), Apple integrated Microphone Mode settings. If I use the Adobe AIR standard microphone configuration, these settings DO NOT appear on the device. If I use the Adobe AIR enhanced microphone settings, these settings APPEAR and the streaming is also better. I do not know if this problem also appears on the newer iPad generations, because on my iPad 12.9 2nd generation the Microphone Mode settings never appear. Maybe this information is important and explains the crash of the iPhone 11 with Nellymoser. Hopefully this helps.