twilio / video-quickstart-ios

Twilio Video Quickstart for iOS
https://www.twilio.com/docs/api/video
MIT License

iOS app crashes often by raising this error "EXC_BAD_ACCESS KERN_INVALID_ADDRESS" #486

Closed neuralxmasaki closed 4 years ago

neuralxmasaki commented 4 years ago

We developed a two-way video communication platform with Twilio and are testing it via TestFlight on iOS.

Here is the line where the app crashes: `twilio::media::TVIVideoRendererNativeAdapter::OnFrame(webrtc::VideoFrame const&)`

We are running the app through TestFlight with Google Firebase crash reporting alongside it. Here is the crash report from Firebase. Could you give me any advice on how to fix it?

EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000000

```
Crashed: com.apple.main-thread
0   CoreVideo        0x1bf1223fc CVPixelBufferGetWidth + 24
1   TwilioVideo      0x105c30e30 twilio::media::TVIVideoRendererNativeAdapter::OnFrame(webrtc::VideoFrame const&) + 116 (hidden#1507_:116)
2   TwilioVideo      0x10623383c hidden#38020 + 923016
3   TwilioVideo      0x10622ba78 hidden#46 + 890820
4   TwilioVideo      0x105be7940 twilio::media::CoreVideoSource::OnCapturedFrame(TVIVideoFrame) + 928
5   TwilioVideo      0x105be755c twilio::media::CoreVideoSource::CopyCapturedFrame(TVIVideoFrame) + 368
6   TwilioVideo      0x105be7d74 -[TVICoreVideoSource onVideoFrame:] + 2004
7   Presence         0x104f5ca94 ConferenceManager.displayLinkDidFire(timer:) + 125 (ConferenceManager+Twilio.swift:125)
8   Presence         0x104f5cb08 @objc ConferenceManager.displayLinkDidFire(timer:) + 4311468808 (:4311468808)
9   QuartzCore       0x1b8919f24 CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned long long) + 628
10  QuartzCore       0x1b89e8608 display_timer_callback(__CFMachPort*, void*, long, void*) + 268
11  CoreFoundation   0x1b1de5dac __CFMachPortPerform + 176
12  CoreFoundation   0x1b1e107c4 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE1_PERFORM_FUNCTION__ + 60
13  CoreFoundation   0x1b1e0fe90 __CFRunLoopDoSource1 + 448
14  CoreFoundation   0x1b1e0aac8 __CFRunLoopRun + 2144
15  CoreFoundation   0x1b1e09f40 CFRunLoopRunSpecific + 480
16  GraphicsServices 0x1bc09a534 GSEventRunModal + 108
17  UIKitCore        0x1b5f95580 UIApplicationMain + 1940
18  Presence         0x104db9b68 main + 14 (BrowseOnDemandView.swift:14)
19  libdyld.dylib    0x1b1c88e18 start + 4
```

neuralxmasaki commented 4 years ago

Just to clarify, I am using the latest SDK (3.2.5).

ceaglest commented 4 years ago

Hi @neuralxmasaki,

```
7  Presence    0x104f5ca94 ConferenceManager.displayLinkDidFire(timer:) + 125 (ConferenceManager+Twilio.swift:125)
8  Presence    0x104f5cb08 @objc ConferenceManager.displayLinkDidFire(timer:) + 4311468808 (:4311468808)
9  QuartzCore  0x1b8919f24 CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned ...
```

It looks like ConferenceManager is a custom TVIVideoSource that runs a DisplayLink timer to generate frames. Is there any chance that you could share the code that is delivering the TVIVideoFrames? My guess is that the frames being delivered don't have a valid CVPixelBuffer which is then causing a crash when the frames get rendered locally.
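If that's the case, a defensive guard before delivering each frame would surface the problem. The following is only a sketch under the assumption that your source has properties like `videoPixelBuffer` and `sink` (those names are guesses on my part); adjust it to your code:

```swift
// Sketch: only hand frames to the sink when they are backed by a valid,
// non-empty CVPixelBuffer. Property names here are assumptions.
@objc func displayLinkDidFire(timer: CADisplayLink) {
    guard let pixelBuffer = self.videoPixelBuffer,
          CVPixelBufferGetWidth(pixelBuffer) > 0,
          CVPixelBufferGetHeight(pixelBuffer) > 0 else {
        // Dropping the frame is safer than passing a dead buffer to the SDK.
        return
    }
    if let frame = VideoFrame(timeInterval: timer.timestamp,
                              buffer: pixelBuffer,
                              orientation: .left) {
        self.sink?.onVideoFrame(frame)
    }
}
```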

Best, Chris

neuralxmasaki commented 4 years ago

Hi Chris,

Thank you very much for your kind support on this.

Here is the code.

```swift
extension ConferenceManager: TwilioVideoDelegate {
    func connectToRoom(isLocalAudio: Bool) {
        guard let accessToken = accessToken else { return }
        // Preparing the connect options with the access token that we fetched.
        let connectOptions = ConnectOptions(token: accessToken) { (builder) in
            // Use the local media that we prepared.
            if (isLocalAudio) {
                builder.audioTracks = self.localAudioTrack != nil ? [self.localAudioTrack!] : [LocalAudioTrack]()
            }
            builder.videoTracks = self.localVideoTrack != nil ? [self.localVideoTrack!] : [LocalVideoTrack]()

            // Use the preferred audio codec
            if let preferredAudioCodec = Settings.shared.audioCodec {
                builder.preferredAudioCodecs = [preferredAudioCodec]
            }

            // Use the preferred video codec
            if let preferredVideoCodec = Settings.shared.videoCodec {
                builder.preferredVideoCodecs = [preferredVideoCodec]
            }

            // Use the preferred encoding parameters
            if let encodingParameters = Settings.shared.getEncodingParameters() {
                builder.encodingParameters = encodingParameters
            }

            // Use the preferred signaling region
            if let signalingRegion = Settings.shared.signalingRegion {
                builder.region = signalingRegion
            }
        }
        // This is the safeguard to deal with this problem:
        // "Unpublishing and republishing a media track might not be seen by Participants"
        // https://github.com/twilio/twilio-video-ios/issues/34
        sleep(3)

        // Connect to the Room using the options we provided.
        room = TwilioVideoSDK.connect(options: connectOptions, delegate: self)
    }

    func disconnectFromRoom() {
        updateRoomState(to: .online, completion: { success in
            self.room?.disconnect()
            //self.cleanUpAudioSetting()
            //self.cleanupRoom()
        })
    }

    func prepareLocalMedia(isLocalAudio: Bool) {
        configureAudioSession()

        // We will share local audio and video when we connect to the Room.
        // Create an audio track.
        if (isLocalAudio) {
            if (localAudioTrack == nil) {
                localAudioTrack = LocalAudioTrack(options: nil, enabled: true, name: "Microphone")
                if (localAudioTrack == nil) {
                    print("Failed to create audio track")
                }
            }
        }
        // Create a video track which captures from the camera.
        if (localVideoTrack == nil) {
            self.startPreview()
        }
    }

    func cleanUpAudioSetting() {
        configureAudioSession()
        localAudioTrack = LocalAudioTrack(options: nil, enabled: true, name: "Microphone")
        if (localAudioTrack == nil) {
            print("Failed to create audio track")
        }
    }

    func startPreview() {
        self.localVideoTrack = LocalVideoTrack(source: self)

        let format = VideoFormat()
        format.frameRate = 24
        format.pixelFormat = PixelFormat.format32BGRA
        format.dimensions = CMVideoDimensions(width: Int32(UIScreen.main.bounds.size.width),
                                              height: Int32(UIScreen.main.bounds.size.height))
        requestOutputFormat(format)
        startDisplayLink()
    }

    func startDisplayLink() {
        // Make sure to invalidate the link if anything is already running.
        self.displayLink?.invalidate()

        // Depth sensing stuff
        self.attemptAVCaptureConfiguration()
        self.startCaptureSessionIfAble()

        let link = CADisplayLink(target: self, selector: #selector(self.displayLinkDidFire))
        link.preferredFramesPerSecond = 24

        link.add(to: RunLoop.main, forMode: RunLoop.Mode.common)
        self.displayLink = link
    }

    func stopDisplayLink() {
        self.displayLink?.invalidate()
        self.displayLink = nil
        self.sink = nil
    }

    @objc func displayLinkDidFire(timer: CADisplayLink) {
        guard let pixelBuffer: CVPixelBuffer = self.videoPixelBuffer else { return }
        self.videoFrame = VideoFrame(timeInterval: timer.timestamp, buffer: pixelBuffer, orientation: VideoOrientation.left)
        if let frame = self.videoFrame { self.sink?.onVideoFrame(frame) }
    }

    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
        // masaki: temp solution
        guard renderingEnabled, let syncedVideoData: AVCaptureSynchronizedSampleBufferData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
            return
        }
        if syncedVideoData.sampleBufferWasDropped {
            return
        }
        guard let videoPixelBuffer = CMSampleBufferGetImageBuffer(syncedVideoData.sampleBuffer) else {
            return
        }
        self.videoPixelBuffer = videoPixelBuffer
    }
}

// MARK: - VideoSource
extension ConferenceManager: VideoSource {
    var isScreencast: Bool {
        // We want fluid AR content, maintaining the original frame rate.
        return false
    }

    func requestOutputFormat(_ outputFormat: VideoFormat) {
        if let sink = sink {
            sink.onVideoFormatRequest(outputFormat)
        }
    }
}
```

ceaglest commented 4 years ago

Hi @neuralxmasaki,

Thanks for sharing the code. From what I can tell there isn't a strong guarantee of thread safety for self.videoPixelBuffer, so you might end up delivering an already released CVPixelBuffer when the display link timer fires on the main thread.

```swift
// Running on a background queue / worker thread?
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    // ...
    self.videoPixelBuffer = videoPixelBuffer
}

// Running on the main queue/thread
@objc func displayLinkDidFire(timer: CADisplayLink) {
    // ...
    if let frame = self.videoFrame { self.sink?.onVideoFrame(frame) }
}
```

One thought is, could you just deliver the frames from inside dataOutputSynchronizer instead of by polling for them on the main thread? It's up to you, but ensuring that your access to the pixel buffers is thread safe is pretty important to prevent crashes.
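If you do want to keep polling from the display link, one option is to guard the shared buffer with a lock. This is only a sketch; `bufferLock` and the `_videoPixelBuffer` backing property are assumed additions (stored properties would need to live on the class itself, not in an extension):

```swift
import CoreVideo
import Foundation

// Sketch: serialize access to the pixel buffer that is shared between the
// capture callback (background queue) and the display link (main thread).
// `bufferLock` and `_videoPixelBuffer` are assumptions, not from your code.
private let bufferLock = NSLock()
private var _videoPixelBuffer: CVPixelBuffer?

var videoPixelBuffer: CVPixelBuffer? {
    get {
        bufferLock.lock()
        defer { bufferLock.unlock() }
        return _videoPixelBuffer
    }
    set {
        bufferLock.lock()
        defer { bufferLock.unlock() }
        _videoPixelBuffer = newValue
    }
}
```

A computed property like this keeps both the writer in `dataOutputSynchronizer` and the reader in `displayLinkDidFire` from racing on the same reference.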

Best, Chris

neuralxmasaki commented 4 years ago

Hi Chris,

Thank you very much for your kind advice. Are you suggesting that I do something like the following?

```swift
@objc func displayLinkDidFire(timer: CADisplayLink) {
    // removed code from here.
    self.videoFrame = VideoFrame(timeInterval: timer.timestamp, buffer: pixelBuffer, orientation: VideoOrientation.left)
    if let frame = self.videoFrame { self.sink?.onVideoFrame(frame) }
}

func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    //...
    self.videoPixelBuffer = videoPixelBuffer
    guard let self.pixelBuffer = self.videoPixelBuffer else { return }
    return
}
```

ceaglest commented 4 years ago

Remove all of the code from `func displayLinkDidFire`. I think you edited your last message, but it was closer the first time:

```swift
@objc func displayLinkDidFire(timer: CADisplayLink) {
    // removed code from here.
}

func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    //...
    self.videoPixelBuffer = videoPixelBuffer
    guard let pixelBuffer: CVPixelBuffer = self.videoPixelBuffer else { return }
    self.videoFrame = VideoFrame(timeInterval: timer.timestamp, buffer: pixelBuffer, orientation: VideoOrientation.left)
    if let frame = self.videoFrame { self.sink?.onVideoFrame(frame) }
    return
}
```

neuralxmasaki commented 4 years ago

Hi Chris,

Thank you very much again. The reason I edited was that I could not find the `timer` value for the following line:

```swift
self.videoFrame = VideoFrame(timeInterval: timer.timestamp, buffer: pixelBuffer, orientation: VideoOrientation.left)
```

Maybe I should do something like the following?

```swift
@objc func displayLinkDidFire(timer: CADisplayLink) {
    // removed code from here.
    self.displayTimer = timer
}
```

neuralxmasaki commented 4 years ago

Sorry for my naive questions.

I think I could make it work with:

```swift
let interval = (self.displayLink?.timestamp ?? 0.03)
self.videoFrame = VideoFrame(timeInterval: interval, buffer: pixelBuffer, orientation: VideoOrientation.left)
```

Thank you so much for your kind support on this, Chris!

ceaglest commented 4 years ago

Hi @neuralxmasaki,

I haven't tried AVCaptureDataOutputSynchronizerDelegate before, but I think you should use AVCaptureSynchronizedDataCollection.timestamp for your TVIVideoFrame. Without being able to run your code this is the closest I can get:

```swift
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer, didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    guard renderingEnabled, let syncedVideoData: AVCaptureSynchronizedSampleBufferData = synchronizedDataCollection.synchronizedData(for: videoDataOutput) as? AVCaptureSynchronizedSampleBufferData else {
        return
    }
    if syncedVideoData.sampleBufferWasDropped {
        return
    }
    guard let videoPixelBuffer = CMSampleBufferGetImageBuffer(syncedVideoData.sampleBuffer) else {
        return
    }
    if let videoFrame = VideoFrame(timestamp: syncedVideoData.timestamp,
                                   buffer: videoPixelBuffer,
                                   orientation: .left),
       let sink = self.sink {
        sink.onVideoFrame(videoFrame)
    }
}
```

I hope that helps, Chris

neuralxmasaki commented 4 years ago

Thank you very much, Chris! I tested it and it worked very well!

ceaglest commented 4 years ago

🎉 Awesome!

If you're no longer experiencing crashes let me know if the issue can be closed.

neuralxmasaki commented 4 years ago

Yes, please close it, thanks again Chris!