GetStream / stream-video-swift

SwiftUI Video SDK ➡️ Stream Video 📹
https://getstream.io/video/sdk/ios/

Help needed to integrate camera intrinsics with CMSampleBuffer in iOS using StreamVideo SDK #578

Open andreasteich opened 1 week ago

andreasteich commented 1 week ago

What are you trying to achieve?

I’m currently trying to integrate the following code to retrieve camera intrinsics from the CMSampleBuffer to compute the field of view (FOV):

import AVFoundation
import simd

// Opt in to camera-intrinsic-matrix delivery on the video connection.
if let captureConnection = videoDataOutput.connection(with: .video) {
    captureConnection.isEnabled = true
    if captureConnection.isCameraIntrinsicMatrixDeliverySupported {
        captureConnection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

// Returns the horizontal field of view in radians, or nil if no
// intrinsic matrix is attached to the sample buffer.
nonisolated func computeFOV(_ sampleBuffer: CMSampleBuffer) -> Double? {
    guard let camData = CMGetAttachment(
        sampleBuffer,
        key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
        attachmentModeOut: nil
    ) as? Data else { return nil }

    let intrinsics = camData.withUnsafeBytes { pointer in
        pointer.load(as: matrix_float3x3.self)
    }

    let fx = intrinsics[0][0]      // focal length in pixels
    let w = 2 * intrinsics[2][0]   // image width, from the principal point cx
    // Full horizontal FOV = 2 * atan(w / (2 * fx)).
    return Double(2 * atan2(w, 2 * fx))
}
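For context, in a plain AVFoundation pipeline the sample buffer arrives in the `AVCaptureVideoDataOutputSampleBufferDelegate` callback. The sketch below (the class name `FOVComputingDelegate` is hypothetical, and it calls the `computeFOV` function above) shows the spot I would like to find inside the SDK:

```swift
import AVFoundation

// Hypothetical delegate illustrating where the CMSampleBuffer arrives
// in a plain AVFoundation pipeline; computeFOV is defined above.
final class FOVComputingDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(
        _ output: AVCaptureOutput,
        didOutput sampleBuffer: CMSampleBuffer,
        from connection: AVCaptureConnection
    ) {
        // Each frame carries the intrinsic matrix as an attachment once
        // isCameraIntrinsicMatrixDeliveryEnabled is set on the connection.
        if let fovRadians = computeFOV(sampleBuffer) {
            let fovDegrees = fovRadians * 180 / .pi
            print("Horizontal FOV: \(fovDegrees)°")
        }
    }
}
```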

However, I’m not very familiar with WebRTC on iOS, and I’m wondering where I can find the usual `captureOutput` callback that receives the `CMSampleBuffer` in the `Sources/StreamVideo` package. I would appreciate any guidance on where to integrate this functionality into the existing codebase.

Thanks for your help!

Best regards

If possible, how can you achieve this currently?

It may be possible, but I haven't found a way yet.

What would be the better way?

I don't know right now.

andreasteich commented 1 week ago

Or let me ask differently: is it possible to capture video with AVFoundation myself and pass the frames manually to the video capturer?
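In the underlying WebRTC Objective-C API this pattern exists: frames captured externally can be pushed into an `RTCVideoSource` via its `RTCVideoCapturerDelegate` conformance. Whether StreamVideo exposes these objects publicly is an open question, so the sketch below is only an illustration of the raw WebRTC side, with all class names coming from the WebRTC framework rather than the SDK:

```swift
import AVFoundation
import WebRTC

// Hypothetical sketch, assuming direct access to the underlying WebRTC
// objects (StreamVideo wraps these; its public API may differ). Frames
// captured via AVFoundation are forwarded into an RTCVideoSource.
final class ManualFrameForwarder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let source: RTCVideoSource
    private let capturer: RTCVideoCapturer

    init(source: RTCVideoSource) {
        self.source = source
        // A bare RTCVideoCapturer is enough when frames are pushed manually.
        self.capturer = RTCVideoCapturer(delegate: source)
        super.init()
    }

    func captureOutput(
        _ output: AVCaptureOutput,
        didOutput sampleBuffer: CMSampleBuffer,
        from connection: AVCaptureConnection
    ) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // WebRTC expects the presentation timestamp in nanoseconds.
        let timeStampNs = Int64(
            CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * 1_000_000_000
        )
        let rtcBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
        let frame = RTCVideoFrame(buffer: rtcBuffer, rotation: ._0, timeStampNs: timeStampNs)

        // Hand the frame to WebRTC; the source fans it out to the encoder.
        source.capturer(capturer, didCapture: frame)
    }
}
```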