Closed dirk61 closed 1 year ago
Thanks!
I haven’t worked on this for years, but the reason for the different number of frames could be that this project doesn’t use output synchronization.
You may be able to enable synchronization with small effort.
Apple's documentation has an example of how to do synchronization (https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/streaming_depth_data_from_the_truedepth_camera): synchronize the normal RGB video data with the depth data output. The first output in the `dataOutputs` array is the master output.

```swift
outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [videoDataOutput, depthDataOutput])
outputSynchronizer!.setDelegate(self, queue: dataOutputQueue)
```
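To act on the synchronized frames, the delegate receives a collection containing one entry per output for the same capture time. Below is a minimal sketch of such a delegate, following the pattern from the Apple sample; the names `ViewController`, `videoDataOutput`, and `depthDataOutput` are assumed to be set up elsewhere during session configuration, as in the linked project.

```swift
import AVFoundation

// Sketch only: assumes videoDataOutput and depthDataOutput are properties
// configured on the capture session, as in the Apple TrueDepth sample.
extension ViewController: AVCaptureDataOutputSynchronizerDelegate {
    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                                didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
        // Look up the synchronized data for each output; bail out if either
        // frame was dropped, so only matched RGB/depth pairs are processed.
        guard let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthDataOutput)
                as? AVCaptureSynchronizedDepthData,
              let syncedVideo = synchronizedDataCollection.synchronizedData(for: videoDataOutput)
                as? AVCaptureSynchronizedSampleBufferData,
              !syncedDepth.depthDataWasDropped,
              !syncedVideo.sampleBufferWasDropped else {
            return
        }
        let depthData = syncedDepth.depthData
        let sampleBuffer = syncedVideo.sampleBuffer
        // Process the matched pair here, e.g. record depthData alongside
        // the corresponding video frame.
        _ = (depthData, sampleBuffer)
    }
}
```

With this in place, the depth stream and the video stream are delivered in lockstep, so the recorded depth frame count should track the video frame count instead of drifting.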
You can also find the relevant commented-out part here: https://github.com/mantoone/DepthCapture/blob/e8019137200c68c7d4b3d5422421a87eee2d95fc/DepthApp/DepthApp/ViewController.swift#L70
Really appreciate your great work! I noticed that when I ran the Jupyter notebook, the overall depth frame count did not match the video frame count (30 fps). What causes this difference?