shu223 / iOS-Depth-Sampler

Code examples for Depth APIs in iOS
MIT License

Recording Depth #6

Open · pablovicentem opened this issue 5 years ago

pablovicentem commented 5 years ago

Hello,

I want to know if it is possible to record videos or sequences and save the RGB + depth info to a file.

mantoone commented 5 years ago

Did you find a solution?

pablovicentem commented 5 years ago

No, do you know one?

mantoone commented 5 years ago

Unfortunately not. I started implementing one, but I am stuck on saving the depth information. I saw some potential solutions here: https://stackoverflow.com/questions/47664306/save-depth-images-from-truedepth-camera but I haven't had time to try them yet.

pablovicentem commented 5 years ago

I am in the same situation as you: I can preview the depth but not save it, and I am also trying to implement the solutions from that post.

mantoone commented 5 years ago

I was able to write the depth data to a file using Eyal Fink's solution (in the Stack Overflow thread). However, now I need to figure out how to read that file back to verify that the data is correct.
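
For the record, the gist of that approach is converting the AVDepthData to 32-bit float and dumping the pixel buffer's raw bytes to disk. Something roughly like this should work (a minimal sketch; writeDepthData is just my own helper name, not from the linked answer):

import AVFoundation

// Sketch: dump the raw Float32 depth map of an AVDepthData to a file.
func writeDepthData(_ depthData: AVDepthData, to url: URL) throws {
    // Convert to 32-bit float so the memory layout is predictable.
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let pixelBuffer = converted.depthDataMap

    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    // Raw dump: height * bytesPerRow bytes of Float32 values.
    // You'll also want to record width/height/bytesPerRow somewhere to read it back.
    let data = Data(bytes: base, count: height * bytesPerRow)
    try data.write(to: url)
}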

mantoone commented 5 years ago

Ok, now I have a working solution that lets you capture depth data and RGB video simultaneously and save both to files. I also created a Jupyter notebook for reading and visualizing the depth data from the saved file. You can find my code here: https://github.com/mantoone/DepthCapture. Proper instructions are yet to be written, but if you have any questions, feel free to ask.
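
For anyone who wants the rough shape of the capture side without digging through the repo: the general pattern is a video data output plus a depth data output kept in sync with an AVCaptureDataOutputSynchronizer. A minimal sketch of that pattern (class name and details are my own illustration, not necessarily what DepthCapture does):

import AVFoundation

// Sketch: capture synchronized RGB frames and depth maps.
class DepthVideoCapture: NSObject, AVCaptureDataOutputSynchronizerDelegate {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    let depthOutput = AVCaptureDepthDataOutput()
    var synchronizer: AVCaptureDataOutputSynchronizer?

    func configure(device: AVCaptureDevice) throws {
        // device should be a depth-capable camera, e.g. the TrueDepth camera
        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(videoOutput)
        session.addOutput(depthOutput)
        depthOutput.isFilteringEnabled = true // temporally smoothed depth
        session.commitConfiguration()

        // Deliver matching RGB + depth pairs to one callback.
        synchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [videoOutput, depthOutput])
        synchronizer?.setDelegate(self, queue: DispatchQueue(label: "capture"))
    }

    func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                                didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
        if let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthOutput)
            as? AVCaptureSynchronizedDepthData, !syncedDepth.depthDataWasDropped {
            let depthData = syncedDepth.depthData
            _ = depthData // write out, e.g. with the raw-dump helper above
        }
        if let syncedVideo = synchronizedDataCollection.synchronizedData(for: videoOutput)
            as? AVCaptureSynchronizedSampleBufferData, !syncedVideo.sampleBufferWasDropped {
            let sampleBuffer = syncedVideo.sampleBuffer
            _ = sampleBuffer // append to an AVAssetWriterInput for the RGB video
        }
    }
}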

pablovicentem commented 5 years ago

Thank you very much, it is a useful solution.

skypanther commented 5 years ago

I can't find the reference at the moment, but you can embed depth data into JPG and other standard file formats. It's not covered on https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/capturing_photos_with_depth, but that page was my jumping-off point when I did find that info.

Assuming your ViewController is an AVCapturePhotoCaptureDelegate, you'd do something like this

// photoOutput should be a stored property on your view controller,
// e.g. let photoOutput = AVCapturePhotoOutput()
// when configuring your AVCaptureSession, be sure to
// specify that the output should get depth data
self.session.beginConfiguration()
self.session.addOutput(self.photoOutput)
// depth delivery is only supported on depth-capable cameras
// (dual or TrueDepth), so check before enabling it
if self.photoOutput.isDepthDataDeliverySupported {
    self.photoOutput.isDepthDataDeliveryEnabled = true
}
self.session.commitConfiguration()

@IBAction func takePhoto(_ sender: UIButton) {
        // request JPEG output
        let photoSettings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        // enable & embed the (smoothed) depth data
        photoSettings.isDepthDataDeliveryEnabled = true
        photoSettings.embedsDepthDataInPhoto = true
        photoSettings.isDepthDataFiltered = true
        self.photoOutput.capturePhoto(with: photoSettings, delegate: self)
}

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        print(photo.depthData) // to get the data
        // if you save here, say to the camera roll, the resulting
        // file will have the depth data embedded
}
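
For completeness, one way to actually write it to the camera roll so the depth stays embedded, i.e. instead of just printing photo.depthData (my own addition, assuming photo library authorization has already been granted):

import Photos

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    guard error == nil, let data = photo.fileDataRepresentation() else { return }
    // The file data already contains the embedded depth map.
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, data: data, options: nil)
    }, completionHandler: nil)
}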