gorastudio-git / SCNRecorder

The best way to record your AR experience!
MIT License

Capture Pixel Buffers #33

Closed maxxfrazer closed 3 years ago

maxxfrazer commented 3 years ago

Hi, I really like this package, but I'm trying to do something with it for which I can't see an easy route.

I want to catch every frame before it's added to the video, and send it elsewhere instead of saving it to a video file. Is this possible?

I tried using the capturePixelBuffers method, but the handler in there never gets called; I'm not sure if it's supposed to work that way, though.

Any help would be greatly appreciated 🙏
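For context, a minimal sketch of how the `capturePixelBuffers` callback is meant to be consumed, based on the API shown later in this thread. The `FrameTap` class name is hypothetical; the key point is that the returned `PixelBufferOutput` must be kept alive, since letting it deallocate silently stops the callbacks — a common reason the handler "doesn't get called":

```swift
import SceneKit
import SCNRecorder

final class FrameTap {

    // Keep a strong reference; if this token is released, the handler stops firing.
    var output: PixelBufferOutput?

    func start(on sceneView: SCNView) {
        output = sceneView.capturePixelBuffers { pixelBuffer, time in
            // Each rendered frame arrives here instead of being written to a video file.
            let width = CVPixelBufferGetWidth(pixelBuffer)
            let height = CVPixelBufferGetHeight(pixelBuffer)
            print("frame at \(time.seconds)s, \(width)x\(height)")
        }
    }

    func stop() {
        output = nil
    }
}
```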

v-grigoriev commented 3 years ago

Talking about WebRTC, I have the following class that does this, and it works. If this is still an issue, can you please provide an example of usage?

import Foundation
import WebRTC
import SceneKit
import SCNRecorder
import Accelerate

final class ARRTCVideoCapturer: RTCVideoCapturer {

    let sceneView: SCNView

    var output: PixelBufferOutput?

    static var format = vImage_CGImageFormat(
        bitsPerComponent: 8,
        bitsPerPixel: 32,
        colorSpace: nil,
        bitmapInfo: CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue),
        version: 0,
        decode: nil,
        renderingIntent: .defaultIntent
    )

    init(sceneView: SCNView, delegate: RTCVideoCapturerDelegate) {
        self.sceneView = sceneView
        super.init(delegate: delegate)
    }

    func startCapture() {
        // Retain the returned output token; releasing it stops the stream of pixel buffers.
        output = sceneView.capturePixelBuffers { [weak self] (pixelBuffer, time) in
            guard let this = self else { return }

            var pixelBuffer = pixelBuffer
            switch CVPixelBufferGetPixelFormatType(pixelBuffer) {
            // Wide-gamut frames are converted to 32-bit BGRA so WebRTC can consume them.
            case kCVPixelFormatType_30RGBLEPackedWideGamut:
                var sourceBuffer = vImage_Buffer()
                vImageBuffer_InitWithCVPixelBuffer(
                    &sourceBuffer,
                    &Self.format,
                    pixelBuffer,
                    nil,
                    nil,
                    vImage_Flags(kvImageNoFlags)
                )

                var buffer: CVPixelBuffer?
                CVPixelBufferCreate(
                    nil,
                    CVPixelBufferGetWidth(pixelBuffer),
                    CVPixelBufferGetHeight(pixelBuffer),
                    kCVPixelFormatType_32BGRA,
                    [kCVPixelBufferBytesPerRowAlignmentKey as String: 4] as CFDictionary,
                    &buffer
                )

                guard let destinationBuffer = buffer else { break }

                vImageBuffer_CopyToCVPixelBuffer(
                    &sourceBuffer,
                    &Self.format,
                    destinationBuffer,
                    nil,
                    nil,
                    vImage_Flags(kvImageNoFlags)
                )

                sourceBuffer.free()
                pixelBuffer = destinationBuffer
            default: break
            }

            // Convert the presentation time from seconds to nanoseconds for WebRTC.
            let timeStamp = time.seconds * 1_000_000_000
            let buffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
            let frame = RTCVideoFrame(buffer: buffer, rotation: ._0, timeStampNs: Int64(timeStamp))
            this.delegate?.capturer(this, didCapture: frame)
        }
    }

    func stopCapture() {
        // Dropping the output token cancels the pixel buffer subscription.
        output = nil
    }
}
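
For completeness, a hedged sketch of wiring this capturer into a local WebRTC video track. It assumes the standard WebRTC iOS API, where RTCVideoSource conforms to RTCVideoCapturerDelegate; the function and track id are illustrative, not part of SCNRecorder:

```swift
import WebRTC
import SceneKit

// Assumes `factory` is the app's RTCPeerConnectionFactory and `sceneView` the ARSCNView/SCNView.
func makeLocalVideoTrack(factory: RTCPeerConnectionFactory,
                         sceneView: SCNView) -> (RTCVideoTrack, ARRTCVideoCapturer) {
    // RTCVideoSource conforms to RTCVideoCapturerDelegate, so it can receive frames directly.
    let source = factory.videoSource()
    let capturer = ARRTCVideoCapturer(sceneView: sceneView, delegate: source)
    capturer.startCapture() // frames now flow: SCNRecorder -> capturer -> source
    let track = factory.videoTrack(with: source, trackId: "arVideo0")
    // Retain the capturer for the lifetime of the call, or capture stops.
    return (track, capturer)
}
```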
v-grigoriev commented 3 years ago

@maxxfrazer Has the issue been resolved?

v-grigoriev commented 3 years ago

Closing since there is no activity.

maxxfrazer commented 3 years ago

oh wow that was closed quickly 😅 the code looks like it does what I wanted; I'll check it out tomorrow (Monday). I want it for RTC purposes, so this looks to be the right result

fukemy commented 1 year ago

@v-grigoriev could you please provide the full code? Thanks so much, I want to integrate AR with WebRTC