Closed: ksuhr1 closed this issue 4 years ago
I personally haven't been able to get this info from the Twilio SDK directly. Instead, if you create your own "VideoSource", you can own the AVCaptureSession yourself: make your own class the capture session's sample buffer delegate and forward each buffer to both consumers, Twilio and your own Vision / OpenCV code.
The Twilio capturer is straightforward to set up:
final class CustomFrameCapturer: NSObject, VideoSource {
    public var isScreencast: Bool
    public weak var sink: VideoSink?

    init(isScreencast: Bool) {
        self.isScreencast = isScreencast
        super.init()
    }

    func requestOutputFormat(_ outputFormat: VideoFormat) {
        sink?.onVideoFormatRequest(outputFormat)
    }

    func deliverCapturedBuffer(
        buffer: CVPixelBuffer,
        orientation: VideoOrientation,
        timestamp: Date = Date()
    ) {
        guard
            let frame = VideoFrame(
                timeInterval: timestamp.timeIntervalSinceReferenceDate,
                buffer: buffer,
                orientation: orientation
            )
        else { return }
        // The consumer retains the CVPixelBuffer and will own it as the buffer flows through the video pipeline.
        sink?.onVideoFrame(frame)
    }
}
private lazy var cameraVideoSource = CustomFrameCapturer(isScreencast: false)
private lazy var cameraTrack = LocalVideoTrack(
    source: cameraVideoSource,
    enabled: true,
    name: "Camera"
)
Then just call:
cameraVideoSource.deliverCapturedBuffer(buffer: buffer, orientation: .up, timestamp: exposureTime)
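To make the "own the AVCaptureSession yourself" part concrete, here is a minimal sketch of a sample buffer delegate that forwards each frame to both consumers. It assumes the CustomFrameCapturer above; the `CameraController` name, the queue label, and the omitted input setup are illustrative, not part of the Twilio API.

```swift
import AVFoundation
import Vision

final class CameraController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let capturer: CustomFrameCapturer

    init(capturer: CustomFrameCapturer) {
        self.capturer = capturer
        super.init()
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        // Input setup (AVCaptureDeviceInput for the front camera, session.startRunning(), etc.) omitted.
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // 1. Forward to Twilio via the custom VideoSource.
        capturer.deliverCapturedBuffer(buffer: pixelBuffer, orientation: .up)

        // 2. Forward the same buffer to Vision for face detection.
        let request = VNDetectFaceRectanglesRequest { request, _ in
            let faces = request.results as? [VNFaceObservation] ?? []
            print("Detected \(faces.count) face(s)")
        }
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
    }
}
```

Because both consumers receive the same CVPixelBuffer, there is no extra copy; Vision reads it while Twilio's pipeline retains it.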
@ksuhr1 Writing a custom VideoSource is one way; however, an easier way is to implement a VideoRenderer on a track, which doesn't require you to write a custom VideoSource.
We have a work-in-progress PR https://github.com/twilio/video-quickstart-ios/pull/286 which demonstrates a custom VideoRenderer implementation. See the renderFrame implementation example in the PR.
Let me know if you have any questions.
@AdiAyyakad Thank you so much for your input! I will try this out
@piyushtank Thank you for your response! I am trying to figure out how to incorporate the renderFrame function into the original code of this video-quickstart project. I have added the ExampleSampleBufferView.swift file to my project. In my ViewController, I have a setupLocalVideoView function that I call from startPreview(). Would this be the proper way to get the frames?
This code removes the ability to see yourself in the corner preview. Do you know of any possible reason why it wouldn't show the rendered frames? It prints "AVSampleBufferDisplayLayer is not ready for more frames." from the enqueueFrame function in this code https://github.com/twilio/video-quickstart-ios/blob/0a5f5bdaa043a77f1ea6491ef6b09ccf9c4e8f10/VideoRendererExample/VideoRenderers/ExampleSampleBufferView.swift#L130.
Thank you so much.
func setupLocalVideoView(track: LocalVideoTrack) {
    // Create `ExampleSampleBufferRenderer` (or a stock `VideoView`).
    let previewView = ViewController.kUseExampleSampleBufferView ?
        ExampleSampleBufferView(frame: CGRect.zero) : VideoView(frame: CGRect.zero)
    // Use the passed-in track rather than force-unwrapping the property.
    track.addRenderer(previewView as! VideoRenderer)
    // Note: the view must also be added to the view hierarchy (e.g. the `UIStackView`) to be visible.
}
func startPreview() {
    camera!.startCapture(device: frontCamera ?? backCamera!) { (captureDevice, videoFormat, error) in
        if let error = error {
            self.logMessage(messageText: "Capture failed with error.\ncode = \((error as NSError).code) error = \(error.localizedDescription)")
        } else {
            self.previewView.shouldMirror = (captureDevice.position == .front)
            if ViewController.kUseExampleSampleBufferView {
                // Call to custom renderer
                self.setupLocalVideoView(track: self.localVideoTrack!)
            }
        }
    }
}
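On the "AVSampleBufferDisplayLayer is not ready for more frames." log: that message typically appears when frames are enqueued faster than the layer can consume them, or when the layer has entered a failed state and silently drops everything. A minimal guard, assuming you have access to the `AVSampleBufferDisplayLayer` and the `CMSampleBuffer` being enqueued (the function name here is illustrative):

```swift
import AVFoundation

func enqueueIfReady(_ sampleBuffer: CMSampleBuffer, on displayLayer: AVSampleBufferDisplayLayer) {
    if displayLayer.status == .failed {
        // A failed layer drops every frame until it is flushed.
        displayLayer.flush()
    }
    guard displayLayer.isReadyForMoreMediaData else {
        // Dropping the frame is usually acceptable for a live preview.
        return
    }
    displayLayer.enqueue(sampleBuffer)
}
```

If the log appears on every frame, it is worth checking that the layer has a valid size and has actually been added to the view hierarchy before frames start arriving.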
@ksuhr1 did you end up finding a good solution to do this with @piyushtank's approach?
Currently opting to go for @AdiAyyakad's suggestion, but I'm a bit confused about where the buffer variable comes from.
Update: this solution worked for me: twilio/video-quickstart-ios#500 (comment)
Description
Not an issue, a question.
I want to get the frames from the local video preview view so I can apply face detection using either Apple Vision or OpenCV. I understand that for Apple Vision, the request handlers receive a CMSampleBuffer that contains a single CVImageBuffer. I was trying to create a custom renderer but don't know what to pass for the VideoFrame. Does the localVideoTrack contain the most recent rendered frame? How do we get the VideoFrame from the localVideoTrack or previewView so I can pass it to the iOS Vision framework during a video chat? When I try to pass self.previewView.frame to the renderFrame function, I get the error "Cannot convert value of type 'CGRect' to expected argument type 'VideoFrame'". I've noticed some other comments about this, but they weren't very clear to me, or the suggestions led to deprecated code. Any steps or advice would be greatly appreciated!
let imageBuff = renderFrame(self.previewView.frame)
Thank you.
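For readers hitting the same CGRect/VideoFrame confusion: you never construct or pass a VideoFrame yourself. You implement the VideoRenderer protocol, add the renderer to the track, and the SDK calls renderFrame with each frame. A hedged sketch, assuming Twilio's VideoFrame exposes its pixel data as `imageBuffer` (the `FaceDetectionRenderer` name is illustrative):

```swift
import TwilioVideo
import Vision

final class FaceDetectionRenderer: NSObject, VideoRenderer {
    // Invoked by the SDK for every frame once the renderer is added to a track.
    func renderFrame(_ frame: VideoFrame) {
        let pixelBuffer = frame.imageBuffer
        let request = VNDetectFaceRectanglesRequest { request, _ in
            let faces = request.results as? [VNFaceObservation] ?? []
            print("Faces: \(faces.count)")
        }
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
    }

    func updateVideoSize(_ videoSize: CMVideoDimensions, orientation: VideoOrientation) {
        // No-op: this renderer only analyzes frames, it doesn't draw them.
    }
}

// Usage: localVideoTrack?.addRenderer(FaceDetectionRenderer())
```

Because this renderer does not display anything, it can be added alongside the normal preview VideoView; a track supports multiple renderers.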
Xcode: 11.4.1
iOS Version: 13.4.1
iOS Device: iPhone X