Please try RTMPStream.appendSampleBuffer(_ sampleBuffer: CMSampleBuffer, withType: CMSampleBufferType, options: [NSObject: AnyObject]? = nil).
You can create a CMSampleBuffer from a CVPixelBufferRef. The framework needs a CVPixelBuffer plus presentationTimestamp:CMTime and duration:CMTime, and a CMSampleBuffer carries all of these. https://github.com/shogo4405/lf.swift/blob/master/Sources/Codec/AVCDecoder.swift#L107-L130
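A minimal sketch of wrapping a CVPixelBuffer in a CMSampleBuffer before handing it to appendSampleBuffer (the timestamp handling and names here are illustrative, assuming you keep your own clock):

```swift
import CoreMedia
import CoreVideo

// Wrap a CVPixelBuffer in a CMSampleBuffer with explicit timing.
// presentationTimeStamp and duration are supplied by the caller.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      presentationTimeStamp: CMTime,
                      duration: CMTime) -> CMSampleBuffer? {
    // Describe the pixel buffer so CoreMedia knows its format.
    var formatDescription: CMVideoFormatDescription?
    guard CMVideoFormatDescriptionCreateForImageBuffer(
            kCFAllocatorDefault, pixelBuffer, &formatDescription) == noErr,
          let format = formatDescription else {
        return nil
    }

    // Timing information the encoder reads back from the sample buffer.
    var timing = CMSampleTimingInfo(duration: duration,
                                    presentationTimeStamp: presentationTimeStamp,
                                    decodeTimeStamp: kCMTimeInvalid)

    var sampleBuffer: CMSampleBuffer?
    let status = CMSampleBufferCreateForImageBuffer(
        kCFAllocatorDefault, pixelBuffer, true, nil, nil, format, &timing, &sampleBuffer)
    return status == noErr ? sampleBuffer : nil
}

// Usage (illustrative): push each frame you already have as a CVPixelBuffer.
// if let buffer = makeSampleBuffer(from: pixelBuffer,
//                                  presentationTimeStamp: timestamp,
//                                  duration: CMTimeMake(1, 30)) {
//     rtmpStream.appendSampleBuffer(buffer, withType: .video)
// }
```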
Regards.
Awesome man, thank you very much.
I managed to create a CMSampleBuffer and feed it to the RTMPStream object. 👍 I noticed, though, that the pushed image is rendered aspectToFill when I look at the stream output.
Is there a way to manage the frame/position/size of the pushed custom content in the output stream? Or maybe just change it to aspectToFit, so that if I'm pushing a square image the full image is visible, regardless of the videoSettings width and height.
I think this property will help: https://github.com/shogo4405/lf.swift/commit/368a07368a3ccea2c0a41bfb740e19e8b484612d
I will release it this weekend (0.5.9). Then you can use this one.
@omarojo
I have a problem with a feature like yours: I want to stream a static image instead of live content from the camera, but I'm having trouble creating a CMSampleBuffer for the appendSampleBuffer function. Can you give me some example code?
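A minimal sketch of drawing a static UIImage into a CVPixelBuffer (plain UIKit/CoreVideo calls, assuming a 32ARGB buffer; names are illustrative). The resulting buffer can then be wrapped in a CMSampleBuffer as in the snippet above and appended at your desired frame rate, e.g. from a Timer or CADisplayLink:

```swift
import UIKit
import CoreVideo

// Render a UIImage into a newly allocated 32ARGB CVPixelBuffer.
func makePixelBuffer(from image: UIImage) -> CVPixelBuffer? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height

    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary

    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32ARGB, attrs,
                              &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else {
        return nil
    }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Draw the image into the buffer's memory with Core Graphics.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) else {
        return nil
    }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}
```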
Is there a way to stream custom video content instead of the native camera feed? For example, by providing a CVPixelBufferRef on each frame to a convenient method. I'm retrieving each frame from GPUImage as a CVPixelBuffer, so I want to stream that.
I wonder how hard it would be to add this feature to your library, considering device orientations and rtmpStream.videoSettings.
Thanks for your support 👍