shogo4405 / HaishinKit.swift

Camera and Microphone streaming library via RTMP and SRT for iOS, macOS, tvOS and visionOS.
https://docs.haishinkit.com/swift/latest
BSD 3-Clause "New" or "Revised" License

How to stream custom content instead of native Camera #173

Closed. omarojo closed this issue 7 years ago.

omarojo commented 7 years ago

Is there a way to stream custom video content instead of the native camera feed? For example, by providing a CVPixelBufferRef for each frame to a convenience method.

I'm retrieving each frame from GPUImage as a CVPixelBuffer, so I want to stream that.

I wonder how hard it would be to add this feature to your library, considering device orientation and rtmpStream.videoSettings.

Thanks for your support 👍

shogo4405 commented 7 years ago

Please try RTMPStream.appendSampleBuffer(_ sampleBuffer:CMSampleBuffer, withType: CMSampleBufferType, options:[NSObject: AnyObject]? = nil)

You can create a CMSampleBuffer from a CVPixelBufferRef. The framework needs a CVPixelBuffer plus presentationTimestamp:CMTime and duration:CMTime, and a CMSampleBuffer carries all of these. https://github.com/shogo4405/lf.swift/blob/master/Sources/Codec/AVCDecoder.swift#L107-L130
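
A rough sketch of wrapping a CVPixelBuffer in a CMSampleBuffer for this (untested; the presentationTime and duration values are placeholders you would derive from your own frame clock, e.g. GPUImage's frame time):

```swift
import CoreMedia
import CoreVideo

// Sketch: wrap a CVPixelBuffer in a CMSampleBuffer with explicit timing.
// presentationTime and duration are assumptions; compute them from your own clock.
func makeSampleBuffer(from pixelBuffer: CVPixelBuffer,
                      presentationTime: CMTime,
                      duration: CMTime) -> CMSampleBuffer? {
    // Describe the pixel buffer's format for the sample buffer.
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return nil }

    // Attach the timing information the encoder needs.
    var timing = CMSampleTimingInfo(duration: duration,
                                    presentationTimeStamp: presentationTime,
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                       imageBuffer: pixelBuffer,
                                       dataReady: true,
                                       makeDataReadyCallback: nil,
                                       refcon: nil,
                                       formatDescription: format,
                                       sampleTiming: &timing,
                                       sampleBufferOut: &sampleBuffer)
    return sampleBuffer
}
```

The result can then be passed to the method above, e.g. rtmpStream.appendSampleBuffer(sampleBuffer, withType: .video), assuming .video is the video case of CMSampleBufferType.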

Regards.

omarojo commented 7 years ago

Awesome man, thank you very much.

I managed to create a CMSampleBuffer and feed it to the RTMPStream object. 👍 I noticed, though, that the pushed image is rendered aspect-to-fill when I look at the stream output.

Is there a way to manage the frame/position/size of the pushed custom content in the output stream? Or maybe just change it to aspect-to-fit, so that if I'm pushing a square image the full image is visible, regardless of the videoSettings width and height.

shogo4405 commented 7 years ago

I think this property will help: https://github.com/shogo4405/lf.swift/commit/368a07368a3ccea2c0a41bfb740e19e8b484612d

I will release 0.5.9 this weekend. Then you can use it.

trantruonguet commented 3 years ago

@omarojo

I have a problem with a feature like yours: I want to stream a static image instead of live content from the camera, but I ran into a problem when creating the CMSampleBuffer for the appendSampleBuffer function. Can you give me some example code?
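
For reference, a minimal sketch of rendering a static CGImage into a CVPixelBuffer that could then be wrapped in a CMSampleBuffer as shown above (the 1280x720 size and BGRA format here are assumptions; match them to your stream's videoSettings):

```swift
import CoreGraphics
import CoreVideo

// Sketch: draw a static CGImage into a BGRA CVPixelBuffer.
// The 1280x720 size is illustrative; use the dimensions from your videoSettings.
func makePixelBuffer(from image: CGImage, width: Int = 1280, height: Int = 720) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var pixelBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, attrs as CFDictionary, &pixelBuffer)
    guard let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Draw the image into the pixel buffer's backing memory.
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                      | CGBitmapInfo.byteOrder32Little.rawValue) else {
        return nil
    }
    context.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}
```

For a static image, the same pixel buffer can be re-wrapped with an advancing presentation timestamp and appended on a timer to keep the stream fed.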