shogo4405 / HaishinKit.swift

Camera and Microphone streaming library via RTMP and SRT for iOS, macOS, tvOS and visionOS.
https://docs.haishinkit.com/swift/latest
BSD 3-Clause "New" or "Revised" License

How to use ScreenCaptureSession to live broadcast screen capture? #28

Closed pzs7602 closed 8 years ago

pzs7602 commented 8 years ago

Please give me a code snippet, thanks!

shogo4405 commented 8 years ago

https://github.com/shogo4405/lf.swift/blob/master/Application/Application/LiveViewController.swift#L99

// 1st step: comment out the camera attachment
// rtmpStream.attachCamera(AVMixer.deviceWithPosition(.Back))
// 2nd step: attach the screen capture session instead
rtmpStream.attachScreen(ScreenCaptureSession())
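For context, a fuller sketch of how the two lines above fit into a broadcast setup (assumptions: lf.swift-era Swift 2 API with `RTMPConnection`/`RTMPStream`; the RTMP URL and stream name are placeholders, not real endpoints):

```swift
// Minimal screen-broadcast sketch, assuming the lf.swift (HaishinKit) API of this era.
let rtmpConnection:RTMPConnection = RTMPConnection()
let rtmpStream:RTMPStream = RTMPStream(rtmpConnection: rtmpConnection)

// Capture the screen instead of the camera, plus microphone audio.
rtmpStream.attachScreen(ScreenCaptureSession())
rtmpStream.attachAudio(AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio))

// Placeholder endpoint and stream key.
rtmpConnection.connect("rtmp://example.com/live")
rtmpStream.publish("streamName")
```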
pzs7602 commented 8 years ago

Thanks! It works. Can rtmpStream attach both the camera and screen capture?

shogo4405 commented 8 years ago

The library has no built-in feature for this, but I think you can build it yourself with a custom VisualEffect plugin.


// example Custom VisualEffect
final class CameraMixEffect: VisualEffect, AVCaptureVideoDataOutputSampleBufferDelegate {
    let filter:CIFilter? = CIFilter(name: "CISourceOverCompositing")
    var camera:CIImage?
    let lockQueue:dispatch_queue_t = dispatch_queue_create(
        "CameraMixEffect.lock", DISPATCH_QUEUE_SERIAL
    )

    override init() {
        super.init()
    }

    override func execute(image: CIImage) -> CIImage {
        guard let filter:CIFilter = filter else {
            return image
        }
        var result:CIImage = image
        // Synchronize with the capture delegate, which writes `camera`
        // on lockQueue, before reading the latest frame.
        dispatch_sync(lockQueue) {
            guard let camera:CIImage = self.camera else {
                return
            }
            // Composite the latest camera frame over the screen image.
            filter.setValue(camera, forKey: "inputImage")
            filter.setValue(image, forKey: "inputBackgroundImage")
            result = filter.outputImage ?? image
        }
        return result
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
        guard let imageBuffer:CVImageBufferRef = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }
        // Runs on lockQueue (set as the delegate queue below).
        camera = CIImage(CVPixelBuffer: imageBuffer)
    }
}

// next: feed camera frames into the effect and register it on the stream
let session = AVCaptureSession()
let videoOutput = AVCaptureVideoDataOutput()
let camera = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
session.addInput(try! AVCaptureDeviceInput(device: camera))
session.addOutput(videoOutput)

let effect = CameraMixEffect()
// Deliver sample buffers on the effect's lockQueue.
videoOutput.setSampleBufferDelegate(effect, queue: effect.lockQueue)
session.startRunning()

let stream:RTMPStream = RTMPStream()
stream.registerEffect(effect)
pzs7602 commented 8 years ago

I tried this, but the camera image is not shown over the device screen; perhaps the camera image's position or size must be set properly? I can get the image data from the didOutputSampleBuffer method and display it on the screen, which is also what I need. Anyway, thank you very much!
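On the position/size question: `CISourceOverCompositing` places `inputImage` over `inputBackgroundImage` at the same origin, so a full-resolution camera frame will simply cover the screen image unless it is scaled and translated first. A minimal sketch of one way to do that, assuming plain Core Image (the `composite` helper and the target `rect` are my own illustration, not part of the library; written in current Swift spelling rather than the Swift 2 used elsewhere in this thread):

```swift
import CoreImage
import CoreGraphics

// Scale the camera frame into `rect`, then composite it over the screen image.
func composite(camera: CIImage, over screen: CIImage, in rect: CGRect) -> CIImage? {
    let extent = camera.extent
    // Scale the camera frame to the target size, then move it into place.
    let scale = CGAffineTransform(scaleX: rect.width / extent.width,
                                  y: rect.height / extent.height)
    let move = CGAffineTransform(translationX: rect.origin.x,
                                 y: rect.origin.y)
    let placed = camera.transformed(by: scale.concatenating(move))

    guard let filter = CIFilter(name: "CISourceOverCompositing") else {
        return nil
    }
    filter.setValue(placed, forKey: kCIInputImageKey)
    filter.setValue(screen, forKey: kCIInputBackgroundImageKey)
    return filter.outputImage
}
```

Inside the `CameraMixEffect.execute` shown above, the same transform would be applied to `self.camera` before calling `setValue(_:forKey:)`.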

tatuanfpt commented 3 years ago

Does this solution still work? I'm considering adding some UIViews from the screen as a layer over the device camera, for better quality compared with recording the screen (which results in high CPU usage and low FPS).