dmyma closed this issue 6 years ago
Hi @dmyma, can you elaborate a little bit more on the question?
Thank you for the reply, @robjperez. I am trying to implement a Broadcast Extension (to stream the screen from outside the app) with OpenTok. There are samples for sharing the screen in-app, so my question is whether it is possible to stream out-of-app; I am struggling to implement it.
iOS won't allow you to capture other apps' content in real time. You can freely record what's in your own app, but you cannot put your app in the background and keep recording the screen. See our Screen Sharing sample for more details.
Apple introduced ReplayKit some years ago, but that only allows you to record a video capturing any content from your phone. After finishing the recording you can send it anywhere, but it cannot be done in real time.
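For reference, that in-app flow is roughly the following; a minimal sketch, where the presenting view controller is a placeholder:

```swift
import ReplayKit
import UIKit

// Minimal sketch: in-app recording with RPScreenRecorder. The recording only
// covers this app's own screen, and the clip is only available after the
// recording stops, so nothing is streamed out in real time.
func startInAppRecording() {
    RPScreenRecorder.shared().startRecording { error in
        if let error = error {
            print("Could not start recording: \(error.localizedDescription)")
        }
    }
}

// `presenter` is a placeholder for whatever view controller is on screen.
func stopInAppRecording(presenter: UIViewController) {
    RPScreenRecorder.shared().stopRecording { previewController, error in
        // The preview controller lets the user trim, save, or share the clip.
        if let previewController = previewController {
            presenter.present(previewController, animated: true)
        }
    }
}
```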
Hope it helps
@robjperez : Please refer to this video from WWDC 2018: https://developer.apple.com/videos/play/wwdc2018/601/
With ReplayKit 2 it IS possible to perform "iOS System Broadcast".
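For anyone finding this later: the extension side is a subclass of RPBroadcastSampleHandler. A minimal sketch of its entry points (the actual OpenTok wiring is omitted):

```swift
import ReplayKit
import CoreMedia

// Minimal sketch of a Broadcast Upload Extension's entry points. iOS calls
// processSampleBuffer with every frame of the whole device screen, even
// while the host app is in the background.
class SampleHandler: RPBroadcastSampleHandler {

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // Connect to the session here (e.g. create an OpenTok publisher).
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            // Forward the video frame to the publisher's capturer.
            break
        case .audioApp, .audioMic:
            break
        @unknown default:
            break
        }
    }

    override func broadcastFinished() {
        // Disconnect from the session.
    }
}
```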
same question
Same question here. Is it possible using ReplayKit 2 to broadcast the device's screen to a TokBox session?
App extensions are memory-limited: only 50 MB is allowed when running on my phone. TokBox uses over 70 MB, and the extension gets killed by the system.
Has anyone made a successful attempt?
Hey @tuanit09, same here :( Have you found any solution?
@robjperez It would be great if you could give us an update on this. The Broadcast-Ext project is not working no matter how I change the way frames are consumed; it always exceeds the 50 MB memory limit.
Hey @mehmetbaykar, did you find any solution for Broadcast-Ext?
Hi @yashukla47,
You can check out this repo and make these changes:
- Change Constants
- Change the SDK version to 2.18.1

This was the only solution I had for the iPhone 11.
Thanks, brother @mehmetbaykar, will try these changes. I hope it works this time.
Hey @mehmetbaykar, I am still facing the problem. It looks like I am able to send the screen-share data from the publisher side, but it is not being received on the other end.
Any idea?
Hey @yashukla47, how do you subscribe to the screen video and add it to your view hierarchy?
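For reference, the usual pattern looks like this; a minimal sketch assuming the standard OpenTok iOS API, with `containerView` as a placeholder for wherever the video should go:

```swift
import OpenTok
import UIKit

class ScreenShareViewController: UIViewController, OTSessionDelegate {
    let containerView = UIView() // placeholder for the video's parent view

    // Subscribe when the screen-share stream appears, then add its view.
    func session(_ session: OTSession, streamCreated stream: OTStream) {
        guard stream.videoType == .screen,
              let subscriber = OTSubscriber(stream: stream, delegate: nil) else { return }
        var error: OTError?
        session.subscribe(subscriber, error: &error)
        if error == nil, let view = subscriber.view {
            view.frame = containerView.bounds
            containerView.addSubview(view)
        }
    }

    // Remaining OTSessionDelegate requirements, omitted for brevity.
    func sessionDidConnect(_ session: OTSession) {}
    func sessionDidDisconnect(_ session: OTSession) {}
    func session(_ session: OTSession, streamDestroyed stream: OTStream) {}
    func session(_ session: OTSession, didFailWithError error: OTError) {}
}
```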
Hey, now it's working fine. Changing the SDK to the version you suggested, i.e. 2.18.1, worked for me.
Thanks
Hi, does anyone have this issue? I start screen sharing, but I get this error: "Broadcast stopped because you attempted to start an invalid broadcast session." :(( Has anyone experienced this?
@yashukla47 @mehmetbaykar I'm facing the invalid broadcast error even after setting the constants to 0.3 and 2 for processSecondFrame, so I think this is going over 50 MB. Have you processed it with the sample buffer?
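For what it's worth, one debugging aid: when the extension itself detects a fatal condition (a setup failure, running close to the memory cap), it can end the broadcast with a descriptive error, and the system shows that error's description to the user instead of the generic message. A minimal sketch, assuming your handler class is named SampleHandler; the domain and code are placeholders:

```swift
import ReplayKit

extension SampleHandler {
    // Minimal sketch: end the broadcast with a descriptive error so the user
    // sees this message rather than the generic "invalid broadcast session".
    func stopBroadcast(message: String) {
        let error = NSError(domain: "com.example.broadcast", // placeholder domain
                            code: 1,
                            userInfo: [NSLocalizedDescriptionKey: message])
        finishBroadcastWithError(error)
    }
}
```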
@mehmetbaykar I'm having the same issue. I'm able to send a couple of frames to the other clients on the session before the 50 MB limit causes the extension to exit, and the phone shows the invalid broadcast session message. I'm sending 6 frames per second and have scaled them down to 111x240 pixels (in portrait).
I'm not sure what the pixel buffer pool is for; it looks like only one buffer is ever used.
I'm currently using OpenTok 2.21.2. Is the suggestion above advocating a downgrade?
I also implemented the extension in Swift. Should I switch to Obj-C?
Pared down the broadcast extension's dependencies to ReplayKit, OpenTok, and OSLog, with the processSampleBuffer below. Instruments still clocks the process at a steady state of around 330 MB (305 MB if I remove OSLog). Can anyone do better?
```swift
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case .video:
        guard capturing, let videoCaptureConsumer = videoCaptureConsumer else {
            return
        }
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        // Throttle: despite its name, desiredFrameRate is used here as a
        // minimum inter-frame interval in seconds; frames arriving sooner
        // than that are dropped.
        if let last = lastTimestamp {
            let delta = CMTimeSubtract(pts, last).seconds
            guard delta > desiredFrameRate else { return }
        }
        lastTimestamp = pts
        guard let ciContext = ciContext else { return }
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            print("Failed to get imageBuffer from sampleBuffer")
            return
        }
        var (width, height) = inspectBuffer(imageBuffer: imageBuffer)
        let orientation = orientationFromFrame(sampleBuffer: sampleBuffer)
        let scale: CGFloat = 0.25
        do {
            width = Int(CGFloat(width) * scale)
            height = Int(CGFloat(height) * scale)
            // Reallocate the destination buffer only when the frame size changes.
            if resizedBuffer == nil || imageWidth != width || imageHeight != height {
                imageWidth = width
                imageHeight = height
                let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                                 imageWidth,
                                                 imageHeight,
                                                 kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                                 nil,
                                                 &resizedBuffer)
                guard status == kCVReturnSuccess else {
                    throw SSError(message: "Failed allocating pixel buffer")
                }
            }
            guard let pixelBuffer = resizedBuffer else {
                throw SSError(message: "resized buffer is nil")
            }
            // Downscale with Core Image, then hand the frame to OpenTok.
            let ciImage = try scaleFilterImage(input: CIImage(cvImageBuffer: imageBuffer, options: nil),
                                               scale: scale)
            ciContext.render(ciImage, to: pixelBuffer)
            videoCaptureConsumer.consumeImageBuffer(pixelBuffer,
                                                    orientation: orientation,
                                                    timestamp: pts,
                                                    metadata: nil)
        } catch {
            print("Error processing frame: \(error.localizedDescription)")
        }
    case .audioApp, .audioMic:
        break
    @unknown default:
        print("Unknown type of sample buffer")
    }
}

func inspectBuffer(imageBuffer: CVImageBuffer) -> (Int, Int) {
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)
    return (width, height)
}

func scaleFilterImage(input: CIImage, scale: CGFloat) throws -> CIImage {
    // scaleFilter is expected to be created once (e.g. a CILanczosScaleTransform)
    // and reused for every frame.
    guard let scaleFilter = scaleFilter else {
        throw SSError(message: "scaleFilter not set")
    }
    scaleFilter.setValue(input, forKey: kCIInputImageKey)
    scaleFilter.setValue(scale, forKey: kCIInputScaleKey)
    guard let output = scaleFilter.outputImage else {
        throw SSError(message: "Failed scaling image")
    }
    return output
}
```
I found a better way to compress the pixel buffer using the Accelerate framework. It uses the CPU, not the GPU. On iPhone you should resize by at least 0.5, and on iPad by at least 0.4. Here is an extension you can use inside processSampleBuffer.
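A minimal sketch along those lines (assuming 32BGRA input; the biplanar YUV buffers ReplayKit delivers by default would need each plane scaled separately with the planar vImage calls):

```swift
import Accelerate
import CoreVideo

extension CVPixelBuffer {
    // Minimal sketch: downscale a 32BGRA pixel buffer on the CPU with
    // vImageScale_ARGB8888 (Lanczos resampling). Returns nil on failure.
    func downscaled(by scale: CGFloat) -> CVPixelBuffer? {
        CVPixelBufferLockBaseAddress(self, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(self, .readOnly) }

        let srcWidth = CVPixelBufferGetWidth(self)
        let srcHeight = CVPixelBufferGetHeight(self)
        let dstWidth = Int(CGFloat(srcWidth) * scale)
        let dstHeight = Int(CGFloat(srcHeight) * scale)

        var srcBuffer = vImage_Buffer(
            data: CVPixelBufferGetBaseAddress(self),
            height: vImagePixelCount(srcHeight),
            width: vImagePixelCount(srcWidth),
            rowBytes: CVPixelBufferGetBytesPerRow(self))

        var output: CVPixelBuffer?
        guard CVPixelBufferCreate(kCFAllocatorDefault, dstWidth, dstHeight,
                                  kCVPixelFormatType_32BGRA, nil, &output) == kCVReturnSuccess,
              let resized = output else { return nil }

        CVPixelBufferLockBaseAddress(resized, [])
        defer { CVPixelBufferUnlockBaseAddress(resized, []) }

        var dstBuffer = vImage_Buffer(
            data: CVPixelBufferGetBaseAddress(resized),
            height: vImagePixelCount(dstHeight),
            width: vImagePixelCount(dstWidth),
            rowBytes: CVPixelBufferGetBytesPerRow(resized))

        // Passing nil lets vImage manage its own temporary buffer.
        let err = vImageScale_ARGB8888(&srcBuffer, &dstBuffer, nil, vImage_Flags(kvImageNoFlags))
        return err == kvImageNoError ? resized : nil
    }
}
```

You would call something like `imageBuffer.downscaled(by: 0.5)` in place of the Core Image path in the snippet above.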