Closed: svprdga closed this issue 3 years ago.

Describe the bug
I am trying to set up a screencast in my app to record everything that happens on the screen. Right now I am able to broadcast the video using a broadcast extension via ReplayKit, but the audio is not included in the stream.

Expected behavior
Both audio and video are included in the broadcast.

Actual behavior
The screencast only includes video, no audio at all.

Additional context
My SampleHandler:
Are you expecting to include microphone audio or audio coming from the phone? From your code above it looks like you are trying to include microphone audio. This should come through as long as you manually turn the microphone on in the broadcast extension screen (it should turn red if the microphone is on).
@pqseags for now I am trying to include the microphone. I do enable the microphone button in the broadcast extension screen, but without luck.
By the way, is it possible to include internal audio? Do you know how to do it?
If you want to include internal audio from the apps that are being broadcast, just move the .appendSampleBuffer line from the .audioMic case to the .audioApp case. Don't include it in both, as that will cause problems. You can also try that as a debugging step, to see whether it is a mic-specific issue or an issue with audio in general. Nothing else looks wrong with your code at a glance.
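For reference, a minimal sketch of the audio cases only, assuming rtmpStream is the HaishinKit RTMPStream being published (the .video case is omitted here):

override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case .audioApp:
        // Internal audio from the apps being broadcast goes to the stream.
        rtmpStream.appendSampleBuffer(sampleBuffer, withType: .audio)
    case .audioMic:
        // Intentionally dropped: appending both app and mic audio causes problems.
        break
    default:
        // .video handling omitted in this sketch.
        break
    }
}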
Thanks @pqseags, I managed to make it work.
@svprdga can you please share your source code? I want to implement the same thing.
@HamzaKiyani43 my source code contains business rules specific to my app that alter what a base implementation would look like. I can, however, share how I implemented processSampleBuffer():
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case .video:
        // Configure the encoder from the dimensions of the incoming frames.
        if let description = CMSampleBufferGetFormatDescription(sampleBuffer) {
            let dimensions = CMVideoFormatDescriptionGetDimensions(description)
            rtmpStream.videoSettings = [
                .width: dimensions.width,
                .height: dimensions.height,
                .bitrate: 2000000,
                .profileLevel: kVTProfileLevel_H264_Baseline_AutoLevel
            ]
        }
        rtmpStream.appendSampleBuffer(sampleBuffer, withType: .video)
    case .audioApp:
        // Internal (app) audio only; the mic case below is left empty on purpose.
        rtmpStream.appendSampleBuffer(sampleBuffer, withType: .audio)
    case .audioMic:
        break
    @unknown default:
        break
    }
}
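One thing worth noting: videoSettings is reassigned on every video buffer, which is redundant once the dimensions are known; if you want, you can guard it so the settings are only applied when the dimensions actually change. And as discussed above, audio is appended only for the .audioApp case, never for both app and mic.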
If you have a more specific question, ask and I will try to help if I can.
@svprdga brother, I want to broadcast my screen, which shows a camera view with some images on top of it. I want to broadcast all of the content (both the camera view and the images, plus audio). Can you help me with that?
@HamzaKiyani43, take a look at the SampleHandler example provided in this library; it illustrates basic video streaming: https://github.com/shogo4405/HaishinKit.swift/blob/master/Examples/iOS/Screencast/SampleHandler.swift
@svprdga I have configured this extension, but I need more than basic video streaming. I want to stream my screen as shown in the attached picture, where my mobile screen has two images in the top corners and a scorecard webview at the bottom. This is a screenshot taken from the application PRISM Live Studio: https://apps.apple.com/us/app/prism-live-studio/id1319056339