shogo4405 / HaishinKit.swift

Camera and Microphone streaming library via RTMP and SRT for iOS, macOS, tvOS and visionOS.
BSD 3-Clause "New" or "Revised" License

Audio not heard with RPScreenRecorder#startCapture #1141

Closed. yuvarajb1992 closed this issue 1 year ago.

yuvarajb1992 commented 1 year ago

Describe the bug

@shogo4405 We are capturing the screen of a one-to-one video call with RPScreenRecorder, sending that feed to an RTMP URL, and converting it to an HLS URL for playback. The screen video is captured and sent correctly, but no audio is heard when I play the URL in parallel.

To Reproduce

Is this the correct line to enable audio while capturing? Please also check the screenshot of the code: RPScreenRecorder.shared().isMicrophoneEnabled = true
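For reference, a minimal sketch of in-app capture with RPScreenRecorder feeding a HaishinKit RTMP stream. The URL and stream key are placeholders, and the append call assumes HaishinKit 1.4.x's appendSampleBuffer(_:) API, which infers the track from the buffer's media type; mixing app audio and mic audio into one track is not handled here.

```swift
import ReplayKit
import HaishinKit

final class ScreenStreamer {
    private let connection = RTMPConnection()
    private lazy var stream = RTMPStream(connection: connection)

    func start() {
        connection.connect("rtmp://example.com/live")  // placeholder URL
        stream.publish("streamKey")                    // placeholder key

        let recorder = RPScreenRecorder.shared()
        recorder.isMicrophoneEnabled = true  // captures mic audio, not only app audio
        recorder.startCapture(handler: { sampleBuffer, type, error in
            guard error == nil else { return }
            switch type {
            case .video, .audioApp, .audioMic:
                // Hand every buffer to HaishinKit; it routes by media type.
                self.stream.appendSampleBuffer(sampleBuffer)
            @unknown default:
                break
            }
        }, completionHandler: { error in
            if let error {
                print("startCapture failed: \(error)")
            }
        })
    }
}
```

Note that in-app capture only sees this app's screen and audio; capturing the whole device screen requires a Broadcast Upload Extension, as suggested below in the thread.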

Expected behavior

When playing the RTMP URL, we need to hear the audio from each user's screen capture.

Version

1.4.2

Smartphone info.

No response

Additional context

No response

Screenshots

(screenshot: permission)

Relevant log output

No response

shogo4405 commented 1 year ago

RPScreenRecorder requires RPBroadcastSampleHandler. https://developer.apple.com/documentation/replaykit/rpbroadcastsamplehandler.

Sample code is here: https://github.com/shogo4405/HaishinKit.swift/blob/main/Examples/iOS/Screencast/SampleHandler.swift
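A trimmed sketch of what the linked SampleHandler does: a Broadcast Upload Extension subclasses RPBroadcastSampleHandler and forwards every sample buffer to an RTMPStream. The URL and stream key are placeholders, and the appendSampleBuffer(_:) call assumes HaishinKit 1.4.x's unified API; see the linked file for the full, current example.

```swift
import ReplayKit
import HaishinKit

// Entry point of a Broadcast Upload Extension target.
class SampleHandler: RPBroadcastSampleHandler {
    private let connection = RTMPConnection()
    private lazy var stream = RTMPStream(connection: connection)

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        connection.connect("rtmp://example.com/live")  // placeholder URL
        stream.publish("streamKey")                    // placeholder key
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video, .audioApp, .audioMic:
            // ReplayKit delivers device-wide video, app audio, and mic audio here.
            stream.appendSampleBuffer(sampleBuffer)
        @unknown default:
            break
        }
    }

    override func broadcastFinished() {
        stream.close()
        connection.close()
    }
}
```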

yuvarajb1992 commented 1 year ago

@shogo4405 Thanks for the reply.

Query: Please check the screenshot for the startCapture call.

(screenshot: capture)

Thanks in advance.

shogo4405 commented 1 year ago

Can I see a trace-level log? Please set LBLogger.with(HaishinKitIdentifier).level = .trace.
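The suggested setting as a snippet: LBLogger comes from Logboard, the logging package HaishinKit depends on, and it should be set before connecting so the whole session is traced.

```swift
import HaishinKit
import Logboard

// Enable trace-level logging for everything HaishinKit emits,
// e.g. the RTMPChunk lines shown in the log below.
LBLogger.with(HaishinKitIdentifier).level = .trace
```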

yuvarajb1992 commented 1 year ago

@shogo4405 please find the trace log text file. trace_log.txt

shogo4405 commented 1 year ago

From partway through, it looks like only video messages are being sent and the audio messages stop. What did you do during this time?

2023-23-02 19:51:24.953 [Trace] [com.haishinkit.HaishinKit] [RTMPSocket.swift:48] doOutput(chunk:) > RTMPChunk{size: 0,type: one,streamId: 8,message: Optional(RTMPAudioMessage{type: audio,length: 346,streamId: 1,timestamp: 21,payload: 338 bytes,codec: unknown,soundRate: kHz44,soundSize: snd8bit,soundType: stereo}),fragmented: false,_data: 8 bytes}
2023-23-02 19:51:24.957 [Trace] [com.haishinkit.HaishinKit] [RTMPSocket.swift:48] doOutput(chunk:) > RTMPChunk{size: 0,type: one,streamId: 9,message: Optional(RTMPVideoMessage{type: video,length: 385,streamId: 1,timestamp: 16,payload: 377 bytes,codec: unknown,status: 0}),fragmented: false,_data: 8 bytes}
2023-23-02 19:51:24.974 [Trace] [com.haishinkit.HaishinKit] [RTMPSocket.swift:48] doOutput(chunk:) > RTMPChunk{size: 0,type: one,streamId: 8,message: Optional(RTMPAudioMessage{type: audio,length: 334,streamId: 1,timestamp: 21,payload: 326 bytes,codec: unknown,soundRate: kHz44,soundSize: snd8bit,soundType: stereo}),fragmented: false,_data: 8 bytes}
2023-23-02 19:51:24.978 [Trace] [com.haishinkit.HaishinKit] [RTMPSocket.swift:48] doOutput(chunk:) > RTMPChunk{size: 0,type: one,streamId: 9,message: Optional(RTMPVideoMessage{type: video,length: 2813,streamId: 1,timestamp: 17,payload: 2805 bytes,codec: unknown,status: 0}),fragmented: false,_data: 8 bytes}
2023-23-02 19:51:24.990 [Trace] [com.haishinkit.HaishinKit] [RTMPSocket.swift:48] doOutput(chunk:) > RTMPChunk{size: 0,type: one,streamId: 9,message: Optional(RTMPVideoMessage{type: video,length: 673,streamId: 1,timestamp: 19,payload: 665 bytes,codec: unknown,status: 0}),fragmented: false,_data: 8 bytes}
2023-23-02 19:51:25.009 [Trace] [com.haishinkit.HaishinKit] [RTMPSocket.swift:48] doOutput(chunk:) > RTMPChunk{size: 0,type: one,streamId: 9,message: Optional(RTMPVideoMessage{type: video,length: 399,streamId: 1,timestamp: 14,payload: 391 bytes,codec: unknown,status: 0}),fragmented: false,_data: 8 bytes}
2023-23-02 19:51:25.026 [Trace] [com.haishinkit.HaishinKit] [RTMPSocket.swift:48] doOutput(chunk:) > RTMPChunk{size: 0,type: one,streamId: 9,message: Optional(RTMPVideoMessage{type: video,length: 3055,streamId: 1,timestamp: 18,payload: 3047 bytes,codec: unknown,status: 0}),fragmented: false,_data: 8 bytes}
yuvarajb1992 commented 1 year ago

We are doing one-to-one video call using webRTC in single screen. @shogo4405

shogo4405 commented 1 year ago

It seems that RPScreenRecorder#startCapture stops capturing audio data at some point. I can't resolve this on the HaishinKit side.

yuvarajb1992 commented 1 year ago

@shogo4405 Thanks for your reply. Is there an alternate solution?

Our requirement: we run a WebRTC video call (one-to-one or group) and need to share the entire screen, with both video and audio.

Does HaishinKit support this requirement?

Thanks in advance. 🙂

shogo4405 commented 1 year ago

I have no idea. This seems to be an issue with the combination of WebRTC and ReplayKit rather than with HaishinKit. I wish you good luck! Thanks. I will close this as invalid.