MartinP7r closed this issue 4 years ago.
Good morning~ lol
Are you using the latest version of the library? Could you also enable debug logs (Logger.setLogLevel(LogLevel.TRACE)) and post the output?
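A minimal sketch of where that call could go (my assumption, not from the thread; the module name is taken from the framework name in the logs below and may differ):

import mediasoup_client_ios

// Enable trace-level logging once, early in app startup, before any
// mediasoup objects are created, so that everything afterwards is traced.
func enableMediasoupTraceLogging() {
    Logger.setLogLevel(LogLevel.TRACE)
}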
Thanks for the quick response!
I'm using version 1.4.1.
Setting the log level as above doesn't display anything new in the console (or does the output of your logger go somewhere else?)
This is the usual output in the console (with or without your logging enabled):
objc[296]: Class RTCEncodedImage is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017418) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017bfc38). One of the two will be used. Which one is undefined.
objc[296]: Class RTCRtpFragmentationHeader is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017468) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017bfc88). One of the two will be used. Which one is undefined.
objc[296]: Class RTCVideoCapturer is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017490) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017bfcb0). One of the two will be used. Which one is undefined.
objc[296]: Class RTCVideoCodecInfo is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x1020174e0) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017bfd00). One of the two will be used. Which one is undefined.
objc[296]: Class RTCVideoEncoderQpThresholds is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017530) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017bfd50). One of the two will be used. Which one is undefined.
objc[296]: Class RTCVideoEncoderSettings is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x1020175a8) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017bfdc8). One of the two will be used. Which one is undefined.
objc[296]: Class RTCVideoFrame is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x1020175d0) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017bfdf0). One of the two will be used. Which one is undefined.
objc[296]: Class RTCNativeAudioSessionDelegateAdapter is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017300) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017c0e80). One of the two will be used. Which one is undefined.
objc[296]: Class RTCAudioSession is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017350) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017c0ed0). One of the two will be used. Which one is undefined.
objc[296]: Class RTCAudioSessionConfiguration is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x1020173a0) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017c0f20). One of the two will be used. Which one is undefined.
objc[296]: Class RTCDispatcher is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017648) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017c0f98). One of the two will be used. Which one is undefined.
objc[296]: Class RTCCameraPreviewView is implemented in both /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/mediasoup_client_ios.framework/mediasoup_client_ios (0x102017670) and /private/var/containers/Bundle/Application/9690EDEC-7FA0-47F3-90FF-5ADF5DBB918E/VideoConf.app/Frameworks/WebRTC.framework/WebRTC (0x1017c0fc0). One of the two will be used. Which one is undefined.
2020-06-05 10:58:57.829782+0900 VideoConf[296:6972] Metal API Validation Enabled
Side note: I don't think it matters, but I'm using my own socket.io backend for signaling.
Does your backend return data for the produce request?
Also, could you share the rtpCapabilities used when loading the device? There seems to be an issue where, if the device is loaded with incorrect rtpCapabilities, the listener doesn't fire. However, the onConnect listener seems to be working for you, so it may be another issue.
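For reference, a minimal sketch of the loading step being discussed, assuming the Device API used in the sample app (names and signatures here are my assumption, not quoted from the library):

// Sketch only: Device(), load(_:) and isLoaded() are assumed to match the sample app.
let device = Device()

// routerRtpCapabilities is the JSON string fetched from the server,
// e.g. the payload posted further down in this thread.
device.load(routerRtpCapabilities)

if device.isLoaded() {
    // Transports should only be created, and produce called, after a successful load.
}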
I've been restructuring things a little and have a short side question, if you don't mind. In your sample app:
private class SendTransportHandler : NSObject, SendTransportListener {
    fileprivate weak var delegate: SendTransportListener?
    private var parent: RoomClient
    // ...
}

and then:

self.sendTransportHandler = SendTransportHandler.init(parent: self)
self.sendTransportHandler!.delegate = self.sendTransportHandler!
Could you explain the reasoning behind having a separate class for delegating the SendTransportListener, and then having that class delegate to itself while keeping a strong reference to its parent?
As for your question:
The backend returns data for the produce request, but now it's cutting off with:
The associated promise has been destructed prior to the associated state becoming ready.
I'm thinking this is socket.io related. I'm using https://github.com/socketio/socket.io-client-swift for the connection and looking into it.
rtpCapabilities are as follows:
{
"codecs" : [
{
"clockRate" : 48000,
"preferredPayloadType" : 100,
"parameters" : {
},
"kind" : "audio",
"mimeType" : "audio\/opus",
"channels" : 2,
"rtcpFeedback" : [
]
},
{
"parameters" : {
"x-google-start-bitrate" : 1000
},
"mimeType" : "video\/VP8",
"rtcpFeedback" : [
{
"type" : "nack"
},
{
"parameter" : "pli",
"type" : "nack"
},
{
"parameter" : "fir",
"type" : "ccm"
},
{
"type" : "goog-remb"
},
{
"type" : "transport-cc"
}
],
"kind" : "video",
"preferredPayloadType" : 101,
"clockRate" : 90000
},
{
"parameters" : {
"apt" : 101
},
"mimeType" : "video\/rtx",
"rtcpFeedback" : [
],
"kind" : "video",
"preferredPayloadType" : 102,
"clockRate" : 90000
},
{
"parameters" : {
"profile-id" : 2,
"x-google-start-bitrate" : 1000
},
"mimeType" : "video\/VP9",
"rtcpFeedback" : [
{
"type" : "nack"
},
{
"parameter" : "pli",
"type" : "nack"
},
{
"parameter" : "fir",
"type" : "ccm"
},
{
"type" : "goog-remb"
},
{
"type" : "transport-cc"
}
],
"kind" : "video",
"preferredPayloadType" : 103,
"clockRate" : 90000
},
{
"mimeType" : "video\/rtx",
"parameters" : {
"apt" : 103
},
"preferredPayloadType" : 104,
"kind" : "video",
"clockRate" : 90000,
"rtcpFeedback" : [
]
},
{
"mimeType" : "video\/H264",
"parameters" : {
"packetization-mode" : 1,
"profile-level-id" : "4d0032",
"level-asymmetry-allowed" : 1,
"x-google-start-bitrate" : 1000
},
"preferredPayloadType" : 105,
"kind" : "video",
"clockRate" : 90000,
"rtcpFeedback" : [
{
"type" : "nack"
},
{
"parameter" : "pli",
"type" : "nack"
},
{
"parameter" : "fir",
"type" : "ccm"
},
{
"type" : "goog-remb"
},
{
"type" : "transport-cc"
}
]
},
{
"mimeType" : "video\/rtx",
"parameters" : {
"apt" : 105
},
"preferredPayloadType" : 106,
"kind" : "video",
"clockRate" : 90000,
"rtcpFeedback" : [
]
},
{
"mimeType" : "video\/H264",
"parameters" : {
"x-google-start-bitrate" : 1000,
"packetization-mode" : 1,
"profile-level-id" : "42e01f",
"level-asymmetry-allowed" : 1
},
"preferredPayloadType" : 107,
"kind" : "video",
"clockRate" : 90000,
"rtcpFeedback" : [
{
"type" : "nack"
},
{
"parameter" : "pli",
"type" : "nack"
},
{
"parameter" : "fir",
"type" : "ccm"
},
{
"type" : "goog-remb"
},
{
"type" : "transport-cc"
}
]
},
{
"mimeType" : "video\/rtx",
"parameters" : {
"apt" : 107
},
"preferredPayloadType" : 108,
"kind" : "video",
"clockRate" : 90000,
"rtcpFeedback" : [
]
}
],
"headerExtensions" : [
{
"preferredId" : 1,
"preferredEncrypt" : false,
"kind" : "audio",
"direction" : "recvonly",
"uri" : "urn:ietf:params:rtp-hdrext:sdes:mid"
},
{
"preferredId" : 1,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "recvonly",
"uri" : "urn:ietf:params:rtp-hdrext:sdes:mid"
},
{
"preferredId" : 2,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "recvonly",
"uri" : "urn:ietf:params:rtp-hdrext:sdes:rtp-stream-id"
},
{
"preferredId" : 3,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "recvonly",
"uri" : "urn:ietf:params:rtp-hdrext:sdes:repaired-rtp-stream-id"
},
{
"preferredId" : 4,
"preferredEncrypt" : false,
"kind" : "audio",
"direction" : "sendrecv",
"uri" : "http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/abs-send-time"
},
{
"preferredId" : 4,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "sendrecv",
"uri" : "http:\/\/www.webrtc.org\/experiments\/rtp-hdrext\/abs-send-time"
},
{
"preferredId" : 5,
"preferredEncrypt" : false,
"kind" : "audio",
"direction" : "inactive",
"uri" : "http:\/\/www.ietf.org\/id\/draft-holmer-rmcat-transport-wide-cc-extensions-01"
},
{
"preferredId" : 5,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "inactive",
"uri" : "http:\/\/www.ietf.org\/id\/draft-holmer-rmcat-transport-wide-cc-extensions-01"
},
{
"preferredId" : 6,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "sendrecv",
"uri" : "http:\/\/tools.ietf.org\/html\/draft-ietf-avtext-framemarking-07"
},
{
"preferredId" : 7,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "sendrecv",
"uri" : "urn:ietf:params:rtp-hdrext:framemarking"
},
{
"preferredId" : 10,
"preferredEncrypt" : false,
"kind" : "audio",
"direction" : "sendrecv",
"uri" : "urn:ietf:params:rtp-hdrext:ssrc-audio-level"
},
{
"preferredId" : 11,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "sendrecv",
"uri" : "urn:3gpp:video-orientation"
},
{
"preferredId" : 12,
"preferredEncrypt" : false,
"kind" : "video",
"direction" : "sendrecv",
"uri" : "urn:ietf:params:rtp-hdrext:toffset"
}
],
"fecMechanisms" : [
]
}
I got it working now. It seems the problem was with the Socket.IO implementation.
I had been using a regular closure-based approach. After switching to a semaphore, like you did in your sample app, it now works for the most part. Sometimes I still get errors that seem to come from requests being processed out of order.
before:
func request(_ event: String,
             data: SocketData = [],
             _ completion: @escaping (_ data: JSON) -> Void = { _ in }) {
    client.emitWithAck(event, data).timingOut(after: self.TO) { data in
        do {
            let json = try self.data2JSON(data: data)
            completion(json)
        } catch {
            l.error(error)
        }
    }
}
now:
func syncRequest(_ event: String,
                 data: SocketData = []) -> JSON? {
    let semaphore: DispatchSemaphore = DispatchSemaphore.init(value: 0)
    var response: JSON?
    let queue: DispatchQueue = DispatchQueue.global()
    queue.async {
        self.client.emitWithAck(event, data).timingOut(after: self.TO) { data in
            do {
                response = try self.data2JSON(data: data)
            } catch {
                l.error(error)
            }
            // Signal on both the success and the error path, so the caller
            // doesn't block for the full timeout when parsing fails.
            semaphore.signal()
        }
    }
    _ = semaphore.wait(timeout: .now() + 10.0)
    return response
}
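One possible way to address the remaining out-of-order errors (a sketch of my own, not from the thread): funnel every signaling request through a single serial queue inside the same class that owns syncRequest, so each request completes before the next one is emitted.

// Sketch: serialize signaling requests. signalingQueue is an illustrative
// name, not something from the library or the sample app.
private let signalingQueue = DispatchQueue(label: "signaling.requests.serial")

func orderedRequest(_ event: String, data: SocketData = []) -> JSON? {
    var response: JSON?
    // sync on a serial queue guarantees that one request is emitted and awaited
    // at a time, in the order the callers issued them.
    signalingQueue.sync {
        response = self.syncRequest(event, data: data)
    }
    return response
}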
Edit: Sorry to bother you so much. It seems like this is solely a Socket.IO issue, so I'm going to close this.
Great, glad to see you got it working.
The reason I separated the Recv/Send listeners is that they share the same name (onConnect), so I couldn't use both of them in the same class. I'm considering renaming the events to onSendTransportConnect / onRecvTransportConnect; however, doing so would make them differ from the original libmediasoup listeners.
Ah, thanks for clearing that up!
If your RoomClient conformed to both listeners, wouldn't they both be able to call the same onConnect()? The function signature is the same, and they both call your handleLocalTransportConnectEvent(transport: transport, dtlsParameters: dtlsParameters).
Edit: Well, ok, I just saw that ConsumerListener and ProducerListener also have identical delegate method signatures for onTransportClose(_:), except for the parameter type, which makes them throw an error when implemented on the same class:
Method 'onTransportClose' with Objective-C selector 'onTransportClose:' conflicts with previous declaration with the same Objective-C selector
Similar problem. But maybe this could be avoided by changing the signature to onTransportClose(consumer: Consumer!) instead of onTransportClose(_ consumer: Consumer!), so that the selectors are unique... I don't know much about Objective-C, so that's only a wild guess.
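To illustrate that guess with a self-contained sketch (Producer, Consumer and the protocol declarations below are stand-ins of my own, not the library's actual types): with distinct argument labels, Swift generates distinct Objective-C selectors, so one class can conform to both protocols.

import Foundation

// Stand-in types so the sketch compiles on its own.
class Producer: NSObject {}
class Consumer: NSObject {}

// Labeled parameters map to distinct selectors
// (onTransportCloseWithProducer: and onTransportCloseWithConsumer:).
@objc protocol ProducerListener {
    func onTransportClose(producer: Producer)
}

@objc protocol ConsumerListener {
    func onTransportClose(consumer: Consumer)
}

// With unique selectors, a single class can adopt both listener protocols.
class RoomClient: NSObject, ProducerListener, ConsumerListener {
    func onTransportClose(producer: Producer) { /* handle producer close */ }
    func onTransportClose(consumer: Consumer) { /* handle consumer close */ }
}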
Edit 2: I just tried it as follows and it seems to work just fine:
func onConnect(_ transport: Transport!, dtlsParameters: String!) {
    let params = JSON(parseJSON: dtlsParameters).dictionaryObject
    if transport is RecvTransport {
        // connect consumer transport request
    }
    if transport is SendTransport {
        // connect producer transport request
    }
}
Thank you for all your hard work! 🙂
Thanks for creating this framework.
I've been mostly sticking to the implementation of your sample app https://github.com/ethand91/mediasoup-ios-client-sample, but when produce is called, it fails silently (test is not printed) and the RTCEAGLVideoView (or RTCMTLVideoView) used to render the VideoTrack stops showing the video input. I can confirm that the VideoView is working by not calling produce.
Do you have any idea what I'm missing?
Best Regards, Martin