Closed · juslesan closed this issue 3 years ago
Thank you for sharing your experience.
Could you please tell me which OS & Node version you are using? We actually have pre-built binaries on npm; we can check whether one exists for your version.
- There do not seem to be any validation checks for the iceServers parameter for connections. I spent quite some time figuring out why I was unable to establish connections between machines. It turns out that I had the incorrect format for the iceServers and got no error reports.
The iceServers parameter has to be an array, and this is checked by the library. Other format checks (protocol, port, etc.) are made by libdatachannel itself.
Could you please write an example of the wrong server config?
- Are there any plans to set a threshold for the onBufferedAmountLow event? Currently it's required to wait for the buffer to be cleared for the event to fire.
I am not sure if I understood correctly, but you can always query the current buffer level with the data channel's bufferedAmount function.
- Will it be possible to increase the maximum message size in the future?
The max message size is limited by libdatachannel. Please check it here:
https://github.com/paullouisageneau/libdatachannel/blob/973f58ec8b2c17d51dc56e5c545131896cd04837/include/rtc/include.hpp#L66
Could you please tell me which OS & Node version you are using? We actually have pre-built binaries on npm; we can check whether one exists for your version.
I'm using macOS Catalina for local development. For performance testing and CI I'm using Ubuntu 20. On Ubuntu the install seems to be faster; on macOS it takes a couple of minutes.
Could you please write an example of the wrong server config?
The faulty config was a list of JSON objects. It works well now with the expected array format.
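For illustration, here is a minimal guard (plain JavaScript, independent of node-datachannel) that catches the kind of mistake described above before the config reaches the peer connection. The accepted entry shape shown (an array of URL strings) is an assumption based on this thread, not a statement of the library's full API:

```javascript
// Minimal pre-flight check for an iceServers option. The library only
// verifies that the value is an array; a bare object slips through
// silently, which is the failure mode described in this thread.
function assertIceServers(iceServers) {
  if (!Array.isArray(iceServers)) {
    throw new TypeError(
      "iceServers must be an array, got " + typeof iceServers
    );
  }
  return iceServers;
}

// Wrong: a bare object instead of an array -> throws early and loudly.
let rejected = false;
try {
  assertIceServers({ urls: "stun:stun.l.google.com:19302" });
} catch (e) {
  rejected = e instanceof TypeError;
}

// Right: an array of server URL strings passes through unchanged.
const servers = assertIceServers(["stun:stun.l.google.com:19302"]);

console.log(rejected, servers.length); // true 1
```

A check like this at the application boundary turns the silent connection failure into an immediate, descriptive error.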
I am not sure if I understood correctly, but you can always query the current buffer level with the data channel's bufferedAmount function.
I mean specifically the limit that is used to emit the onBufferedAmountLow event. Currently it can only be emitted when the buffer size is 0. Would it be possible to increase this limit for the event? It would allow more continuous sending of data, instead of waiting for the buffer to clear.
Max message is limited by libdatachannel.
So is it possible to manipulate the description messages to increase the max_message_size?
Thanks for the response!
Could you please try the 0.0.11-dev branch?
You can set the buffered-amount threshold like this: dc1.setBufferedAmountLowThreshold(100);
Also, please test Jest with npm run test. I added a cleanup function and now it is exiting normally on my side.
I tried it and the install is much faster already! The setBufferedAmountLowThreshold function doesn't seem to work yet, although I can access the binding. Tests also seemed to be more stable, but I can't try it on CI yet.
The setBufferedAmountLowThreshold function is working as far as I can see. Please re-check it.
About the max message size:
The default is 65536, which is defined in libdatachannel.
The local max message size (LOCAL_MAX_MESSAGE_SIZE) is 262144.
The effective max message size is calculated as min(remoteMax, LOCAL_MAX_MESSAGE_SIZE).
If both sides are using libdatachannel, then the limit is LOCAL_MAX_MESSAGE_SIZE.
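The calculation above can be sketched in a few lines (constants are taken from this thread; the function name is illustrative, not part of any library):

```javascript
// Effective max message size negotiation as described in this thread:
// each peer advertises a max in its SDP, and the usable limit is the
// minimum of the remote peer's advertised max and the local cap.
const DEFAULT_MAX_MESSAGE_SIZE = 65536; // libdatachannel default
const LOCAL_MAX_MESSAGE_SIZE = 262144;  // libdatachannel local cap

function effectiveMaxMessageSize(remoteMax) {
  return Math.min(remoteMax, LOCAL_MAX_MESSAGE_SIZE);
}

// Both sides libdatachannel: the remote advertises 262144 as well.
console.log(effectiveMaxMessageSize(262144)); // 262144
// A peer advertising only the 65536 default caps the link there.
console.log(effectiveMaxMessageSize(DEFAULT_MAX_MESSAGE_SIZE)); // 65536
```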
I am not sure how to change the SDP maxMessage parameter. I will ask the libdatachannel author.
If it is OK, I can add the necessary function to change maxMessage.
I am using GitHub CI for creating the pre-built binaries. It uses macos-latest as the macOS runner. Is that suitable for you?
You can check the pre-built binaries on the release pages.
setBufferedAmountLowThreshold seems to work correctly, the error was at my end. Thanks!
The prebuilt binaries seem to be fine. The installation is much faster as there is no need to run cmake.
I'm waiting to hear any news about the max message size! It would be great if the max message size could be increased a bit. Otherwise we will need to start fragmenting larger messages.
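If the cap stays in place, fragmenting at the application layer is the usual workaround mentioned above. A hedged sketch (plain JavaScript, independent of node-datachannel; the framing format and chunk size here are assumptions for illustration, not library behavior):

```javascript
// Application-level fragmentation: split a large payload into chunks no
// bigger than maxSize, tag each with its index and total count, and
// reassemble on the receiving side. The framing is illustrative only.
function fragment(message, maxSize) {
  const pieces = [];
  for (let i = 0; i < message.length; i += maxSize) {
    pieces.push(message.slice(i, i + maxSize));
  }
  return pieces.map((data, index) => ({
    index,
    total: pieces.length,
    data,
  }));
}

function reassemble(fragments) {
  // Fragments may arrive out of order; sort by index before joining.
  return fragments
    .slice()
    .sort((a, b) => a.index - b.index)
    .map((f) => f.data)
    .join("");
}

const big = "x".repeat(300000);      // larger than the 262144 cap
const parts = fragment(big, 262144); // -> 2 tagged fragments
console.log(parts.length, reassemble(parts) === big); // 2 true
```

In practice each fragment object would be serialized before sending and the receiver would buffer fragments per message id until `total` of them arrive; that bookkeeping is omitted here for brevity.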
The local max message size is 262144. Isn't this enough for you?
For bigger sizes, we would have to change the libdatachannel source files, and this is not planned for now.
Hi!
The current local message size should be enough. However, we may have users that want to send bigger messages. Moreover, we'll be implementing connectivity to browsers in the overlay network in the future. My understanding is that the maxMessageSize for inter-library communication is 65536? The connections between browsers and node-datachannel connections would be limited to this. We could do a workaround by fragmenting the messages in the application layer or by setting limits, but it would be easier for us to be able to send bigger messages if needed.
If there are no plans to allow changing the maxMessageSize, then there's no need to prioritise it for now. It would be a nice feature in the future though!
Thanks!
My understanding is that the maxMessageSize for inter-library communication is 65536?
No, the current maxMessageSize is 262144. If you are getting 65536 as the max, then maybe the browser is limiting it?
No, the current maxMessageSize is 262144.
Okay, good! I haven't actually tried to connect with a browser yet. I previously misunderstood how the default max message size of 65536 bytes works in the libdatachannel source code.
Thanks for all of the updates and answers!
Hello!
I have been working on implementing this library to be used as the backbone of the Streamr Network. So far your library has been working well, great work! I have some questions and suggestions related to the current implementation of this library that are somewhat related to the libdatachannel. I also have one issue to report.
- Installing the library takes quite a while currently. Would it be in any way possible to optimise this? I have been using the npm package for installs.
- There do not seem to be any validation checks for the iceServers parameter for connections. I spent quite some time figuring out why I was unable to establish connections between machines. It turns out that I had the incorrect format for the iceServers and got no error reports.
- Are there any plans to set a threshold for the onBufferedAmountLow event? Currently it's required to wait for the buffer to be cleared for the event to fire.
- Will it be possible to increase the maximum message size in the future?
- Are there any plans to implement negotiated dataChannels?
- There seems to be an issue with the data channels sometimes not closing properly. This leads to issues when testing with Jest. CI often gets completely stuck even if Jest is started with --forceExit. Hopefully you could look into this problem!
Thanks! /Santeri