Closed DominikSchaller closed 3 years ago
I also thought about such an approach. I think the stable option is not trivial to realize. When is the connection stable? And then, only increase buffers if needed, never decrease? For me the following would fulfill my needs:
- on a new connection auto buffer is turned on
- when I see that the auto-buffer sizes are constant, I push a (new) button "Stable mode", which turns auto buffers off and increases both buffers by a configurable value, which defaults to 2. For me this button should be positioned under the buffer sliders, because I arrange my windows so that I can always see the ping and overall delay, even if the rest of the settings window is behind the Jamulus main window.
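The proposed "Stable mode" could be sketched roughly like this: freeze the auto algorithm and add a configurable safety margin on top of the current sizes. This is only an illustration; the struct and function names (`JitterBufferState`, `enterStableMode`) are hypothetical and not part of the actual Jamulus code base.

```cpp
#include <algorithm>

// Hypothetical sketch of the proposed "Stable mode" button: freeze the
// auto jitter buffer and add a configurable safety margin (default 2).
// All names here are illustrative, not the Jamulus API.
struct JitterBufferState {
    bool autoEnabled  = true;
    int  localBlocks  = 4;   // client-side jitter buffer size (in blocks)
    int  serverBlocks = 4;   // server-side jitter buffer size (in blocks)
    int  maxBlocks    = 20;  // hard upper limit from the slider range
};

void enterStableMode(JitterBufferState& s, int safetyMargin = 2) {
    s.autoEnabled  = false;  // stop the auto algorithm from shrinking buffers
    s.localBlocks  = std::min(s.localBlocks + safetyMargin, s.maxBlocks);
    s.serverBlocks = std::min(s.serverBlocks + safetyMargin, s.maxBlocks);
}
```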
How is Auto done at the moment? I would do it in the same fashion, just with a different tolerance level. I wouldn't force the user to push a new button on connect. The cool thing about the Auto mode is that the user doesn't need to monitor the connection. He can just connect and play (or sing).
> I find myself and others usually disabling Auto and moving both sliders up 2 or 3 notches.
That's exactly what we do in our choir as well. I would certainly welcome a more "conservative" Auto mode.
> The cool thing about the Auto mode is that the user doesn't need to monitor the connection. He can just connect and play (or sing).
I agree, this would be the best solution. Just curious if it could be done.
I just saw, this was already discussed here a while ago: https://github.com/jamulussoftware/jamulus/issues/417 Maybe someone can consolidate the issues (I don't know how to do this in Github).
I think a simple solution is to create a settable minimum buffer size for the auto jitter buffer on the client side, i.e. the buffer cannot go below a set minimum. Also have a settable maximum buffer size which the buffer cannot go above. Thus, the auto jitter buffer would have user-set range limiting. This would limit dropouts at the small end and limit latency at the big end.
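The range-limiting idea amounts to clamping whatever size the auto algorithm proposes into a user-set interval. A minimal sketch, assuming a hypothetical `applyUserLimits` helper (not actual Jamulus code):

```cpp
#include <algorithm>

// Illustrative sketch of user-set range limiting for the auto jitter
// buffer: the auto algorithm proposes a size in blocks, and the user's
// min/max bounds clamp it before it is applied.
int applyUserLimits(int autoProposedBlocks, int userMin, int userMax) {
    // Below userMin -> more dropouts than the user tolerates;
    // above userMax -> more latency than the user tolerates.
    return std::clamp(autoProposedBlocks, userMin, userMax);
}
```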
Hi,
from a band-point-of-view I can totally agree with this statement:
> I think for a band setting it works great. You want the lowest latency possible; some hiccups in the sound are tolerable.
We regularly do the opposite of what you describe: turn "auto" off and lower the buffer settings to a minimum with just tolerable audio dropouts. Please make sure that it will stay possible to easily tweak for the lowest possible latency.
This seems like a duplicate of this discussion https://github.com/jamulussoftware/jamulus/discussions/1054
In the interests of keeping the debate in one place until a spec or PR for the work can be agreed, I'll close this now.
The Auto checkbox for the buffer sizes works great, but it focuses too much on reducing latency for my taste. I think for a band setting it works great. You want the lowest latency possible; some hiccups in the sound are tolerable. But for a choir setting with >20 singers the hiccups in the sound get annoying. I find myself and others usually disabling Auto and moving both sliders up 2 or 3 notches. This introduces a bit more latency, but the sound is much more stable. Of course one needs to re-adjust regularly if the connection changes.
So my proposal is this: Instead of having the Auto checkbox I suggest a combobox with the following entries:
I'm not great with naming things, so I hope you have better ideas how to name the entries.
I think this would be a useful feature for bigger groups.