Madhu205 opened this issue 8 years ago:

Is it possible to play a song on multiple renderers at a time? Presently this feature is not available in current control points. If it is possible, how would one implement it? Thanks for any information.
As far as I know, there is no such feature considered in UPnP or DLNA. It would require some millisecond-exact tuning of time differences between renderers, and there is simply no mechanism defined to do so. So - I would suspect not.
I am thinking of implementing a control point which has a feature for selecting multiple renderers instead of one renderer at a time. That way it may be possible to achieve synchronization between multiple renderers. What do you say?
You still need to account for various speeds of renderers: if one starts a couple of milliseconds earlier than the other, you will have a strong echo effect (say you have the renderers in different rooms). So there needs to be something in the protocol that can synchronize the renderers between each other. A control point only sends a URL over and possibly the 'start', but does not actively monitor if and when renderers actually start.
Since UPnP does not have any description of such a synchronization protocol (that I know of), one could think of a proprietary extension to gmrender.
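To make the limitation concrete, here is a minimal sketch (Python, standard library only) of what such a "select multiple renderers" control point boils down to. The control URLs and the media URL are placeholders that a real control point would obtain via SSDP discovery; the point is that the Play requests go out one after another over HTTP, and nothing bounds the gap between the moments the renderers actually start:

```python
# Minimal sketch of a "naive" multi-renderer control point.
# The control URLs and the media URL are placeholders; a real control
# point would discover them via SSDP and the device description.
import urllib.request

AVTRANSPORT = "urn:schemas-upnp-org:service:AVTransport:1"

def soap_call(control_url, action, arguments):
    """Send one SOAP action to an AVTransport control URL."""
    args = "".join(f"<{k}>{v}</{k}>" for k, v in arguments.items())
    body = (
        '<?xml version="1.0"?>'
        '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" '
        's:encodingStyle="http://schemas.xmlsoap.org/soap/encoding/">'
        f'<s:Body><u:{action} xmlns:u="{AVTRANSPORT}">{args}</u:{action}>'
        '</s:Body></s:Envelope>'
    ).encode()
    request = urllib.request.Request(
        control_url,
        data=body,
        headers={
            "Content-Type": 'text/xml; charset="utf-8"',
            "SOAPAction": f'"{AVTRANSPORT}#{action}"',
        },
    )
    return urllib.request.urlopen(request).read()

renderers = [  # placeholder control URLs of two renderers
    "http://192.168.1.10:49152/AVTransport/control",
    "http://192.168.1.11:49152/AVTransport/control",
]
media_url = "http://192.168.1.2:8000/song.mp3"  # placeholder

for url in renderers:
    soap_call(url, "SetAVTransportURI",
              {"InstanceID": 0, "CurrentURI": media_url,
               "CurrentURIMetaData": ""})
for url in renderers:
    # Play returns when the renderer has *accepted* the request, not
    # when audio comes out of the speaker, so the actual starts can
    # differ by tens of milliseconds or more -- audible as an echo.
    soap_call(url, "Play", {"InstanceID": 0, "Speed": "1"})
```

Even on an idle LAN, the second renderer will trail the first by at least the HTTP round trip plus its own buffering, which is exactly the echo effect described above.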
I know that, but by implementing a control point with the above feature, at least it would feel like media is playing on multiple renderers synchronously (at least with renderers of the same type). I am thinking there is no other option for doing this.
Not sure whether resurrecting an old issue is a good idea.
In January I built snapcastc, which plays back audio in sync (better than 1 ms) across multiple devices. Combining this with a DLNA frontend seems like a good idea. Adjusting volume and stopping a client could then be done in a generic way.
This works well over WiFi with many clients, using Opus for transfer. We could provide a single UPnP client that would sink its input into snapcastc, which would distribute and play it. Also, multiple control points could be run, one on each playback device.
I am new to UPnP/DLNA. What do you think?
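For anyone wondering how sub-millisecond sync is even possible over WiFi: the basic idea (in Snapcast and, as far as I understand, snapcastc alike) is that the server timestamps every audio chunk, and each client plays that chunk at timestamp plus a fixed buffer, translated into its local clock via an NTP-style offset measurement. A toy sketch of the client-side scheduling, with made-up names:

```python
# Toy sketch of the scheduling idea only -- NOT snapcastc's actual
# code; names and numbers are made up for illustration. The server
# stamps each chunk with its capture time; every client plays the
# chunk exactly BUFFER_S later (in server time), so all clients emit
# the same samples at the same wall-clock instant.
import time

BUFFER_S = 0.5       # fixed end-to-end buffer, e.g. 500 ms
clock_offset = 0.0   # local_clock - server_clock, measured NTP-style

def play_chunk(chunk, server_capture_time):
    """Sleep until this chunk's scheduled play time, then output it."""
    play_at_local = server_capture_time + BUFFER_S + clock_offset
    delay = play_at_local - time.monotonic()
    if delay < 0:
        return  # chunk arrived too late: drop it rather than drift
    time.sleep(delay)
    # ... hand `chunk` to the sound card here (ALSA in snapcastc) ...
```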
Do you use GStreamer as a backend, or do you have a separate way? Do all devices play from the same HTTP stream, or is there one device that then distributes it? Do you have a link to the code?
Don't the OpenHome extensions provide renderer synchronization support? It would maybe be better to use an existing thing for this rather than rolling our own.
Songcast (OpenHome) and Snapcast seem fairly similar, but @christf might have a better idea about the differences. From all I have heard of OpenHome so far, it is a Linn-proprietary thing with a thrown-over-the-wall specification but no actual open-source implementation to look at, so it might be easier to start with some existing casting protocol such as snapcast and @christf's implementation.
From a gmrender perspective, we should support OpenHome mostly for the rendering-control aspects (including the OpenHome extensions), but we should stay out of music decoding and casting, for which we should use existing libraries so as not to duplicate effort (gmrender currently uses GStreamer for that).
@christf One way I see to get something running quickly is to use --gstout-audiopipe in gmrender-resurrect, which allows writing the data into a pipe and thus should already make it possible to work directly with snapcastc?
For the code, have a look at https://github.com/christf/snapcastc
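Roughly, the plumbing could then look like the sketch below. The FIFO path and the snapcastc-server command line are assumptions on my side (check the snapcastc README for the real options); --gstout-audiopipe is the option mentioned above, and -f sets the advertised friendly name:

```python
# Rough plumbing sketch: gmrender-resurrect writes raw audio into a
# FIFO via --gstout-audiopipe, and a snapcastc server reads from it.
# The FIFO path and the snapcastc-server command line are assumptions
# (check the snapcastc README for the real options).
import os
import subprocess

FIFO = "/tmp/snapfifo"
if not os.path.exists(FIFO):
    os.mkfifo(FIFO)

renderer = subprocess.Popen(
    ["gmrender-resurrect", "-f", "SnapRoom",   # advertised name
     "--gstout-audiopipe", FIFO])              # audio goes to the FIFO
server = subprocess.Popen(
    ["snapcastc-server", "--input", FIFO])     # hypothetical flags

renderer.wait()
server.wait()
```

On the client side, each playback device would then run a snapcastc client pointed at that server.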
And yes, if gmrender outputs to a pipe, then this should already work directly with snapcastc. I will have a closer look.
GStreamer is not used in snapcastc; instead it decodes Opus and plays directly to ALSA.
I am having trouble getting GStreamer to work: it complains about missing plugins for mpeg1 and vorbis, while they actually are installed.
Hmm, do the instructions in https://github.com/hzeller/gmrender-resurrect/blob/master/INSTALL.md miss something? Maybe a sudo ldconfig is missing?
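One quick way to see what GStreamer's registry actually contains is gst-inspect-1.0 vorbisdec and friends; the same check from Python via the PyGObject bindings looks like this (assuming GStreamer 1.x, and the element names are my guess at what mpeg1/vorbis playback would pull in):

```python
# Check whether GStreamer's registry actually sees the decoders it
# claims are missing (assumes GStreamer 1.x with PyGObject installed;
# the element names are my guess at what mpeg1/vorbis playback needs).
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
for name in ("vorbisdec", "mpegaudioparse", "mpg123audiodec"):
    factory = Gst.ElementFactory.find(name)
    print(name, "->", "found" if factory else "MISSING")
```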
For what it's worth there's also shairport-sync if you operate in the Apple-sphere.
@christf Is the original Snapcast implementation that heavy on resources?
Yes, shairport-sync exists, and it is specifically designed for the Apple sphere. I looked at it a while ago; I am not sure which codecs it uses or how good its synchronisation is nowadays.
Regarding the original Snapcast: I tried it and could not get it to work without skips, because of the way the input pipe is read. I attempted a fix in late 2018 but realized it would take a rewrite. That is why snapcastc was born in January 2019. It just works.
For what it's worth, it looks like the AVTransport Service v3 declares some capability for synchronized playback: http://www.upnp.org/specs/av/UPnP-av-AVTransport-v3-Service.pdf
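Skimming that spec: v3 adds actions like SyncPlay, SyncStop and SyncPause, where every renderer is handed the same presentation time against a shared clock and starts on its own when that instant arrives. A hedged sketch of what a SyncPlay invocation could look like, reusing the soap_call() helper from the control-point sketch further up (with the service type bumped to AVTransport:3); the argument names and time formats should be double-checked against the PDF, and note that hardly any renderer in the wild implements service version 3:

```python
# Hedged sketch of an AVTransport:3 synchronized start, reusing the
# soap_call() helper from the control-point sketch further up (with
# the service type bumped to "...AVTransport:3"). Argument names are
# my reading of the v3 spec -- verify against the linked PDF.
for url in renderers:
    soap_call(url, "SyncPlay", {
        "InstanceID": 0,
        "Speed": "1",
        "ReferencePositionUnits": "REL_TIME",
        "ReferencePosition": "00:00:00",
        # every renderer starts at the same shared-clock instant:
        "ReferencePresentationTime": "2019-11-01T12:00:05",  # placeholder format
        "ReferenceClockId": "shared-clock-id",               # placeholder
    })
```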