badaix / snapcast

Synchronous multiroom audio player
GNU General Public License v3.0

WebAudio: Send data from client as TCP #932

Open YeonV opened 2 years ago

YeonV commented 2 years ago

Similar to the broadcast functionality of the Snap.Net app, one could use WebAudio to broadcast from the client to the server:

https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia

JS example of using WebAudio, including input device selection and asking for user permission in the browser: https://github.com/YeonV/wled-manager/blob/v0.0.9/renderer/components/AudioContainer.jsx#L41-L65
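
For illustration, a minimal sketch (not snapcast code) of what the linked example does: enumerate audio inputs, ask for permission, and attach the stream to WebAudio. The device choice and transport are placeholders; note that a browser cannot open a raw TCP socket, so the samples would have to go out over a WebSocket or similar.

```js
// Hedged sketch: capture a local audio input with getUserMedia and WebAudio.
async function captureInput() {
  // List available capture devices (labels appear once permission is granted).
  const devices = await navigator.mediaDevices.enumerateDevices();
  const inputs = devices.filter((d) => d.kind === 'audioinput');

  // Ask for the first input here; a real UI would let the user pick one.
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: { deviceId: inputs[0]?.deviceId },
  });

  // Attach the stream to a WebAudio graph; from here the PCM data could be
  // extracted (e.g. with an AudioWorklet) and forwarded to the server.
  const ctx = new AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  return { ctx, source };
}
```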

badaix commented 2 years ago

I don't understand this. What is the "user story" behind this? What does the user want to do, and how is WebAudio related to this?

BrainP4in commented 2 years ago

Not sure if YeonV meant this, but a user story for me would be: there are multiple devices which run snapclient with speakers attached. Since most sound cards have an AUX-in or mic jack, you could use them to feed in audio from an analog device or an Android phone.

Don't know if it's too much of an edge case for this project, but thanks either way, I love snapcast!

kingosticks commented 2 years ago

Those machines should additionally run snapserver with an ALSA source configured. It's still a mystery where WebAudio might fit into all this.
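
For context, a hedged sketch of what that could look like in `/etc/snapserver.conf`, reading the card's line-in as a stream; the device string and parameter names are assumptions for illustration, so check the snapserver documentation for the exact source URI syntax of your version:

```ini
[stream]
# Hypothetical ALSA capture source: the local AUX-in/mic jack becomes a stream.
source = alsa://?name=LineIn&device=hw:1,0
```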

BrainP4in commented 2 years ago

Oh, I think I misunderstood him. By "could use WebAudio to broadcast from client to server", could he mean that it would be useful to him if snapserver had a "WebAudio source"? No idea if this is useful for anybody, but it sounds like that after reading it a few times. My first thought of a use case would be the option to stream from friends' devices via the browser's Screen Capture API.
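
A hedged sketch of that idea with the Screen Capture API; support for capturing audio via `getDisplayMedia` varies by browser (e.g. Chrome can share tab audio), so treat this as illustrative only:

```js
// Hedged sketch: capture shared screen/tab audio and feed it into WebAudio.
async function captureSharedAudio() {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    video: true, // video must be requested even if only the audio is wanted
    audio: true, // honoured only by some browsers/capture surfaces
  });
  const ctx = new AudioContext();
  return ctx.createMediaStreamSource(stream);
}
```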

nis65 commented 2 years ago

My 50 cts: I run a system with about 6 Raspis. Two of them have a snapserver running; all of them have two snapclients running, one connecting to one snapserver, the other to the other snapserver. Both output simultaneously to the ALSA hardware dmix sink. As I have automated the setup with Ansible, this scales well.
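
For readers unfamiliar with dmix: a hedged sketch of an `asound.conf` that lets two snapclient processes play to the same card at once; the card address and `ipc_key` are placeholders:

```
# Software-mixed default device shared by both snapclients.
pcm.!default {
    type plug
    slave.pcm "mixed"
}

pcm.mixed {
    type dmix
    ipc_key 1024        # any unique integer
    slave {
        pcm "hw:0,0"    # the physical card on this Raspberry Pi
    }
}
```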

So I think snapclients should remain pure sinks and the snapserver a pure source.

The streaming of a remote analog source can be achieved by running a snapserver there and a snapclient on the main snapserver host; the audio stream is passed e.g. via an ALSA loopback device.
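
A hedged sketch of that relay on the main snapserver host; the snapclient option names and the snapserver source URI are from memory and should be checked against `snapclient --help` and the snapserver docs:

```
# 1. Create the virtual loopback card (playback on hw:Loopback,0,* shows up
#    as capture on hw:Loopback,1,*).
modprobe snd-aloop

# 2. snapclient connects to the *remote* snapserver and plays into the loopback.
snapclient --host remote-host.local --soundcard hw:Loopback,0,0

# 3. The local snapserver reads the other end of the loopback as a source,
#    e.g. in /etc/snapserver.conf:
#    [stream]
#    source = alsa://?name=Remote&device=hw:Loopback,1,0
```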

Maybe it would make sense to add snapclient as a source type to snapserver, so that the ALSA loopback device from the example above could be omitted?