Closed andrewvmail closed 8 years ago
It used to be the case that you could use wrtc to receive audio by providing it as the WebRTC implementation, as described in https://github.com/onsip/SIP.js/pull/108. However, wrtc dropped MediaStream support somewhere around https://github.com/js-platform/node-webrtc/commit/992543f249dca33060eef76090c4aaebd9f24985, so you'd have to use an old version of wrtc to make that work.
I'm not aware of any other MediaStream implementations for Node, but you might be able to use something like NW.js or Electron to create a standalone webapp that supports media streaming, if you're interested in that.
I'm closing this for now. https://github.com/feross/webtorrent-hybrid/issues/5#issuecomment-178995442 lists some libraries that may be useful for server-side MediaStream support. Specifically, if electron-webrtc adds MediaStream support, it may be possible to use it to receive (and potentially send?) audio.
Hi guys, just want to say: wonderful library you have here.
That said, I know that Node.js support is pretty experimental, but could someone give me guidance on getting media handling working? Something along the lines of accepting a call, playing some audio, and hanging up.
I was able to get the signaling going by overriding the mediaHandlerFactory with empty getDescription and setDescription functions, but I'm not sure where to start to get media flowing between two UAs running in Node.
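For reference, the stub I'm using looks roughly like this (method names follow the SIP.js MediaHandler interface of that era; the empty SDP is just a placeholder, so no media actually flows — a real implementation would need a WebRTC stack such as wrtc to generate and apply session descriptions):

```javascript
// Minimal no-op media handler factory: lets SIP signaling complete
// in Node without any real WebRTC media. No audio flows.
function stubMediaHandlerFactory(session, options) {
  return {
    isReady: function () { return true; },
    close: function () {},
    // Called when SIP.js needs a local SDP offer/answer.
    // A real implementation would produce SDP from a WebRTC stack.
    getDescription: function (onSuccess, onFailure, mediaHint) {
      onSuccess(''); // placeholder: empty SDP
    },
    // Called with the remote SDP; a real implementation would apply it.
    setDescription: function (sdp, onSuccess, onFailure) {
      onSuccess();
    }
  };
}

// Passed in when creating the UA, e.g.:
// var ua = new SIP.UA({ uri: '...', mediaHandlerFactory: stubMediaHandlerFactory });
```

This is enough to accept and tear down calls, but both sides negotiate empty descriptions, which is exactly where I'm stuck.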
Thanks Andrew