Closed afgenovese closed 7 years ago
That's a good question. Theoretically, MediaStreamAudioSourceNode supports streaming. However, I don't think any browser currently supports streaming with more than 2 channels. If MediaStreamAudioSourceNode supported up to 4 channels, it would work with Omnitone without any issue.
FWIW, this is the existing bug entry for Chromium: https://bugs.chromium.org/p/chromium/issues/detail?id=453876
In short, if your streaming setup requires microphone input from the getUserMedia API, it's not going to be possible anytime soon. However, if you want to relay the video stream from a server (not the web browser), I think it might work through MediaStreamAudioSourceNode.
So essentially we can stream it, but the user doesn't have control over what they see; it has to be directed on the server side? Just confirming.
Let me clarify my opinion:
FOA stream capturing (browser) -> streaming -> FOA decoding/rendering (browser + Omnitone): this is not possible at the moment because the browser uses getUserMedia() to capture the audio, and no browser currently supports more than 2 channels through that API. The API is managed by the WebRTC team, so we have to ask them to enable multi-channel support. (Chrome bug entry)
FOA stream (streaming server) -> streaming -> FOA decoding/rendering (browser + Omnitone): this depends on MediaStreamAudioSourceNode. With the new FOARenderer in 0.2.x, you can create a MediaStreamAudioSourceNode and connect the stream to the FOA renderer directly. I haven't tested this in a real web app, but looking at the code, I think it is possible.
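A minimal sketch of that second path, assuming `remoteStream` is a MediaStream delivered from the server (for example, via a WebRTC peer connection) carrying 4-channel FOA audio, and assuming the 0.2.x FOARenderer API (`Omnitone.createFOARenderer`, `renderer.input`, `renderer.output`) behaves as described above:

```javascript
// Sketch: render a remote 4-channel FOA stream binaurally with Omnitone.
// `remoteStream` (a MediaStream from the server) is an assumption here.
var audioContext = new AudioContext();
var streamSource = audioContext.createMediaStreamSource(remoteStream);

var foaRenderer = Omnitone.createFOARenderer(audioContext);
foaRenderer.initialize().then(function () {
  // Route the 4-channel stream through the renderer to the speakers.
  streamSource.connect(foaRenderer.input);
  foaRenderer.output.connect(audioContext.destination);
});
```

To keep the audio aligned with a 360 video view, you would also update the renderer's rotation matrix from the viewer's camera orientation on each frame (e.g. via the renderer's rotation-matrix setter), so the sound field rotates with the head.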
@gzalles I am not sure what you meant by "user doesn't have control of what they see". If the media is encoded as 360 video + FOA stream, the user can definitely control what they hear/see.
In short, Omnitone is media-type-agnostic. It simply decodes/renders the FOA stream (4-channel) into a binaural audio stream, no matter where the source comes from.
Please re-open the issue if any question/problem remains.
@hoch
I had a related question about your comments above. I can open another issue if you prefer, but:
In your README you have the following example code:
```javascript
// Set up an audio element to feed the ambisonic source audio feed.
var audioElement = document.createElement('audio');
audioElement.src = 'audio-file-foa-acn.wav';
```
Does this allow for direct references to video files as well as audio files? If not, then what would be the proper syntax or setup for a video file? Are there any file extensions limitations from omnitone itself or just browsers/platforms?
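For what it's worth, the Web Audio API's `createMediaElementSource()` accepts any HTMLMediaElement, so a `<video>` element should be usable in place of the `<audio>` element from the README. A hedged sketch (the FOARenderer wiring assumes the 0.2.x API; the file name is hypothetical):

```javascript
// Sketch: use a video element's audio track as the FOA source.
var audioContext = new AudioContext();
var videoElement = document.createElement('video');
videoElement.src = '360-video-with-foa-audio.mp4';  // hypothetical file

// createMediaElementSource works for both <audio> and <video> elements.
var elementSource = audioContext.createMediaElementSource(videoElement);
var foaRenderer = Omnitone.createFOARenderer(audioContext);
foaRenderer.initialize().then(function () {
  elementSource.connect(foaRenderer.input);
  foaRenderer.output.connect(audioContext.destination);
  videoElement.play();
});
```

Container/codec support (e.g. whether a 4-channel audio track in MP4 or WebM decodes correctly) is determined by the browser and platform, not by Omnitone itself.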
I want to pair 360 video with ambisonics and stream it live on the internet (delay is acceptable for stitching and pairing). Can Omnitone be a solution for this? What are the possibilities for live VR reproduction?