GoogleChrome / omnitone

Spatial Audio Rendering on the web.
https://googlechrome.github.io/omnitone
Apache License 2.0

Can we use omnitone for live streaming? #23

Closed: afgenovese closed this issue 7 years ago

afgenovese commented 8 years ago

I want to pair 360 video with ambisonics and stream it live on the internet (delay is acceptable for stitching and pairing). Can Omnitone be a solution for this? What are the possibilities for live VR reproduction?

hoch commented 8 years ago

That's a good question. Theoretically, MediaStreamAudioSourceNode supports streaming. However, I don't think any browser currently supports streams with more than 2 channels. If MediaStreamAudioSourceNode can deliver up to 4 channels, it will work with Omnitone without any issue.

hoch commented 8 years ago

FWIW, this is the existing bug entry for Chromium: https://bugs.chromium.org/p/chromium/issues/detail?id=453876

In short, if your streaming setup requires microphone input from the getUserMedia API, it's not going to be possible anytime soon. However, if you want to relay the video stream from a server (not the web browser), I think it might work through MediaStreamAudioSourceNode.
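For reference, this is roughly what the capture side would need to request. It's an untested sketch: the channelCount constraint is in the spec, but browsers currently clamp capture to 2 channels, and getSettings() may not report channelCount everywhere.

```javascript
// Untested sketch: ask getUserMedia for a 4-channel (FOA) capture.
// Today the browser will clamp this to 2 channels, which is why the
// capture-in-browser path doesn't work yet.
navigator.mediaDevices.getUserMedia({
  audio: { channelCount: { ideal: 4 } },
  video: false
}).then(function(stream) {
  var track = stream.getAudioTracks()[0];
  // getSettings() may not report channelCount in every browser.
  console.log('Captured channels:', track.getSettings().channelCount);
});
```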

gzalles commented 7 years ago

So essentially we can stream it, but the user doesn't have control of what they see; it has to be directed on the server side? Just confirming.

hoch commented 7 years ago

Let me clarify my opinion:

  1. FOA stream capturing (browser) -> streaming -> FOA decoding/rendering (browser + Omnitone): this is not possible at the moment because the browser uses getUserMedia() to capture the audio, and no browser currently supports capturing more than 2 channels. This API is managed by the WebRTC team, so we have to ask them to enable multichannel support. (See the Chrome bug entry above.)

  2. FOA stream (streaming server) -> streaming -> FOA decoding/rendering (browser + Omnitone): this depends on MediaStreamAudioSourceNode. With the new FOARenderer in 0.2.x, you can create a MediaStreamAudioSourceNode and connect the stream into the FOA renderer directly. I haven't tested this in a real web app, but looking at the code I think it is possible; see the rough sketch below.
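A rough, untested sketch of that wiring, assuming the 0.2.x FOARenderer API (`Omnitone.createFOARenderer`, `.input`/`.output`) and an incoming MediaStream obtained elsewhere (e.g. from an RTCPeerConnection) that carries all 4 FOA channels end-to-end:

```javascript
// Untested sketch: render a server-relayed FOA MediaStream binaurally.
// |incomingStream| is assumed to come from elsewhere, e.g. an
// RTCPeerConnection 'ontrack' handler or a streaming library.
var audioContext = new AudioContext();
var foaRenderer = Omnitone.createFOARenderer(audioContext);

foaRenderer.initialize().then(function() {
  var streamSource = audioContext.createMediaStreamSource(incomingStream);
  streamSource.connect(foaRenderer.input);
  foaRenderer.output.connect(audioContext.destination);
});
```

Whether this produces correct FOA rendering still depends on the browser delivering all 4 channels through MediaStreamAudioSourceNode, which is the open question above.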

@gzalles I am not sure what you meant by "user doesn't have control of what they see". If the media is encoded as 360 video plus an FOA stream, the user can definitely control what they hear/see.

hoch commented 7 years ago

In short, Omnitone is media-type-agnostic. It simply decodes/renders the 4-channel FOA stream to a binaural audio stream, no matter where the source comes from.

hoch commented 7 years ago

Please re-open the issue if any question/problem remains.

topherbuckley commented 7 years ago

@hoch

I have a question related to your comments above. I can open another issue if you prefer, but:

In your README you have the following example code:

```javascript
// Set up an audio element to feed the ambisonic source audio feed.
var audioElement = document.createElement('audio');
audioElement.src = 'audio-file-foa-acn.wav';
```

Does this allow direct references to video files as well as audio files? If not, what would be the proper syntax or setup for a video file? Are there any file extension limitations imposed by Omnitone itself, or only by browsers/platforms?
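For context, this is roughly what I would try for video, assuming createMediaElementSource treats a `<video>` element the same way as an `<audio>` element and that the browser decodes all 4 channels of the video's audio track (the file name below is just a placeholder):

```javascript
// Untested sketch: feed a 360 video's 4-channel FOA audio track
// through Omnitone's FOARenderer (0.2.x API assumed).
var videoElement = document.createElement('video');
videoElement.src = 'video-with-foa-acn-audio.mp4';  // placeholder file name

var audioContext = new AudioContext();
var foaRenderer = Omnitone.createFOARenderer(audioContext);

foaRenderer.initialize().then(function() {
  // createMediaElementSource accepts any HTMLMediaElement, video included.
  var videoSource = audioContext.createMediaElementSource(videoElement);
  videoSource.connect(foaRenderer.input);
  foaRenderer.output.connect(audioContext.destination);
  videoElement.play();
});
```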