karasusan opened 3 years ago
This is the first PR https://github.com/Unity-Technologies/com.unity.webrtc/pull/478
Set up a CI process for the WebGL platform on the internal CI platform (@karasusan)
I tried this last week; however, I found it needs more time than I expected. So for now we test the package for the WebGL platform manually.
memo: WRS-158
The pull request (#554) above was merged into this feature branch in our repository. https://github.com/Unity-Technologies/com.unity.webrtc/tree/experimental/webgl-platform
Please feel free to try a WebGL build using this branch, and post issues in this thread if you find any.
The issue below was posted on the Unity forum. https://forum.unity.com/threads/webgl-platform-support.1147331/#post-7884949
Is it possible to add support for the microphone in the browser too? I have seen the MediaStreamAddUserMedia function, but I don't see any sample for it.
@Thaina
I guess the MediaStreamAddUserMedia API does not support the microphone.
Do you have any ideas? @nvanofwegen
As far as I can see, MediaStreamAddUserMedia calls navigator.mediaDevices.getUserMedia, which seems to be exactly about the microphone, so I hoped it would work and have started trying it now. But because there was no sample, I'm still trying to figure out how to start with it.
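For reference, the browser call in question takes a constraints object. A minimal sketch of audio-only constraints follows; note that the getUserMedia call itself only runs in a browser, and `describeConstraints` is just a hypothetical helper added here for illustration:

```javascript
// Constraints for opening only the microphone (no camera).
const constraints = { audio: true, video: false };

// Hypothetical helper: list which device kinds the constraints request.
function describeConstraints(c) {
  return Object.keys(c).filter(function (k) { return c[k]; }).join(',') || 'none';
}

console.log(describeConstraints(constraints)); // → "audio"

// In a browser you would then call (not runnable outside a browser):
// navigator.mediaDevices.getUserMedia(constraints)
//   .then(function (stream) { console.log(stream.getAudioTracks().length); });
```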
I have played with the API and it seems I could open the microphone and register the stream and track. Only these functions were not implemented.
There is a WebGL scene in the Samples, which uses the UserMedia API of the browser. https://github.com/Unity-Technologies/com.unity.webrtc/blob/a39e89c2b18f37c093977ce3106f1305fd03992a/Samples~/WebGL/WebGLSample.cs#L104-L106
This basically creates a VideoStream in Unity and attaches the MediaStream received from the UserMedia API on the JavaScript side. This stream is then sent to the receiver.
Please note that this has little to nothing to do with the Audio implementation recently added in 2.4.0-exp.5:
The methods you quoted (context register) were added to 'pass' the tests, but are not implemented.
Besides those limitations, you should be able to send your webcam/microphone using this package. It's just plain JavaScript functionality and cannot be used with Unity audio components.
Hope this helps :)
I have been using these constraints to get only the microphone:
var receiveStream = new MediaStream();
receiveStream.AddUserMedia(new() { audio = true, video = false });
And this could open the microphone for me. I have now cloned this repo and added the following (and only the following) code to the experimental/webgl-platform branch:
// Runtime\Plugins\WebGL\Context.jslib:213
ContextRegisterAudioReceiveCallback: function (contextPtr, trackPtr, AudioTrackOnReceive) {
    if (!uwcom_existsCheck(contextPtr, 'ContextRegisterAudioReceiveCallback', 'context')) return;
    if (!uwcom_existsCheck(trackPtr, 'ContextRegisterAudioReceiveCallback', 'track')) return;
    var ctx = UWManaged[contextPtr];
    /** @type {MediaStreamTrack} */
    var audioStreamTrack = UWManaged[trackPtr];
    // Route the received track through an AnalyserNode so its frequency data can be read.
    const audioContext = new AudioContext();
    const microphone = audioContext.createMediaStreamSource(new window.MediaStream([audioStreamTrack]));
    const analyserNode = audioContext.createAnalyser();
    microphone.connect(analyserNode);
    var bufferLength = analyserNode.frequencyBinCount;
    var dataArray = new Float32Array(bufferLength);
    function live() {
        requestAnimationFrame(live);
        // getFloatFrequencyData fills dataArray with values in decibels.
        analyserNode.getFloatFrequencyData(dataArray);
        var settings = audioStreamTrack.getSettings();
        var ptr = uwcom_arrayToReturnPtr(dataArray, Float32Array);
        Module.dynCall_viiiiii(AudioTrackOnReceive, trackPtr, ptr + 4, bufferLength, settings.sampleRate, settings.channelCount, settings.sampleSize);
    }
    live();
},
// Runtime\Scripts\AudioStreamTrack.cs:257
[AOT.MonoPInvokeCallback(typeof(DelegateAudioReceive))]
static void OnAudioReceive(
    IntPtr ptrTrack, float[] audioData, int size, int sampleRate, int numOfChannels, int numOfFrames)
{
    WebRTC.Sync(ptrTrack, () =>
    {
        if (WebRTC.Table[ptrTrack] is AudioStreamTrack track)
        {
            // Convert each decibel value from the AnalyserNode to a linear 0..1 amplitude.
            foreach (var i in Enumerable.Range(0, audioData.Length))
                audioData[i] = Mathf.Clamp01(Mathf.Pow(10, audioData[i] / 20));
            track.OnAudioReceivedInternal(audioData, sampleRate, numOfChannels, numOfFrames);
        }
    });
}
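The per-sample conversion in the C# callback above can be checked in isolation: getFloatFrequencyData reports values in decibels, and 10^(dB/20) maps a decibel value back to a linear amplitude, which the code then clamps to [0, 1]. A minimal sketch of the same formula in plain JavaScript (`dbToLinear` is a hypothetical name, not part of the package):

```javascript
// Same math as Mathf.Clamp01(Mathf.Pow(10, db / 20)) in the C# loop above.
function dbToLinear(db) {
  const linear = Math.pow(10, db / 20);    // decibels -> linear amplitude
  return Math.min(1, Math.max(0, linear)); // clamp to [0, 1]
}

console.log(dbToLinear(0));         // 0 dB is full scale: 1
console.log(dbToLinear(-20));       // -20 dB is about 0.1
console.log(dbToLinear(-Infinity)); // silence: 0
```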
I can now send audioData, an array of floats in decibels, from the microphone into Unity. However, it still cannot create the AudioClip properly, and it eventually throws the exception below and stops the loop.
Was this feature deprecated?
@Thaina Sorry to keep you waiting, but we haven't started on it yet.
@karasusan Is anything blocking this? I would like some sense of the prospects and vision for this feature, so maybe I could start making a pull request.
@Thaina This branch is old and doesn't follow the latest updates, but you can check the idea. https://github.com/Unity-Technologies/com.unity.webrtc/tree/experimental/webgl-platform
I have already checked that branch, but it still stays in the same place.
Is there anything else I need to consider before starting to implement this?
Hi!
I'm using the new experimental branch (https://github.com/Unity-Technologies/com.unity.webrtc/commits/thaina/experimental/webgl-platform) and building works, except I'm getting the following error when launching the game:
Is this a bug or is there any config step I'm missing?
No, I am very sorry, but that branch doesn't really work yet. I just forked it to try to implement this but never finished; I only completed the migration to a newer version.
Oh, okay. Thanks for letting me know 🫡
Precondition
WebGL support has been advanced by external developers in the Unity community. Many thanks to @nvanofwegen and @gtk2k. https://github.com/The-Barn-Games/com.unity.webrtc/tree/feature/webgl-support
WebGL support is a long journey, and we have many issues with merging it into the main branch. So I made a feature branch that receives contributions from developers. https://github.com/Unity-Technologies/com.unity.webrtc/tree/experimental/webgl-platform
TODO