Open trdroid opened 4 months ago

Hi,
I rendered a Rive file with sound effects in Unity and noticed that the sound effects are not played. I was wondering if this would be supported in the future.
Thank you.
This should be supported already. Could you share an example of the script you're using to display the Rive file?
Hi @damzobridge,
Here are the steps I've followed:
Thanks for sending over the project. It looks like the example projects are using an older version of the RiveScreen script. You'll need to update your RiveScreen script to include the audio initialization code:
// Audio engine that the Rive artboard mixes its embedded sounds into.
private Rive.AudioEngine m_audioEngine;

// Unity calls this on the audio thread; hand the buffer to Rive so its audio is mixed in.
void OnAudioFilterRead(float[] data, int channels)
{
    if (m_audioEngine == null)
    {
        return;
    }
    m_audioEngine.Sum(data, channels);
}

private void Awake()
{
    if (asset != null)
    {
        m_file = Rive.File.Load(asset);
        m_artboard = m_file.Artboard(0);
        m_stateMachine = m_artboard?.StateMachine();

        // Map Unity's speaker mode to the channel count the Rive audio engine expects.
        int channelCount = 1;
        switch (AudioSettings.speakerMode)
        {
            case AudioSpeakerMode.Mono:
                channelCount = 1;
                break;
            case AudioSpeakerMode.Stereo:
                channelCount = 2;
                break;
            case AudioSpeakerMode.Quad:
                channelCount = 4;
                break;
            case AudioSpeakerMode.Surround:
                channelCount = 5;
                break;
            case AudioSpeakerMode.Mode5point1:
                channelCount = 6;
                break;
            case AudioSpeakerMode.Mode7point1:
                channelCount = 8;
                break;
            case AudioSpeakerMode.Prologic:
                channelCount = 2;
                break;
        }

        m_audioEngine = Rive.AudioEngine.Make(channelCount, AudioSettings.outputSampleRate);
        m_artboard.SetAudioEngine(m_audioEngine);
    }

    Camera camera = gameObject.GetComponent<Camera>();
    Assert.IsNotNull(camera, "RiveScreen must be attached to a camera.");

    // Make a RenderQueue that doesn't have a backing texture and does not
    // clear the target (we'll be drawing on top of it).
    m_renderQueue = new Rive.RenderQueue(null, false);
    m_riveRenderer = m_renderQueue.Renderer();
    m_commandBuffer = m_riveRenderer.ToCommandBuffer();
    if (!Rive.RenderQueue.supportsDrawingToScreen())
    {
        m_helper = new CameraTextureHelper(camera, m_renderQueue);
        m_commandBuffer.SetRenderTarget(m_helper.renderTexture);
    }
    camera.AddCommandBuffer(cameraEvent, m_commandBuffer);

    DrawRive(m_renderQueue);
}
We'll update the example projects soon to include audio support.
I've updated RiveScreen.cs with the aforementioned changes, and I can now hear the audio. Thank you @damzobridge
Hi @damzobridge, with the aforementioned update I can hear the audio when playing in the Unity Editor, but when I build the same project for WebGL, the audio in the Rive file is not played. I even tried clicking in the browser once, but that did not play the audio either.
Thanks, you're right: audio currently works on the other platforms, but not on WebGL. We're looking into the issue now.
In the meantime, I recommend using Rive Events; that way you can trigger audio natively in Unity in response.
@trdroid This is happening because Unity doesn't support OnAudioFilterRead on WebGL, which we rely on at the moment. We're looking into ways around this. As mentioned, using Rive Events would let you integrate your audio more natively using AudioClips and any other audio-related plugins you might already be using.
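For anyone who wants to try that route before WebGL audio is fixed, here is a minimal sketch of the event-based approach, assuming you keep the RiveScreen setup above (which already holds m_stateMachine): poll the state machine's reported events each frame and play an ordinary Unity AudioClip in response, so playback goes through Unity's own audio pipeline instead of OnAudioFilterRead. The fields m_audioSource and m_footstepClip and the event name "Footstep" are placeholders for illustration, and ReportedEvents() should be checked against the Rive package version you're using.

// Placeholder references for illustration; assign these in the Inspector.
[SerializeField] private AudioSource m_audioSource;
[SerializeField] private AudioClip m_footstepClip;

private void Update()
{
    if (m_stateMachine == null)
    {
        return;
    }

    // Read the events the state machine reported since it was last advanced,
    // and react to them with regular Unity audio (this also works on WebGL).
    foreach (var reportedEvent in m_stateMachine.ReportedEvents())
    {
        if (reportedEvent.Name == "Footstep")
        {
            m_audioSource.PlayOneShot(m_footstepClip);
        }
    }

    m_stateMachine.Advance(Time.deltaTime);
}

If your RiveScreen script already advances the state machine in Update, add the foreach loop next to that existing Advance call instead of adding a second Update method.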
Thank you @damzobridge for explaining why it's currently broken on WebGL. As you mentioned, I'll use Rive Events and Unity audio to get around this.