JSmithOner closed this issue 1 year ago
Why are you 1) taking the audio data out of the AudioWorkletProcessor and then 2) playing it through an AudioBufferSourceNode?
You can just play the audio with the AudioWorkletNode itself. Am I missing something?
Sorry, the implementation is incomplete and it's been a while since I've tested this. My current goal is to send audio from an AudioWorkletNode on one client to an AudioWorkletNode on another client through WebRTC. At this point I was only able to emulate this behaviour with a lot of latency (the signal is uncompressed). Thanks.
We are unsure what the question is. I am closing this, but please feel free to reopen if you have a question relevant to the title of the issue.
I have this code which basically retrieves some samples from inside an AudioWorklet in the shape of a
Float32Array
. I use postMessage to send the samples out of the AudioWorklet and trigger playback. If I set the buffer to 32 render quanta (basically 32 * 128 samples), the sound is mostly OK (a few glitches) but the latency is enormous. If you had any ideas how I could solve this, or could eventually add an example to this repo, that would be great.
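For context, the buffering pattern described above can be sketched without the browser-only Web Audio API. This is a hypothetical illustration (the `QuantumAccumulator` class and `latencyMs` helper are my own names, not from the original code): each `process()` call in an AudioWorkletProcessor hands you a 128-frame render quantum, and collecting N quanta into one buffer before calling `port.postMessage` adds N * 128 / sampleRate seconds of latency on top of everything else in the chain.

```javascript
// Web Audio render quantum size (fixed at 128 frames per process() call).
const QUANTUM = 128;

// Latency (in ms) added by accumulating `numQuanta` quanta before posting.
function latencyMs(numQuanta, sampleRate) {
  return (numQuanta * QUANTUM / sampleRate) * 1000;
}

// Hypothetical accumulator: collects 128-frame quanta and flushes a single
// Float32Array once `numQuanta` have arrived. Inside a real worklet,
// `onFlush` would be something like (buf) => this.port.postMessage(buf).
class QuantumAccumulator {
  constructor(numQuanta, onFlush) {
    this.capacity = numQuanta * QUANTUM;
    this.buffer = new Float32Array(this.capacity);
    this.offset = 0;
    this.onFlush = onFlush;
  }
  // `quantum` is a Float32Array of length 128, e.g. inputs[0][0] in process().
  push(quantum) {
    this.buffer.set(quantum, this.offset);
    this.offset += quantum.length;
    if (this.offset >= this.capacity) {
      this.onFlush(this.buffer.slice()); // copy out, then reuse the buffer
      this.offset = 0;
    }
  }
}
```

With 32 quanta at 48 kHz, `latencyMs(32, 48000)` is already about 85 ms before any network or playback delay, which is consistent with the "enormous" latency reported; shrinking the batch (or moving to a SharedArrayBuffer ring buffer instead of postMessage) trades that latency against the glitches mentioned.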