Closed: mtmckenna closed this issue 7 years ago.
Wasn't it possible to use jsfx.Sound directly? Also, since it's a plain PCM buffer, you should be able to use the float PCM directly, e.g.
```js
var processor = new jsfx.Processor(params);
var block = new Float32Array(processor.getSamplesLeft());
processor.generate(block);
source.buffer = block;
```
I think something like this would make more sense:
```js
jsfx.SoundData = function(params, modules){
  var processor = new Processor(params, modules);
  var block = new Float32Array(processor.getSamplesLeft());
  processor.generate(block);
  return block;
};
```
So the above usage could be simplified to:

```js
source.buffer = jsfx.SoundData(params);
```
Although, naming it jsfx.SoundBuffer would make it more consistent with AudioContext.
BTW, you can already use AudioContext directly:

```js
var node = jsfx.Node(context, params, jsfx.DefaultModules, 2048);
node.connect(context.destination);
```
Hello,
I did try to use jsfx.Sound directly with createMediaElementSource, but I found it harder to use than createBufferSource because I still had (I think) to play and pause using the <audio> element's play and pause functions. I'm not 100% positive, but I think that means that even though I was using the Web Audio API, I was still bogged down by the limitations of the <audio> tag on mobile.
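For illustration, a rough sketch of what I was doing (variable names are mine, and I may be misremembering details):

```js
// The <audio> element still drives playback with createMediaElementSource,
// so the element's mobile restrictions still apply.
var context = new AudioContext();
var audioEl = document.querySelector('audio');
var mediaSource = context.createMediaElementSource(audioEl);
mediaSource.connect(context.destination);
audioEl.play(); // still gated by the <audio> element's rules on mobile
```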
Thank you for your SoundData solution--it definitely looks cleaner. I tried it in my app and received this error: Uncaught TypeError: Failed to set the 'buffer' property on 'AudioBufferSourceNode': The provided value is not of type 'AudioBuffer'.
I'll try to create a little demo app that reproduces the problem this weekend and get back to you. I'd love SoundData to be something I can use!
Thank you for your help!
McKenna
Thanks for your help--I was able to use jsfx.Node directly with AudioContext, so that was good. The problem with it, though, is that ScriptProcessorNode doesn't have start() and stop() methods, so I wasn't able to find a way to play back the generated effect without creating a new Node.
My use case is that I'm building a game that uses sound effects generated by jsfx. I'd like to generate the sound effects once when the app loads and be able to replay them on demand. I could use the jsfx.Node way of doing it if I regenerated the Node each time the effect needs to be played, but that seems inefficient.
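For illustration, here's a sketch of the pattern I'm after (the placeholder buffer stands in for the generated effect; the rest is standard Web Audio API):

```js
var context = new AudioContext();
// Placeholder: 1 channel, 1 second of silence; in practice this would
// hold the jsfx-generated effect, created once at load time.
var effectBuffer = context.createBuffer(1, context.sampleRate, context.sampleRate);

function playEffect() {
  var source = context.createBufferSource(); // source nodes are one-shot
  source.buffer = effectBuffer;              // the AudioBuffer is reused
  source.connect(context.destination);
  source.start();
}
```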
Also, I tried adding a SoundBuffer method to jsfx as you described. However, I wasn't able to get that to work either. I created a branch on my fork of jsfx that adds a "Play w/ Sound Buffer" button to index.html that shows the error, in case you're interested in looking at that.
Let me know if I can provide any other info. If it doesn't make sense to you to add a Wave type method to jsfx, that's okay--I can keep using my fork.
Thank you!
McKenna
I'm guessing that instead of a Float32Array it would need to use an AudioBuffer.
If we can't get the SoundBuffer version working, then I'm fine with Wave.
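For illustration, wrapping the generated float PCM might look something like this (toAudioBuffer is a hypothetical helper, not part of jsfx):

```js
function toAudioBuffer(context, samples) {
  // Copy the float PCM into channel 0 of a real AudioBuffer.
  // Assumes the samples were generated at the context's sample rate.
  var buffer = context.createBuffer(1, samples.length, context.sampleRate);
  buffer.getChannelData(0).set(samples);
  return buffer;
}
// e.g. source.buffer = toAudioBuffer(context, jsfx.SoundData(params));
```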
Hi, I've finally had some time to look at this.
This should work now:
```js
var source = context.createBufferSource();
source.buffer = jsfx.AudioBuffer(context, params);
source.connect(context.destination);
source.start();
```
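One note on reuse, building on the snippet above (the one-shot behavior is standard AudioBufferSourceNode behavior, not jsfx-specific): the buffer can be generated once and replayed by creating a new source node each time.

```js
var effect = jsfx.AudioBuffer(context, params); // generate once at load

function replay() {
  var source = context.createBufferSource(); // source nodes are one-shot
  source.buffer = effect;                    // reuse the generated buffer
  source.connect(context.destination);
  source.start();
}
```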
Thank you! I was just able to try this out today, and it worked perfectly. Thank you for taking the time to add this feature.
Would you also be willing to update the package on NPM?
Thank you!
Hello! I'd like to be able to get the raw wave data from jsfx so I can pass it through an audio context to play later. Something like this:
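(rough sketch; jsfx.Wave is just an illustrative name, and I'm assuming the raw data comes back as a Uint8Array holding a complete WAV file)

```js
var context = new AudioContext();
var wave = jsfx.Wave(params); // hypothetical: raw wave data as a Uint8Array
// Decode the WAV bytes into an AudioBuffer for later playback.
context.decodeAudioData(wave.buffer, function (decoded) {
  var source = context.createBufferSource();
  source.buffer = decoded;
  source.connect(context.destination);
  source.start();
});
```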
You already had the ability to return a Uint8Array, so in this commit, I exposed that function. I also moved some common logic into its own function. Do you think exposing this method makes sense?
Thank you!
McKenna