Terranic opened this issue 4 months ago
Flutter Web is poorly supported by Flutter Sound. I am currently working on a new project which will do streaming, but not for today; there is much work to do. If you are looking for streaming on web, you can have a look at the W3C Web Audio API. If you just want to translate int16 to float32, or the reverse, you can loop over your buffer. The algorithm is trivial.
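For what it's worth, here is a minimal sketch of such a conversion loop in Dart (the scaling factor and the clamping are my choice of convention, not something Flutter Sound prescribes):

```dart
import 'dart:typed_data';

/// Convert signed 16-bit PCM samples to 32-bit float samples in [-1.0, 1.0].
Float32List int16ToFloat32(Int16List input) {
  final output = Float32List(input.length);
  for (var i = 0; i < input.length; i++) {
    output[i] = input[i] / 32768.0; // int16 range is -32768..32767
  }
  return output;
}

/// Convert 32-bit float samples back to signed 16-bit PCM, clamping overshoot.
Int16List float32ToInt16(Float32List input) {
  final output = Int16List(input.length);
  for (var i = 0; i < input.length; i++) {
    var sample = (input[i] * 32768.0).round();
    if (sample > 32767) sample = 32767;
    if (sample < -32768) sample = -32768;
    output[i] = sample;
  }
  return output;
}
```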
I have the same issue. I need to stream audio in Flutter Web (to use with the ElevenLabs WebSocket API). flutter_sound works fine on Android and iOS but does not work on Web. I'm very interested in the library you are working on.
I decided to do something to support audio streams on Flutter Web. This will be a new feature, so let me work on that for several days (or weeks?). I will post here how things are going.
btw: do you need PCM int16 or float32?
funny.... same usecase here 👍
Great!!! Actually I would think int16 - but maybe make it configurable if possible.
I am currently working on this request. I think it will not be too difficult to implement. Be patient.
Would be nice to support mp3 and PCM formats with multiple bit rates.
I promised to keep you informed of this request's development: things are going well. I will finish a beta version in a few days. I am really impressed by the very short latency on web. I have yet to figure out how to improve the Flutter Sound API with new features (for example PCM-Float32) while staying compatible with the current API on iOS and Android.
great job! Any beta versions are highly appreciated for testing (if you want to)
I released today a new beta version 9.8.1. This version supports Record to Stream on web. Please note the following points:

- `numChannels:` can be specified. If you do not specify this parameter, the default is 1.
- The `codec:` parameter can be specified. Its value can be either `Codec.pcm16` or `Codec.pcmFloat32`.
- `sampleRate` is deprecated. Flutter Web always uses the sample rate supported by the hardware. We cannot change it, because we would have to resample for that, and that would be bad for the sound quality. Use `myRecorder.getSampleRate()` to get the sample rate used by the platform.
- In `startRecorder()` you can choose one of three parameters:
  - `toStream:`, which is compatible with the old `toStream:` parameter. You receive a Stream of `Uint8List` packets. This parameter is not supported for `numChannels: != 1`. It is more or less deprecated, and is kept for compatibility with previous versions.
  - `toStreamFloat32:` can specify a `StreamSink<List<Float32List>>`. Each packet received by this stream sink is composed of several `Float32List`, one per channel. This parameter is the most efficient, especially on Web and on iOS. This is probably the parameter that the app should use.
  - `toStreamInt16:` can specify a `StreamSink<List<Int16List>>`. Each packet received by this stream sink is composed of several `Int16List`, one per channel. This parameter is slightly less efficient than `toStreamFloat32:`.
- `bufferSize:` must be a power of 2. Its default is 8192.
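To make the parameter list concrete, here is a minimal sketch of recording to a Float32 stream. The parameter names come from the notes above; the surrounding recorder setup (`FlutterSoundRecorder()`, `openRecorder()`) is my assumption about the usual Flutter Sound boilerplate, so treat this as a sketch rather than a verified example:

```dart
import 'dart:async';
import 'dart:typed_data';
import 'package:flutter_sound/flutter_sound.dart';

Future<void> recordToFloat32Stream() async {
  final recorder = FlutterSoundRecorder();
  await recorder.openRecorder();

  // One List<Float32List> per packet, one Float32List per channel.
  final controller = StreamController<List<Float32List>>();
  controller.stream.listen((packet) {
    // packet[0] is the mono channel (numChannels defaults to 1).
    print('received ${packet[0].length} float32 samples');
  });

  // The sample rate is chosen by the browser; query it instead of setting it.
  final sampleRate = await recorder.getSampleRate();
  print('hardware sample rate: $sampleRate');

  await recorder.startRecorder(
    codec: Codec.pcmFloat32,
    numChannels: 1,
    bufferSize: 8192, // must be a power of 2
    toStreamFloat32: controller.sink,
  );
}
```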
Could you also offer webm/opus support, which is a standard codec in Chrome/Edge? PCM creates too much traffic.
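To put rough numbers on that traffic concern (illustrative figures, assuming a 48 kHz mono capture; the actual hardware sample rate varies):

```
raw PCM16 : 48,000 samples/s × 2 bytes × 1 channel ≈ 96 kB/s ≈ 768 kbit/s
Opus      : typically 16-64 kbit/s for speech, i.e. roughly 10-50x less traffic
```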
Yes, webm/opus is a truly important codec on web.
This codec is available with Record to Buffer and Record to File.
Currently, Record to Stream works only with raw PCM (PCM-INT16 and PCM-FLOAT32).
I agree that it would be great if we could support live encoding for other codecs. I do not know if it is even possible: compressing audio needs a lot of CPU, and I am not sure that we can do that on live audio.
I keep this issue open, because it is interesting.
Actually, the Mic -> Stream encoding works great with the MediaRecorder browser API, e.g. this React code:

```js
// `stream` is a MediaStream, e.g. from navigator.mediaDevices.getUserMedia()
mediaRecorder.current = new MediaRecorder(stream, { mimeType: 'audio/webm', audioBitsPerSecond: 16000 });
mediaRecorder.current.ondataavailable = (event) => ....
```

Not quite sure if you use it.
This feature would be very interesting on Flutter Sound web. I will look at that in a few days. One of the problems would be compatibility with Android and iOS. Flutter Sound users would probably be happy to compress live audio data, for example with an mp3 codec …
I will post something on this thread in a few days.
Just a note: I think that opus-webm is not supported on Safari. But I am not completely sure.
@Terranic: actually we use the Web Audio API and not MediaRecorder. I have just looked at MediaRecorder and this API seems very powerful. I suggest using it on Flutter Sound Web, and working on iOS and Android later.
If you look at the Mozilla doc you can see that the app can call requestData() or can specify a timeslice value. I am actually not sure if it is important to be able to do that in Flutter Sound, but I guess it is ...
What do you think?
I ran the code from record_to_stream_example.
However, recordingDataControl.stream.listen did not trigger a callback and I did not obtain any buffer.
@Larpoux
Flutter Sound Version: 9.8.1, OS: Android 13
I am actually working on recording to stream. I am sorry if there are some regressions. Everything will be more stable in a few days.
btw : what is your platform? iOS, Android or web ?
Thanks, I am looking forward to the upcoming version
Some news about the dev: things are going pretty well. On Flutter Web, when the app wants to record to stream, it will be able to choose between raw PCM (Int16 or Float32) and compressed codecs (Opus/WebM, AAC/MP4).
I hope to deliver a new Flutter Sound version on Tuesday. A little bit of patience…
Flutter Sound 9.9.0 is released.

- New support for `Codec.opusWebM` and `Codec.aacMP4` (#1056).
- A new `MediaRecorderExample` to show how to record to stream on Web.
- The `codec:` parameter can be specified. Its value can be either:
  - `Codec.pcm16`
  - `Codec.pcmFloat32`
  - `Codec.opusWebM`
  - `Codec.aacMP4`
- The `toStream:` parameter must be specified. You receive a Stream of `Uint8List` packets on your `StreamSink`.
- `requestData()` can be called to update the `StreamSink` with the current `Uint8List`.
- `timeSlice:` can be specified. This is the Duration between every automatic `requestData()`. If `Duration.zero`, the automatic `requestData()` is deactivated. Its default value is `Duration.zero`.

Of course, tell me if there are any problems. The next step will be to support the Player on Web from a live Stream. I will do it if someone requests this feature. I do not expect to support Record to MP4 stream on mobiles in Flutter Sound 9.x (Flutter Sound 10.0 will probably be next year). Same for playback from a live compressed stream on mobiles. Of course this planning can be adjusted if someone desperately needs those features.
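As an illustration, here is a minimal sketch of recording a compressed stream with the parameters above. The parameter names (`codec:`, `toStream:`, `timeSlice:`) are taken from this announcement; the recorder boilerplate around them is my assumption, so treat this as a sketch rather than a verified example:

```dart
import 'dart:async';
import 'dart:typed_data';
import 'package:flutter_sound/flutter_sound.dart';

Future<void> recordOpusWebMToStream() async {
  final recorder = FlutterSoundRecorder();
  await recorder.openRecorder();

  // Each event on this sink is a Uint8List chunk of the compressed opus/webm data.
  final controller = StreamController<Uint8List>();
  controller.stream.listen((chunk) {
    // e.g. forward the chunk over a WebSocket
    print('got ${chunk.length} compressed bytes');
  });

  await recorder.startRecorder(
    codec: Codec.opusWebM,
    toStream: controller.sink,
    // Deliver a chunk every 500 ms instead of calling requestData() manually.
    timeSlice: const Duration(milliseconds: 500),
  );
}
```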
Great job! I'm still very interested in support for the Player on Web from a live Stream, to use it with the ElevenLabs socket API.
OK @fvisticot. I am going to see what I can do to support PCM-FLOAT32 and PCM-INT16 as a live stream for playback on Flutter Web.
BTW, if someone has a good idea for OPUS-WEBM support, I would be interested: @Terranic was a great help for supporting OPUS-WEBM when recording on web.
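For context, this is the kind of playback-from-stream call that already exists on mobile and that this request would bring to web. The `startPlayerFromStream()` / `feedFromStream()` names below are my assumption based on the 9.x mobile API and are not confirmed by this thread:

```dart
import 'dart:typed_data';
import 'package:flutter_sound/flutter_sound.dart';

Future<void> playPcmChunks(Stream<Uint8List> pcmChunks) async {
  final player = FlutterSoundPlayer();
  await player.openPlayer();

  // Raw PCM carries no header, so the format must be declared up front.
  await player.startPlayerFromStream(
    codec: Codec.pcm16,
    numChannels: 1,
    sampleRate: 16000, // must match the incoming PCM data
  );

  // Feed each incoming chunk to the player as it arrives.
  await for (final chunk in pcmChunks) {
    await player.feedFromStream(chunk);
  }

  await player.stopPlayer();
}
```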
For your information: I expect to release Flutter Sound 9.10 at the end of the week. It will support playback from stream on Flutter Web, at least PCM-INT16 and PCM-FLOAT32. I will also try to look at OPUS-WEBM support if it is not too hard.
You could also make it configurable, such as 'audio/mp3'. Example from React code (the audioQueue is filled with audio stream chunks and then played when "playNextInQueue" is triggered):

```js
audioContext.current = new (window.AudioContext || window.webkitAudioContext)();
<....>
// Drain the queued chunks into a single Blob and decode it in one go.
const blob = new Blob(audioQueue.current.splice(0, audioQueue.current.length), { type: 'audio/mp3' });
const arrayBuffer = await blob.arrayBuffer();
const audioBuffer = await audioContext.current.decodeAudioData(arrayBuffer);
const source = audioContext.current.createBufferSource();
source.buffer = audioBuffer;
source.connect(audioContext.current.destination);
source.onended = playNextInQueue; // chain playback of the next queued buffer
source.start();
```
Thank you @Terranic for your precious advice.
The way you suggest could be a very good solution, if it works ...
The Mozilla doc has the following notice about decodeAudioData:

> This method only works on complete file data, not fragments of audio file data.

I was afraid of this notice.
There is another thing which bothers me: between 2 chunks there will be some milliseconds where the browser will have nothing to play. I am afraid there will be a small audible blank between two chunks.
I wonder if it would be useful to set an audio processor node at the head of the audio chain. This processor would have three functions:
What do you think?
Hi - no, in my React implementation, the Python server sends chunks (currently set to 8192 bytes) from a complete mp3 file, which are queued in the audio buffer mentioned above. I do not notice any disruption in playback. But: I start playback only after the full mp3 file has been received. I have not yet tested starting playback on receiving the first chunk (or a bit more, for buffering).
(PS: of course this could be an undocumented feature -- only tested with Chrome 127)
I have tried version 9.10.4. Is this version able to play streamed sound on Web? The code that works on Android does not work on Web. I tried to stream a PCM 16K file but it does not work.
No, sorry. I have not worked on this requirement in 9.x. If you look at the Kanban table you will see that it is planned for 10.0 and not 9.x.
I have too much load on my shoulders and I must manage priorities.
Flutter Sound Version: 9.7.2
Is there no way to stream audio on Flutter Web from the mic with flutter_sound? Streaming requires PCM16, but there is no PCM16 encoder support in any browser.
(And intermediate storing to a file does not work on Flutter Web.)