oseiasmribeiro opened 4 years ago

Provide a simple audio visualizer (showing low, medium, and high frequencies) with the option to increase or decrease the number of bars. This is useful for letting the user confirm that audio is actually being played or recorded in the application, since the volume or mic may be at a minimum.

Example: https://dev.to/armen101/audiorecordview-3jn5
This visualizer should allow you to do that if you listen to either the waveform or the fft data and look at the amplitude or magnitude.
Hey @ryanheise, can you give some more details on how to look at the amplitude or magnitude of the data? I have looked at the VisualizerWaveformCapture and VisualizerFftCapture models but can't see these fields. How can I access these values?
@onatcipli take your time with the example, since it shows how to access this data (and also displays it).
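For illustration, here is a rough sketch of estimating an overall amplitude level from a waveform capture. The stream and field names (visualizerWaveformStream, event.data) and the assumption that samples are unsigned 8-bit PCM are taken from this thread and may not match the branch exactly:

import 'dart:math';

player.visualizerWaveformStream.listen((event) {
  final data = event.data; // assumed: unsigned 8-bit PCM samples
  var sumSquares = 0.0;
  for (final sample in data) {
    final centered = sample - 128; // center the unsigned bytes at zero
    sumSquares += centered * centered;
  }
  // Root mean square of the capture, normalized to roughly 0..1.
  final amplitude = sqrt(sumSquares / data.length) / 128;
  print('amplitude: $amplitude');
});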
When I try to play a full mp3 file from the network, the visualizer works well, but when I try to get fft or waveform data from an audio stream, the event is always null. Do you know how to fix it?

_justPlayer.visualizerWaveformStream.listen((event) {
  print("VISUALISER $event");
});
_justPlayer.visualizerFftStream.listen((event) {
  print('FFTVISUALISER $event');
});
Would this issue be reproducible if I modified the official example with your URL? If so, what URL can I plug in?
Thanks for the reply. http://radiogi.sabr.com.tr:8001/voice_stream_128
Sorry, I forgot to say that the problem occurs only on iOS. On Android, audio-stream visualization works.
So just to clarify, that's a yes to my first question? Regarding iOS, are you trying @Eittipat's pull request, mentioned 12 comments up?
To the first question: I think it should reproduce if you use this URL.

Yes, I am using @Eittipat's package:

just_audio:
  git:
    url: https://github.com/Eittipat/just_audio.git
    ref: visualizer
    path: just_audio
@ryanheise @Eittipat Do you know how to fix the problem with the visualizer on an iOS device? Audio stream visualization is not working with this link: http://radiogi.sabr.com.tr:8001/voice_stream_128

I also added NSMicrophoneUsageDescription to Info.plist and request microphone permission when the app opens, but it didn't help; the Fft and WaveForm streams don't send any data.
Hello @zatovagul,
I will look into this issue this weekend.
@ryanheise It seems "processTap" does not execute when using http://radiogi.sabr.com.tr:8001/voice_stream_128. I don't know much about MTAudioProcessingTapCallbacks, so I leave it to you.
@ryanheise Sorry, how can I use this package with both the visualizer and the equalizer? When I use the latest version, there is no visualizer functionality, but when I try to use @Eittipat's visualizer branch, there is no AndroidEqualizer functionality.
I'm sorry, the visualizer branch is a bit behind master. I had intended to merge @Eittipat's FFT implementation first and then bring it in line with master; however, there are still some copyright issues to sort out and that needs to be resolved first.

@Eittipat, would you be willing to merge any conflicts if I brought the visualizer branch up to date?
@ryanheise Yes I would
I've just merged the latest code into the visualizer branch, including the equalizer.
@ryanheise I've updated my pull request (#546). I also solved the copyright issue. ^ ^
But the visualizer is still not working with this link: http://radiogi.sabr.com.tr:8001/voice_stream_128

And also with this one: https://broadcast.golos-istini.ru/voice_live_64
Hopefully this weekend I can do some testing.
Sorry, can you help me? I got this issue:

Invalid description in the "just_audio" pubspec on the "just_audio_platform_interface" dependency: "../just_audio_platform_interface" is a relative path, but this isn't a local pubspec.
╷
13 │   path: ../just_audio_platform_interface
   │         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
╵

I solved it locally, but now I am trying to use CI/CD and I get this error repeatedly.
My pubspec.yaml:

dependencies:
  just_audio:
    git:
      # url: https://github.com/Eittipat/just_audio.git
      url: https://github.com/ryanheise/just_audio.git
      ref: visualizer
      path: just_audio

dependency_overrides:
  just_audio_platform_interface:
    git:
      # url: https://github.com/Eittipat/just_audio.git
      url: https://github.com/ryanheise/just_audio.git
      ref: visualizer
      path: just_audio_platform_interface
Unfortunately for CI at the moment I think you'll have to fork the repo, make the necessary changes so that dependency overrides aren't necessary, and then use your fork as a dependency. Of course that's not ideal. This branch will still exist as its own branch for quite a while before being considered stable enough to merge into the master branch and publish on pub.dev, but perhaps I should do something similar to what I did with audio_service when working on the year-long development of an experimental branch: basically, I'd make it so the pubspec.yaml files in git refer to just_audio_platform_interface via git references rather than relative paths within the repository, and have the alternative path-based dependencies still there and commented out (because the plugin developers still need to work based on those).

Anyway, for now, I suggest the fork approach.
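For example, a fork could change just_audio's pubspec to reference the platform interface by git rather than by relative path, so that no dependency_overrides are needed (YOUR_FORK below is a placeholder):

# In the fork's just_audio/pubspec.yaml:
dependencies:
  just_audio_platform_interface:
    git:
      url: https://github.com/YOUR_FORK/just_audio.git
      ref: visualizer
      path: just_audio_platform_interface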
@ryanheise I found this: https://developer.apple.com/forums/thread/45966. It says "The MTAudioProcessingTap is not available with HTTP live streaming". That's why @zatovagul got nothing when playing from http://radiogi.sabr.com.tr:8001/voice_stream_128

However, I found some good news, though I have not looked into it yet: https://stackoverflow.com/questions/16833796/avfoundation-audio-processing-using-avplayers-mtaudioprocessingtap-with-remote
The trick is to KVObserve the status of the AVPlayerItem; when it's ready to play
That sounds familiar... I thought I was already doing something like that, where ensureTap is called within observeValueForKeyPath.
Hi all, @Eittipat's PR is now merged into the visualizer branch, which adds an iOS implementation of FFT. Thanks to @Eittipat's work, this now reaches feature parity between Android and iOS. It has also been symlinked to macOS, which also appears to work correctly.
For anyone who was already using the FFT visualizer on Android, note that I also just changed the plugin to convert the platform data from Uint8List to Int8List, which is more appropriate for FFT, and added some convenience methods to extract the magnitude and phase out of the raw data. The example has been updated to do this with a new FFT widget demo. If anyone can write a better FFT visualizer, please feel welcome to (e.g. I haven't done any smoothing of the data).
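For anyone attempting that, here is a rough, untested sketch of one way to smooth the magnitudes with an exponential moving average. It assumes captures arrive as an Int8List in the Android Visualizer layout (data[0] is the DC component, data[1] the Nyquist component, then real/imaginary pairs); the actual capture type and helpers on this branch may differ:

import 'dart:math';
import 'dart:typed_data';

class SmoothedFft {
  final double alpha; // 0..1; higher tracks changes faster, lower smooths more
  List<double>? _smoothed;

  SmoothedFft({this.alpha = 0.3});

  // Converts one raw FFT capture into exponentially smoothed magnitudes.
  List<double> process(Int8List data) {
    final magnitudes = <double>[
      for (var i = 2; i + 1 < data.length; i += 2)
        sqrt(data[i] * data[i] + data[i + 1] * data[i + 1])
    ];
    final prev = _smoothed;
    if (prev == null || prev.length != magnitudes.length) {
      _smoothed = magnitudes; // first capture (or size change): no history yet
    } else {
      for (var i = 0; i < magnitudes.length; i++) {
        prev[i] = alpha * magnitudes[i] + (1 - alpha) * prev[i];
      }
    }
    return _smoothed!;
  }
}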
This is still not ready to be merged into the public release. I think some improvements should be made on when the Android implementation prompts the user for permissions, and on the iOS side the TAP code should be reviewed and possibly refactored to allow for future uses of the TAP.
Any idea on when the visualizer will be done?
Did you try to use it? It works in a lot of situations.
@ryanheise I updated your library and changed my code to use Int8List, but it's still not working, for example with this link: https://broadcast.golos-istini.ru/voice_64

I will try to fix it in native code.
https://github.com/ryanheise/just_audio/issues/97#issuecomment-1015559833
I just don't understand how to use it in iOS native code.
I've already looked at it. It doesn't work. I think you have to wait for the AVAudioEngine version (which is still in the early stage - #334)
Curious what the current plans are for the visualizer branch. Is it still planned to be merged in or is it now waiting for https://github.com/ryanheise/just_audio/pull/784 before further updates?
Hi @spakanati
No, it is not waiting for #784. Going forward there will probably be both the current AVQueuePlayer-based and the AVAudioEngine-based implementations available, since they may end up supporting different feature sets.
What this branch is waiting on is a finalisation of the API (particularly for requesting permissions and also for starting/stopping the visualizer), and also a code review and perhaps code refactoring on the iOS side to handle the TAP code more cleanly.
I would be a bit nervous about just merging this TAP code until it has been well tested, so I think this branch would remain here as the way for people to experiment with the visualizer until the final code has been tested and I am confident that it will not break anything.
Of course to help speed this up, people are welcome to help on any of the above points, either through code or by contributing thoughts/ideas through discussion.
Thanks for the clarification! I've been able to use the visualizer branch successfully on both iOS and Android with mp3 network files, but I just ran into the HLS issue mentioned above, so that's why I was wondering about the AVAudioEngine-based implementation. It sounds like only the AVAudioEngine implementation will be able to support the visualizer when using HLSAudioSource?
As far as the permissions, I agree that it might be common to want more control over the timing of the request, especially because a microphone record request is a little confusing/jarring for users. This was pretty easy for me to get around, though -- I just did my own permission request before ever calling player.startVisualizer, so I was able to show my own messaging. I'd guess a lot of people are already handling permission requests for other parts of their app, so one option could be to remove the permission handling entirely from the visualizer and just list the necessary permissions in the docs.
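For reference, a minimal sketch of what I mean, assuming the permission_handler package (startVisualizer is the API from this branch; whether it returns a Future is not something I've verified, so it isn't awaited here):

import 'package:just_audio/just_audio.dart';
import 'package:permission_handler/permission_handler.dart';

Future<void> startVisualizerWithPermission(AudioPlayer player) async {
  // Show your own explanatory messaging before this system dialog appears.
  final status = await Permission.microphone.request();
  if (status.isGranted) {
    player.startVisualizer(); // branch API; exact signature may differ
  }
}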
First and foremost, I would like to express my sincere gratitude for the considerable effort and dedication you and the rest of the contributors have devoted to this project. Since this issue is still open, I made a small test using the /example associated with the branch to see how it behaves, and I have identified several issues that I would like to bring to your attention.
1) Bug: audio sound disappears

Add the following code at line 288 in example_visualizer.dart to create a stop button for testing:

IconButton(
  icon: const Icon(Icons.stop),
  onPressed: () {
    player.stopVisualizer();
  },
),

Click stop while audio is playing, then click pause, then click play: it will play without sound. If that doesn't happen, try a different approach, since sometimes it won't occur; for example, while paused, click stop (not pause) a few times, then try to play again. It will play without sound. Also, strangely, if it plays without sound and you call player.stopVisualizer() by clicking stop, it will then play the sound.
2) Crash

Changing to the following code:

IconButton(
  icon: const Icon(Icons.stop),
  onPressed: () {
    player.stop();
  },
),

It will crash the app when the song is playing, the visualizer is running, and stop is called.

Changing to the following code:

IconButton(
  icon: const Icon(Icons.stop),
  onPressed: () {
    player.stopVisualizer();
    player.stop();
  },
),

It will crash the app, as stopVisualizer has not finished before stop is called.

Changing to the following code:

IconButton(
  icon: const Icon(Icons.stop),
  onPressed: () async {
    player.stopVisualizer();
    await Future.delayed(const Duration(seconds: 2));
    player.stop();
  },
),

It will work, as stopVisualizer had time to finish before stop was called.
Crash log (terminal):

Restarted application in 490ms.
3 get visualizerCaptureSize -> 1024
Lost connection to device.
Exited
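A possible mitigation, assuming stopVisualizer() returns a Future that completes once the visualizer has actually stopped (which I have not verified), would be to await it rather than using a fixed delay:

onPressed: () async {
  await player.stopVisualizer();
  player.stop();
},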
Thanks @karrarkazuya , this is exactly the sort of feedback I was hoping for, since this branch is quite experimental and can't be merged until it is sufficiently tested and becomes stable.
Since you didn't mention which platform you were testing on, could you confirm which one that is? I would guess iOS or macOS since the Tap is an Apple concept.
The test was actually made on the iOS simulator, as shown in the terminal log. However, since you mentioned it, I have now also tested on the iOS simulator, an iOS device (iPhone XR), and a real Android device (SD 8 Gen 1).
The results were as follows.

On the Android device:
- Audio bug does not exist
- Crash does not exist

On the iOS device:
- Audio bug does not exist
- Crash exists with the same behavior described above

Log:

3 get visualizerCaptureSize -> 1024
objc_retain + 16 libobjc.A.dylib
objc_retain:
-> 0x19c1b6e5c <+16>: ldr x17, [x17, #0x20]
0x19c1b6e60 <+20>: tbz w17, #0x2, 0x19c1b6e18 ; _lldb_unnamed_symbol1362
0x19c1b6e64 <+24>: tbz w16, #0x0, 0x19c1b6e40 ; _lldb_unnamed_symbol1362 + 40
0x19c1b6e68 <+28>: lsr x17, x16, #55
Target 0: (Runner) stopped.
Lost connection to device.
Exited

On the iOS simulator:
- Audio bug exists with the same behavior (even with a different build)
- Crash exists with the same behavior

Log:
Thread 47 Crashed:: AUDeferredRenderer-0x15c4671d0
0 libobjc.A.dylib 0x105ca5454 objc_retain + 16
1 just_audio 0x1059ced04 processTap + 528 (AudioPlayer.m:476)
2 MediaToolbox 0x113cdacd4 aptap_AudioQueueProcessingTapCallback + 216
3 AudioToolbox 0x115c19c54 AQProcessingTap::DoCallout(unsigned int&, AudioTimeStamp&, unsigned int&, AudioBufferList&, std::__1::optional
This comment doesn't seem to work for me. I'm still getting an error on pub get:
Resolving dependencies...
Error on line 19, column 11: Invalid description in the "just_audio" pubspec on the "just_audio_platform_interface" dependency: "../just_audio_platform_interface" is a relative path, but this isn't a local pubspec.
╷
19 │ path: ../just_audio_platform_interface
│ ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
╵
pub get failed
This is in my pubspec.yaml:
name: cgr_player
description: A new Flutter project.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev
version: 1.0.1+1

environment:
  sdk: '>=2.12.0 <3.0.0'

dependencies:
  flutter:
    sdk: flutter
  audio_session: ^0.1.14
  # just_audio: ^0.9.36
  just_audio:
    git:
      url: https://github.com/ryanheise/just_audio.git
      ref: visualizer
      path: just_audio
  # just_audio_background: ^0.0.1-beta.11
  just_audio_background:
    git:
      url: https://github.com/ryanheise/just_audio.git
      ref: visualizer
      path: just_audio_background
  cupertino_icons: ^1.0.2

dependency_overrides:
  just_audio_platform_interface:
    git:
      url: https://github.com/ryanheise/just_audio.git
      ref: visualizer
      path: just_audio_platform_interface

dev_dependencies:
  flutter_test:
    sdk: flutter
  flutter_lints: ^2.0.0

flutter:
  uses-material-design: true
Can somebody help me set up this branch? Thanks!
I think this is because on the visualizer branch, the pubspec of just_audio is using a local reference to the aforementioned package: https://github.com/ryanheise/just_audio/blob/visualizer/just_audio/pubspec.yaml#L18

It should probably be using a dependency_override instead.

I am not sure what the best way to proceed is until this gets addressed in some way: either clone this repo locally instead of using a git url, or fork this repo and fix the pubspec files, I think.
That's correct, there is a chicken and egg problem with developing plugins within the federated plugin architecture that is quite inconvenient to deal with. As long as this branch is in development and hasn't been published, it will continue to be inconvenient. Running a local dependency definitely works, that's obviously what I do, as a plugin developer.
I should probably bump up the priority of this branch so that it gets released. In order to do that, I need to look at two things:
I am just getting started in Dart, but couldn't you specify the local path inside a dependency_override instead, so that it doesn't affect apps using this package as a dependency?
You could try it and if you find something that would be lower maintenance, you would be welcome to make a pull request.
Makes sense!
Btw, one thing I came across; I'm not sure how relevant it is or whether it should be mentioned in the docs somewhere: you need the RECORD_AUDIO permission on Android even if you're analyzing audio files (i.e. not using the microphone at all). Otherwise, you will not get any analysis data.
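That is, the app's AndroidManifest.xml needs the standard declaration:

<uses-permission android:name="android.permission.RECORD_AUDIO" />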
That is true, and the example shows this, but I haven't written the documentation yet, pending a decision on how the permission API will actually work. I think rather than it being initiated by startVisualizer, there should be a separate API, and perhaps even a separate plugin would be more appropriate. I welcome feedback on which of these options is preferred.
In the latest commit, permission handling is separated from the plugin, so your app can start the permission request flow at a suitable time before starting the visualizer. I've updated the example and the docs.
The remaining issue before merging is to review the TAP code mentioned earlier. Extra eyes on it are welcome. E.g.
Unfortunately I have no idea about the TAP processor, but we just ran into an issue when trying to use this branch with background audio on Android. We're getting an error:
E/flutter (21878): [ERROR:flutter/runtime/dart_vm_initializer.cc(41)] Unhandled Exception: UnimplementedError: visualizerWaveformStream has not been implemented.
E/flutter (21878): #0 AudioPlayerPlatform.visualizerWaveformStream (package:just_audio_platform_interface/just_audio_platform_interface.dart:82:5)
E/flutter (21878): #1 AudioPlayer._setPlatformActive.subscribeToEvents (package:just_audio/just_audio.dart:1406:20)
E/flutter (21878): #2 AudioPlayer._setPlatformActive.setPlatform (package:just_audio/just_audio.dart:1526:7)
I don't quite understand where those are even supposed to be implemented, so any tips on how to approach this would be welcome. I will also try simply not subscribing to those events in subscribeToEvents.
I tried to implement the missing methods. This is as far as I got:
https://github.com/useronym/just_audio/commit/e2d2fe71c1a6c6bae5631a6c993f08fc8b085050
I'm not sure if this is correct or if there's anything missing. I haven't actually tested this properly, as we are actually moving away from using the visualizer and doing offline pre-processing to generate spectral analysis of our audio files instead.
> I don't quite understand where those are even supposed to be implemented, so any tips on how to approach this would be welcome. I will also try simply not subscribing to those events in subscribeToEvents.
Are you using just_audio_background? If so, you're getting the error because just_audio_background hasn't implemented that part of the platform interface. If you look inside that plugin's code, you'll see it already implements two of the other event streams, so the implementation of this new event stream would be like that:
class _JustAudioPlayer extends AudioPlayerPlatform {
  final eventController = StreamController<PlaybackEventMessage>.broadcast();
  final playerDataController = StreamController<PlayerDataMessage>.broadcast();

  ...

  @override
  Stream<PlaybackEventMessage> get playbackEventMessageStream =>
      eventController.stream;

  @override
  Stream<PlayerDataMessage> get playerDataMessageStream =>
      playerDataController.stream;

  ...
}
The implementation should provide all the visualizer events to the main plugin via a third stream, overridden in the same way.
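A sketch of what that third stream could look like, assuming the platform interface names the message class VisualizerWaveformCaptureMessage (the actual name may differ):

final visualizerWaveformController =
    StreamController<VisualizerWaveformCaptureMessage>.broadcast();

@override
Stream<VisualizerWaveformCaptureMessage> get visualizerWaveformStream =>
    visualizerWaveformController.stream;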
The permission handling change has been working well for me. Is there a recommended way to use this branch in a project that also targets web (or a path for web in general if the branch is hopefully close to merging)? I understand the visualizer isn't implemented yet for web, but all playback breaks on web because of calls to unimplemented visualizerWaveformStream even if startVisualizer is never used.