prechtelm opened this issue 3 years ago
std::bad_alloc means it ran out of memory. Does something leak memory in your case? Do you have extra things in your app that leak memory?
I'm afraid not. Could restarting vosk every X minutes resolve the issue?
You'd better figure out where you leak the memory. Maybe you are creating many models instead of a single one.
Can you reproduce this issue with our demo?
After intensive testing, the app itself is not leaking. Memory looks rock solid without vosk. With vosk active, native memory allocation increases steadily, by 20–50 MB per 10 minutes. Only one model is loaded. That looks like a leak in vosk?
I can reproduce the same issue using your demo application.
The first time I noticed that memory allocation increases steadily was in my own application, where I added VOSK (with the English model) as the STT engine for constant listening. If I use it in a noisy environment, it crashes (out of memory) in around two hours. I profiled the application and saw that the native memory allocation increased steadily. When I use the same application with the Google Cloud STT engine, I don't have the memory issue.

After figuring this out, I decided to check whether I was using the VOSK lib correctly, so I cloned and ran the demo application on my device, profiled it, and noticed that even the original demo app has the same memory allocation issue. In addition, I noticed that if I start the microphone -> stop the microphone -> start the microphone, the allocation increases dramatically. It seems that the VOSK library doesn't clear some allocations when we stop microphone listening, and creates new allocations when we start listening again.
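For reference, here is a minimal sketch of the start -> stop -> start sequence described above, written against the current org.vosk Android API (the older demo used the org.kaldi package with a similar shape). The class structure, 16 kHz sample rate, and method usage are assumptions for illustration, not the demo's exact code.

```java
// Minimal sketch of the start -> stop -> start cycle that shows the growing
// native allocation. Assumes the org.vosk Android API (Model, Recognizer,
// SpeechService, RecognitionListener); the model is loaded only once.
import java.io.IOException;

import org.vosk.Model;
import org.vosk.Recognizer;
import org.vosk.android.RecognitionListener;
import org.vosk.android.SpeechService;

public class LeakRepro {
    private final Model model;          // loaded once and reused
    private SpeechService speechService;

    public LeakRepro(Model model) {
        this.model = model;
    }

    public void startListening(RecognitionListener listener) throws IOException {
        Recognizer recognizer = new Recognizer(model, 16000.0f);
        speechService = new SpeechService(recognizer, 16000.0f);
        speechService.startListening(listener);   // native allocation starts growing here
    }

    public void stopListening() {
        if (speechService != null) {
            speechService.stop();   // stops the recognizer thread and AudioRecord,
            speechService = null;   // but the native memory is reportedly not released
        }
    }
    // Repeating startListening()/stopListening() makes the native heap jump each cycle.
}
```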
The steps to reproduce the first issue:
The steps to reproduce the second issue:
Could you please clarify: is there something we can do to solve the issue ourselves (I mean using Java code, not C++), or is it entirely on the native side and needs a fix from you?
Any updates?
Any updates @nshmyrev ?
I haven't had time to look into this yet.
I am wondering if we are the only ones with this issue?
Depends on who "we" are.
sokolyaka and I also found the issue on our side, same as prechtelm.
Faced the same issue. It seems to be a problem with how AudioRecord works, or perhaps with the RecognizerThread. Shutting down the speech service (speechService.shutdown()) to release the AudioRecord object doesn't work: it only prevents memory allocation from increasing; it does not free up memory. Restarting vosk makes things worse.

One workaround is to pause/resume the service (speechService.setPause(true)) intermittently, but it is not a good solution if your app needs to listen continuously without missing any input.
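For what it's worth, here is a minimal sketch of that pause/resume workaround, assuming the org.vosk.android.SpeechService API; the 500 ms delay is an arbitrary illustration value. Note that setPause(true) only stops feeding audio to the recognizer, so at best this slows the growth rather than freeing memory.

```java
// Sketch of the intermittent pause/resume workaround. setPause(true) stops
// pushing audio into the recognizer; setPause(false) resumes listening.
import android.os.Handler;
import android.os.Looper;

import org.vosk.android.SpeechService;

public class PauseWorkaround {
    private final Handler handler = new Handler(Looper.getMainLooper());

    public void pauseBriefly(SpeechService speechService) {
        speechService.setPause(true);                                    // stop feeding audio
        handler.postDelayed(() -> speechService.setPause(false), 500L);  // resume after ~500 ms (arbitrary)
    }
}
```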
@nshmyrev any progress on the native lib leak? Or did anybody find a workaround? The memory leak is definitely traceable to vosk: starting from Recognizer.acceptWaveForm(), the RecognizerThread fills memory with native allocations that never get cleared. This makes the library basically unusable on Android. Is there anything we can do?
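To make the call path concrete, below is a minimal direct-use sketch of the AudioRecord -> Recognizer.acceptWaveForm() loop where the growth is observed. This is a simplified illustration, not the demo's actual RecognizerThread; the buffer size, sample rate, and audio source are assumptions, and the app needs the RECORD_AUDIO permission.

```java
// Simplified audio pump: read PCM from AudioRecord and feed it to the
// recognizer. Native memory growth is reported to start from acceptWaveForm().
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;

import org.vosk.Recognizer;

public class AudioPump implements Runnable {
    private final Recognizer recognizer;
    private volatile boolean running = true;

    public AudioPump(Recognizer recognizer) {
        this.recognizer = recognizer;
    }

    public void stop() {
        running = false;
    }

    @Override
    public void run() {
        int sampleRate = 16000;
        int bufferSize = AudioRecord.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.VOICE_RECOGNITION,
                sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        byte[] buffer = new byte[bufferSize];
        recorder.startRecording();
        while (running) {
            int nread = recorder.read(buffer, 0, buffer.length);
            if (nread > 0) {
                if (recognizer.acceptWaveForm(buffer, nread)) {
                    String result = recognizer.getResult();          // final utterance result (JSON)
                } else {
                    String partial = recognizer.getPartialResult();  // partial hypothesis (JSON)
                }
            }
        }
        recorder.stop();
        recorder.release();
    }
}
```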
Hello, any update on this issue, or a proper workaround? We will check if pausing the service can help here.
> Faced the same issue. It seems to be a problem with how AudioRecord works, or perhaps with the RecognizerThread. Shutting down the speech service (speechService.shutdown()) to release the AudioRecord object doesn't work: it only prevents memory allocation from increasing; it does not free up memory. Restarting vosk makes things worse. One workaround is to pause/resume the service (speechService.setPause(true)) intermittently, but it is not a good solution if your app needs to listen continuously without missing any input.
Hey nanaghartey, could you please clarify your workaround? Just pausing/resuming the service doesn't work for me, and memory doesn't free up.
@nshmyrev hello. Do you have any update about this problem maybe? Thanks
@DjToMeK27 still looking into it. A few more days.
@nshmyrev hello 😅 Any news about this problem? Or is it a tricky one?
😢
Busy days...
Any news? We would love to build upon Vosk, but currently it's not really possible...
Hi @nshmyrev . Need an update on this issue please :)
+1
I don't think this is isolated to the Java or Android SDK. I experience the same issue using the (open source) Rust FFI bindings: here
@nshmyrev Hello. I am having the same problem. Do you have any updates on this issue?
Has anyone tried out @fanmingyi's PR solution to this memory leak issue?
@timmolter can you elaborate on the exact solution?
It looks like he's closing the recognizer object when setting the state to DONE: https://github.com/alphacep/vosk-android-demo/pull/212/files
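I haven't verified the PR line by line, but the idea appears to be explicit, deterministic cleanup: free the native recognizer as soon as listening is done instead of waiting for the finalizer. A hedged sketch of that idea, assuming Recognizer exposes close() (releasing the underlying native handle) and SpeechService exposes stop()/shutdown() as used earlier in this thread:

```java
// Sketch of the cleanup order: stop the recognizer thread, release AudioRecord,
// then free the native recognizer explicitly. Not the PR's exact code.
import org.vosk.Recognizer;
import org.vosk.android.SpeechService;

public class Cleanup {
    public static void finishListening(SpeechService speechService, Recognizer recognizer) {
        speechService.stop();      // stop the recognizer thread (reaches the DONE state)
        speechService.shutdown();  // release the AudioRecord object
        recognizer.close();        // free the native recognizer memory
    }
}
```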
> Has anyone tried out @fanmingyi's PR solution to this memory leak issue?
I tried, and the results were still the same. No improvement!
Does the app eventually crash? With an OOM or something else? Or does it stabilize at, say, 1 GB or some other level?
We are having the same issue. Any updates on this?
I would say the best thing until fix comes is
Boss, please fix this, I'm begging you.
Any updates? @nshmyrev
This is the solution: it looks like he's closing the recognizer object: https://github.com/alphacep/vosk-android-demo/pull/212/files
What timmolter wrote. Thanks a lot.
I also encountered memory problems with docker + Spring Boot 2 + JDK 1.8. I do not create the Model and Recognizer objects during program initialization because they occupy a lot of memory, and this feature is not commonly used in my system, so I create the objects each time someone needs them. However, memory grows by about 600 MB after each call and is not actively released for a long time afterwards. The program eventually crashed with an OOM. I am using the latest Java SDK.
<dependency>
    <groupId>org</groupId>
    <artifactId>jaudiotagger</artifactId>
    <version>2.0.3</version>
</dependency>
<!-- Speech recognition -->
<dependency>
    <groupId>net.java.dev.jna</groupId>
    <artifactId>jna</artifactId>
    <version>5.14.0</version>
</dependency>
<dependency>
    <groupId>com.alphacephei</groupId>
    <artifactId>vosk</artifactId>
    <version>0.3.45</version>
</dependency>
This is a great SDK and I hope it can be made perfect. Thank you very much.
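In case it helps on the server side, here is a hedged sketch of how to avoid per-call growth with the com.alphacephei:vosk Java binding: load the Model once and close each Recognizer deterministically with try-with-resources instead of relying on GC/finalizers. The model path, 16 kHz sample rate, and single-shot acceptWaveForm call are assumptions for illustration.

```java
// Reuse one Model for the whole application; create a Recognizer per request
// and close it immediately so the native memory is freed deterministically.
import java.io.IOException;

import org.vosk.Model;
import org.vosk.Recognizer;

public class SttService {
    private final Model model;   // large allocation, load once at startup

    public SttService(String modelPath) throws IOException {
        this.model = new Model(modelPath);
    }

    public String transcribe(byte[] pcm16leMonoAudio) throws IOException {
        // try-with-resources calls recognizer.close(), releasing the native handle
        try (Recognizer recognizer = new Recognizer(model, 16000.0f)) {
            recognizer.acceptWaveForm(pcm16leMonoAudio, pcm16leMonoAudio.length);
            return recognizer.getFinalResult();   // JSON result string
        }
    }
}
```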
Hey boss, could you take a minute to fix this? Looking forward to your patch.
Hi, after some runtime (2–8 h) the app crashes with the following error. It looks like a native crash in libvosk_jni.so. How can we fix this issue?
2021-01-27 10:34:06.666 ? A/DEBUG:
2021-01-27 10:34:06.666 ? A/DEBUG: Build fingerprint: 'Android/rk3288/rk3288:7.1.2/TEST/ROM-20190802:userdebug/test-keys'
2021-01-27 10:34:06.666 ? A/DEBUG: Revision: '0'
2021-01-27 10:34:06.666 ? A/DEBUG: ABI: 'arm'
2021-01-27 10:34:06.666 ? A/DEBUG: pid: 24511, tid: 29658, name: Thread-280 >>> de.mytest.app <<<
2021-01-27 10:34:06.666 ? A/DEBUG: signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr --------
2021-01-27 10:34:06.675 ? A/DEBUG: Abort message: '/buildbot/src/android/ndk-release-r21/external/libcxx/../../external/libcxxabi/src/abort_message.cpp:72: abort_message: assertion "terminating with uncaught exception of type std::bad_alloc: std::bad_alloc" failed'
2021-01-27 10:34:06.675 ? A/DEBUG: r0 00000000 r1 000073da r2 00000006 r3 00000008
2021-01-27 10:34:06.675 ? A/DEBUG: r4 5d97f978 r5 00000006 r6 5d97f920 r7 0000010c
2021-01-27 10:34:06.675 ? A/DEBUG: r8 6f3b8bad r9 00a00000 sl a9ca1008 fp 00100000
2021-01-27 10:34:06.675 ? A/DEBUG: ip 0000000c sp 5d97e858 lr a9c5b857 pc a9c5e0c0 cpsr 600f0010
2021-01-27 10:34:06.710 ? A/DEBUG: backtrace:
2021-01-27 10:34:06.710 ? A/DEBUG: #00 pc 0004a0c0 /system/lib/libc.so (tgkill+12)
2021-01-27 10:34:06.710 ? A/DEBUG: #01 pc 00047853 /system/lib/libc.so (pthread_kill+34)
2021-01-27 10:34:06.710 ? A/DEBUG: #02 pc 0001d8b5 /system/lib/libc.so (raise+10)
2021-01-27 10:34:06.710 ? A/DEBUG: #03 pc 00019401 /system/lib/libc.so (libc_android_abort+34)
2021-01-27 10:34:06.710 ? A/DEBUG: #04 pc 00017048 /system/lib/libc.so (abort+4)
2021-01-27 10:34:06.710 ? A/DEBUG: #05 pc 0001b8b3 /system/lib/libc.so (libc_fatal+22)
2021-01-27 10:34:06.710 ? A/DEBUG: #06 pc 000195fb /system/lib/libc.so (__assert2+18)
2021-01-27 10:34:06.710 ? A/DEBUG: #07 pc 006f7ddf /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so
2021-01-27 10:34:06.710 ? A/DEBUG: #08 pc 006f7ee1 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so
2021-01-27 10:34:06.710 ? A/DEBUG: #09 pc 006f64a1 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so
2021-01-27 10:34:06.710 ? A/DEBUG: #10 pc 006f5d4f /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so
2021-01-27 10:34:06.710 ? A/DEBUG: #11 pc 006f5d17 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (__cxa_throw+74)
2021-01-27 10:34:06.710 ? A/DEBUG: #12 pc 006f09d3 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_Znwj+54)
2021-01-27 10:34:06.711 ? A/DEBUG: #13 pc 00260b13 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN3fst21CompactHashStateTableINS_24DefaultComposeStateTupleIiNS_15PairFilterStateINS2_INS_18IntegerFilterStateIaEENS_17WeightFilterStateINS_17TropicalWeightTplIfEEEEEENS3_IiEEEEEENS_11ComposeHashISCEEE9FindStateERKSC+166)
2021-01-27 10:34:06.711 ? A/DEBUG: #14 pc 002602f5 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN3fst8internal14ComposeFstImplINS_17DefaultCacheStoreINS_6ArcTplINS_17TropicalWeightTplIfEEEEEENS_23PushLabelsComposeFilterINS_24PushWeightsComposeFilterINS_22LookAheadComposeFilterINS_24AltSequenceComposeFilterINS_16LookAheadMatcherINS_3FstIS6_EEEESF_EESF_SF_LNS_9MatchTypeE3EEESF_SF_LSH_3EEESF_SF_LSH_3EEENS_24GenericComposeStateTableIS6_NS_15PairFilterStateINSM_INS_18IntegerFilterStateIaEENS_17WeightFilterStateIS5_EEE
2021-01-27 10:34:06.711 ? A/DEBUG: #15 pc 0025ff21 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN3fst8internal14ComposeFstImplINS_17DefaultCacheStoreINS_6ArcTplINS_17TropicalWeightTplIfEEEEEENS_23PushLabelsComposeFilterINS_24PushWeightsComposeFilterINS_22LookAheadComposeFilterINS_24AltSequenceComposeFilterINS_16LookAheadMatcherINS_3FstIS6_EEEESF_EESF_SF_LNS_9MatchTypeE3EEESF_SF_LSH_3EEESF_SF_LSH_3EEENS_24GenericComposeStateTableIS6_NS_15PairFilterStateINSM_INS_18IntegerFilterStateIaEENS_17WeightFilterStateIS5_EEE
2021-01-27 10:34:06.711 ? A/DEBUG: #16 pc 0025e4c9 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN3fst8internal14ComposeFstImplINS_17DefaultCacheStoreINS_6ArcTplINS_17TropicalWeightTplIfEEEEEENS_23PushLabelsComposeFilterINS_24PushWeightsComposeFilterINS_22LookAheadComposeFilterINS_24AltSequenceComposeFilterINS_16LookAheadMatcherINS_3FstIS6_EEEESF_EESF_SF_LNS_9MatchTypeE3EEESF_SF_LSH_3EEESF_SF_LSH_3EEENS_24GenericComposeStateTableIS6_NS_15PairFilterStateINSM_INS_18IntegerFilterStateIaEENS_17WeightFilterStateIS5_EEE
2021-01-27 10:34:06.711 ? A/DEBUG: #17 pc 0025417f /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZNK3fst10ComposeFstINS_6ArcTplINS_17TropicalWeightTplIfEEEENS_17DefaultCacheStoreIS4_EEE15InitArcIteratorEiPNS_15ArcIteratorDataIS4_EE+106)
2021-01-27 10:34:06.711 ? A/DEBUG: #18 pc 00264197 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN3fst8internal13ArcMapFstImplINS_6ArcTplINS_17TropicalWeightTplIfEEEES5_NS_28RemoveSomeInputSymbolsMapperIS5_iEEE6ExpandEi+122)
2021-01-27 10:34:06.711 ? A/DEBUG: #19 pc 002635dd /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZNK3fst9ImplToFstINS_8internal13ArcMapFstImplINS_6ArcTplINS_17TropicalWeightTplIfEEEES6_NS_28RemoveSomeInputSymbolsMapperIS6_iEEEENS_3FstIS6_EEE16NumInputEpsilonsEi+90)
2021-01-27 10:34:06.711 ? A/DEBUG: #20 pc 002d5ecc /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN5kaldi23LatticeFasterDecoderTplIN3fst3FstINS1_6ArcTplINS1_17TropicalWeightTplIfEEEEEENS_7decoder16BackpointerTokenEE18ProcessNonemittingEf+1176)
2021-01-27 10:34:06.711 ? A/DEBUG: #21 pc 002d37f4 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN5kaldi23LatticeFasterDecoderTplIN3fst3FstINS1_6ArcTplINS1_17TropicalWeightTplIfEEEEEENS_7decoder16BackpointerTokenEE15AdvanceDecodingEPNS_18DecodableInterfaceEi+312)
2021-01-27 10:34:06.711 ? A/DEBUG: #22 pc 0023ce9f /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN15KaldiRecognizer14AcceptWaveformERN5kaldi6VectorIfEE+126)
2021-01-27 10:34:06.711 ? A/DEBUG: #23 pc 0023cddf /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (_ZN15KaldiRecognizer14AcceptWaveformEPKci+130)
2021-01-27 10:34:06.711 ? A/DEBUG: #24 pc 002a5549 /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (vosk_recognizer_accept_waveform+4)
2021-01-27 10:34:06.711 ? A/DEBUG: #25 pc 002395ed /data/app/de.mytest.app-2/lib/arm/libvosk_jni.so (Java_org_kaldi_VoskJNI_KaldiRecognizer_1AcceptWaveform+38)
2021-01-27 10:34:06.712 ? A/DEBUG: #26 pc 004700db /data/app/de.mytest.app-2/oat/arm/base.odex (deleted) (offset 0x42de000)