alumae / kaldi-gstreamer-server

Real-time full-duplex speech recognition server, based on the Kaldi toolkit and the GStreamer framework.
BSD 2-Clause "Simplified" License
1.07k stars 342 forks

kaldi-gstreamer-server supports models up to which Kaldi version? #197

Open Umar17 opened 5 years ago

Umar17 commented 5 years ago

Dear all,

Can someone tell me up to which Kaldi version models are supported by the GStreamer server? Kaldi is now compiled with MKL libraries. Models built on this latest version give the following error when loaded in worker.py (the error occurs on the worker side as soon as a request is received):

Intel MKL FATAL ERROR: Cannot load libmkl_avx.so or libmkl_def.so

After setting LD_PRELOAD=/path/to/libmkl_avx.so:/path/to/libmkl_def.so, the error changes to:

python: symbol lookup error: /opt/intel/compilers_and_libraries_2018.5.274/linux/mkl/lib/intel64_lin/libmkl_def.so: undefined symbol: mkl_serv_malloc

Looking for guidance.
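One way to narrow this down (a sketch, not from this thread; it assumes a Linux worker and reads `/proc/self/maps`) is to print which MKL shared objects the worker process has actually mapped. Running it with and without `LD_PRELOAD` shows which MKL runtime Kaldi's bindings picked up:

```python
def loaded_libraries(substring="mkl"):
    """List shared objects mapped into the current process whose path
    contains `substring` (Linux only: parses /proc/self/maps)."""
    found = set()
    with open("/proc/self/maps") as maps:
        for line in maps:
            parts = line.split()
            # the last column is the mapped file's path, when there is one
            if parts and "/" in parts[-1] and substring in parts[-1]:
                found.add(parts[-1])
    return sorted(found)

if __name__ == "__main__":
    # e.g. call this right after the Kaldi/GStreamer imports in worker.py
    for path in loaded_libraries("mkl"):
        print(path)
```

If `libmkl_avx.so` or `libmkl_def.so` never appears in the list, the loader could not find them, which matches the fatal error above.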

leakyH commented 5 years ago

I met this problem as well:

Intel MKL FATAL ERROR: Cannot load libmkl_avx.so or libmkl_def.so

Short solution: add these two lines in worker.py:

```python
import mkl
mkl.get_max_threads()
```
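For context, here is a sketch of how that workaround might sit at the top of worker.py (the try/except fallback is my addition, not from this thread). The idea is to import `mkl` before anything that dlopens Kaldi's shared objects, so the full MKL runtime is already resolved:

```python
# Hypothetical top of worker.py (assumption: the `mkl` Python package,
# e.g. from conda, is installed in the same environment).
try:
    import mkl
    mkl.get_max_threads()  # touch the API so the MKL runtime fully initializes
except ImportError:
    mkl = None  # bindings absent; fall back to setting LD_PRELOAD instead

# ... the usual worker.py imports (tornado, gi/GStreamer, etc.) follow here
```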


Details: I believe the problem in this repo is NOT the same as the similar-looking sklearn/MKL issue. As a quick test, run `python -c 'import sklearn.linear_model.tests.test_randomized_l1'` in a shell; if that prints no error message, you are not hitting that problem.

Inspired by issue #192 from @txz233 and a similar report, I believe it is a matter of threading. This strange workaround seems to work, even though it is not entirely clear what those two lines really do. Maybe it is something related to the conda environment resolving the wrong library.

UPD: I updated my MKL and now execute the following command before running the program; this method works for both Python and bash programs:

```
export LD_PRELOAD=/opt/intel/mkl/lib/intel64/libmkl_def.so:/opt/intel/mkl/lib/intel64/libmkl_avx2.so:/opt/intel/mkl/lib/intel64/libmkl_core.so:/opt/intel/mkl/lib/intel64/libmkl_intel_lp64.so:/opt/intel/mkl/lib/intel64/libmkl_intel_thread.so:/opt/intel/lib/intel64_lin/libiomp5.so
```
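To avoid retyping that long line, the same preload list can be assembled in a small wrapper script (a sketch; the install paths and the worker launch command are assumptions, so adjust `MKLROOT` to your installation):

```shell
#!/bin/sh
# Build LD_PRELOAD from the MKL libraries listed above (paths are assumptions).
MKLROOT="${MKLROOT:-/opt/intel/mkl}"
PRELOAD=""
for lib in libmkl_def.so libmkl_avx2.so libmkl_core.so \
           libmkl_intel_lp64.so libmkl_intel_thread.so; do
    PRELOAD="${PRELOAD:+$PRELOAD:}$MKLROOT/lib/intel64/$lib"
done
# libiomp5 lives outside MKLROOT in this layout
LD_PRELOAD="$PRELOAD:/opt/intel/lib/intel64_lin/libiomp5.so"
export LD_PRELOAD
# then start the worker, e.g.:
#   python kaldigstserver/worker.py -u ws://localhost:8888/worker/ws/speech -c sample_english_nnet2.yaml
```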