averak opened this issue 3 years ago (status: Open)
Hi!
I'm a beginner with Kaldi and the GStreamer server. I have a Kaldi model trained with nnet3 on CSJ, so I tried to run it with docker-kaldi-gstreamer-server.
```
$ python kaldigstserver/master_server.py --port=80
$ python kaldigstserver/worker.py -u ws://localhost:80/worker/ws/speech -c csj_nnet3.yaml
```
(The details of the yaml file are written at the end of this issue)
Then I ran client.py:

```
$ python kaldigstserver/client.py -u ws://localhost:80/client/ws/speech -r 32000 test/data/english_test.raw
```
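For reference, the client's `-r` flag is the byte rate at which raw audio is streamed to the server, i.e. sample rate × sample width × channels for headerless PCM. A small sketch of that arithmetic (the 16 kHz / 16-bit / mono figures are the usual online-decoding setup, an assumption here; check the `--sample-frequency` in your model's mfcc.conf):

```python
def byte_rate(sample_rate_hz: int, sample_width_bytes: int = 2, channels: int = 1) -> int:
    """Bytes per second for raw (headerless) PCM audio.

    For 16 kHz, 16-bit, mono audio this gives 32000, matching the
    `-r 32000` used in the client command above.
    """
    return sample_rate_hz * sample_width_bytes * channels

print(byte_rate(16000))        # 32000 -> matches -r 32000
print(byte_rate(44100, 2, 2))  # 176400 -> 44.1 kHz 16-bit stereo
```

If the model was trained at a different sample rate than the audio being sent, recognition quality collapses; sending English test audio to a CSJ (Japanese) model will also produce garbage output, though neither mismatch should by itself segfault the worker.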
Then the worker went down, leaving the following log:
```
INFO 2020-12-30 13:59:01,256 101 GET /client/ws/speech?content-type= (127.0.0.1) 0.41ms
INFO 2020-12-30 13:59:01,257 dd1a08a3-930b-489d-a176-10a21f2325fb: OPEN
INFO 2020-12-30 13:59:01,257 dd1a08a3-930b-489d-a176-10a21f2325fb: Request arguments: content-type=""
INFO 2020-12-30 13:59:01,258 dd1a08a3-930b-489d-a176-10a21f2325fb: Using worker <__main__.DecoderSocketHandler object at 0x7f681fb2e150>
2020-12-30 13:59:01 - DEBUG: __main__: <undefined>: Got message from server of type <class 'ws4py.messaging.TextMessage'>
2020-12-30 13:59:01 - INFO: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Initializing request
2020-12-30 13:59:01 - INFO: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Started timeout guard
2020-12-30 13:59:01 - INFO: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Initialized request
2020-12-30 13:59:01 - DEBUG: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Checking that decoder hasn't been silent for more than 10 seconds
INFO 2020-12-30 13:59:01,483 dd1a08a3-930b-489d-a176-10a21f2325fb: Forwarding client message (<type 'str'>) of length 2048 to worker
2020-12-30 13:59:01 - DEBUG: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Got message from server of type <class 'ws4py.messaging.BinaryMessage'>
2020-12-30 13:59:01 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer of size 2048 to pipeline
2020-12-30 13:59:01 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer done
INFO 2020-12-30 13:59:01,741 dd1a08a3-930b-489d-a176-10a21f2325fb: Forwarding client message (<type 'str'>) of length 2048 to worker
2020-12-30 13:59:01 - DEBUG: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Got message from server of type <class 'ws4py.messaging.BinaryMessage'>
2020-12-30 13:59:01 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer of size 2048 to pipeline
2020-12-30 13:59:01 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer done
INFO 2020-12-30 13:59:01,995 dd1a08a3-930b-489d-a176-10a21f2325fb: Forwarding client message (<type 'str'>) of length 2048 to worker
2020-12-30 13:59:01 - DEBUG: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Got message from server of type <class 'ws4py.messaging.BinaryMessage'>
2020-12-30 13:59:01 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer of size 2048 to pipeline
2020-12-30 13:59:01 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer done
INFO 2020-12-30 13:59:02,250 dd1a08a3-930b-489d-a176-10a21f2325fb: Forwarding client message (<type 'str'>) of length 2048 to worker
2020-12-30 13:59:02 - DEBUG: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Got message from server of type <class 'ws4py.messaging.BinaryMessage'>
2020-12-30 13:59:02 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer of size 2048 to pipeline
2020-12-30 13:59:02 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer done
2020-12-30 13:59:02 - DEBUG: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Checking that decoder hasn't been silent for more than 10 seconds
INFO 2020-12-30 13:59:02,504 dd1a08a3-930b-489d-a176-10a21f2325fb: Forwarding client message (<type 'str'>) of length 2048 to worker
2020-12-30 13:59:02 - DEBUG: __main__: dd1a08a3-930b-489d-a176-10a21f2325fb: Got message from server of type <class 'ws4py.messaging.BinaryMessage'>
2020-12-30 13:59:02 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer of size 2048 to pipeline
2020-12-30 13:59:02 - DEBUG: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Pushing buffer done
2020-12-30 13:59:02 - INFO: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Connecting audio decoder
2020-12-30 13:59:02 - INFO: decoder2: dd1a08a3-930b-489d-a176-10a21f2325fb: Connected audio decoder
INFO 2020-12-30 13:59:02,645 Worker <__main__.WorkerSocketHandler object at 0x7f681fb86290> leaving
INFO 2020-12-30 13:59:02,649 dd1a08a3-930b-489d-a176-10a21f2325fb: Handling on_connection_close()
INFO 2020-12-30 13:59:02,649 dd1a08a3-930b-489d-a176-10a21f2325fb: Closing worker connection
[2] 8887 segmentation fault (core dumped) python kaldigstserver/worker.py -u ws://localhost:80/worker/ws/speech -c
```
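The segfault happens in native code (the GStreamer/Kaldi plugin), so the Python log stops right at "Connected audio decoder" without a traceback. One cheap way to see which Python call triggered the native crash is the standard library's `faulthandler`, which dumps the Python-level stack when the process receives SIGSEGV. A minimal sketch, assuming you add the two `faulthandler` lines near the top of worker.py (the exact insertion point is up to you; any early import site works):

```python
# faulthandler is in the Python standard library (2.7 needs the
# backported "faulthandler" package from PyPI; 3.3+ has it built in).
# Once enabled, a SIGSEGV in a C extension prints the Python stack
# of every thread to stderr before the process dies.
import faulthandler

faulthandler.enable()

# Sanity check that the handler is installed:
print(faulthandler.is_enabled())  # True
```

Alternatively, running the worker under `gdb` and collecting a native backtrace would show which shared library (e.g. the gst-kaldi-nnet2-online plugin) actually crashed; a plugin/Kaldi version mismatch or a bad model file are common culprits for crashes at the "Connected audio decoder" step, though that is an inference, not something the log proves.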
My yaml file is below.
```yaml
use-nnet2: True
decoder:
    nnet-mode: 3
    use-threaded-decoder: true
    model : /opt/models/tdnn1a_online/final.mdl
    word-syms : /opt/models/tdnn1a_online/phones.txt
    mfcc-config : /opt/models/tdnn1a_online/conf/mfcc.conf
    ivector-extraction-config : /opt/models/tdnn1a_online/conf/ivector_extractor.conf
    max-active: 10000
    beam: 10.0
    lattice-beam: 6.0
    acoustic-scale: 0.083
    do-endpointing : true
    endpoint-silence-phones : "1:2:3:4:5:6:7:8:9:10"
    traceback-period-in-secs: 0.25
    chunk-length-in-secs: 0.25
    num-nbest: 1
out-dir: tmp
use-vad: False
silence-timeout: 10
# Just a sample post-processor that appends "." to the hypothesis
post-processor: perl -npe 'BEGIN {use IO::Handle; STDOUT->autoflush(1);} sleep(1); s/(.*)/\1./;'
#post-processor: (while read LINE; do echo $LINE; done)
# A sample full post processor that adds a confidence score to the 1-best hyp and deletes other n-best hyps
#full-post-processor: ./sample_full_post_processor.py
logging:
    version : 1
    disable_existing_loggers: False
    formatters:
        simpleFormater:
            format: '%(asctime)s - %(levelname)7s: %(name)10s: %(message)s'
            datefmt: '%Y-%m-%d %H:%M:%S'
    handlers:
        console:
            class: logging.StreamHandler
            formatter: simpleFormater
            level: DEBUG
    root:
        level: DEBUG
        handlers: [console]
```
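One thing worth checking before launching the worker: every absolute path in the `decoder` section (model, word-syms, mfcc-config, ivector-extraction-config) must exist inside the container, and the nnet3 decoder also expects an HCLG.fst and related files referenced from those configs; a missing or wrong file can take down the native decoder instead of failing with a clean error. A small hypothetical pre-flight check (the `check_decoder_paths` helper is not part of the project, just an illustration; load the parsed YAML into a dict with PyYAML and pass it in):

```python
import os

def check_decoder_paths(conf):
    """Return [(key, path)] for every absolute path in conf['decoder']
    that does not exist on disk. `conf` is the parsed YAML config
    as a plain dict."""
    missing = []
    for key, value in conf.get("decoder", {}).items():
        if isinstance(value, str) and value.startswith("/") and not os.path.exists(value):
            missing.append((key, value))
    return missing

# Example with a deliberately bogus path:
conf = {"decoder": {"model": "/no/such/dir/final.mdl", "beam": 10.0}}
print(check_decoder_paths(conf))  # [('model', '/no/such/dir/final.mdl')]
```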
Thanks
Did you ever use it successfully for recognition, or is this just the beginning of your setup?
I met the same problem.