Closed. alidabaghi123 closed this issue 1 year ago.
You should probably specify all software versions, especially torch versions and onnx versions if relevant, and the filenames of any of our uploaded models that you are using, as I suspect this may come down to a version mismatch of some kind. (But Fangjun will know better).
Could you please post your complete command and all the logs?
Also, please re-check that the given tokens.txt matches your model.
Thank you very much.
Have you fixed it?
Excuse me, I have updated the comment.
Please post all of the logs. That is, they should contain everything since you started streaming_server.py. The logs should include the output of
https://github.com/k2-fsa/sherpa/blob/da5a7153fa972952d2d2fd01215053d23a6ecd7b/sherpa/bin/streaming_server.py#L780
Please give us as much information as you can.
So the log shows it indeed returns decoded results to the client but throws an error at the end.
Could you change https://github.com/k2-fsa/sherpa/blob/da5a7153fa972952d2d2fd01215053d23a6ecd7b/sherpa/cpp_api/online-recognizer.cc#L189 to
std::cerr << i << ", ";
auto sym = sym_table[i];
std::cerr << sym << "\n";
and re-compile sherpa and try again?
Thank you. My problem was with tokens.txt, and I have fixed it.
Hello. I am working with the sherpa framework for streaming zipformer. Sometimes the client disconnects from the server and this error is shown in the terminal. Why?