openvinotoolkit / openvino

OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
https://docs.openvino.ai
Apache License 2.0

[GNA] GNA device must be closed before opening again #399

Closed fujunwei closed 4 years ago

fujunwei commented 4 years ago

How can I create multiple ExecutableNetwork instances, given that each one opens the GNA device again?

jgespino commented 4 years ago

Hi @fujunwei

Could you please provide additional information on what you are trying to accomplish?

Regards, Jesus

fujunwei commented 4 years ago

It can be reproduced by creating a second ExecutableNetwork in speech_sample (L.649), for example:

ExecutableNetwork secondExecutableNet = ie.LoadNetwork(netBuilder.getNetwork(), deviceStr, genericPluginConfig);

Running with the two ExecutableNetwork instances produces the error messages below:

[ INFO ] Loading network files
[ INFO ] Batch size is 1
[ INFO ] Using scale factor of 2175.43 calculated from first utterance.
[ INFO ] Loading model to the device
[ ERROR ] [GNAPlugin] in function void GNADeviceHelper::checkStatus() const: Bad GNA status 14, GNA_INVALIDHANDLE - Error: Device: invalid handle
fujunwei commented 4 years ago

Another question: how do I get the files HCLG.fst, words.txt, and final.mdl needed to produce text in the third step of the speech_sample?

3. Run the Kaldi decoder to produce n-best text hypotheses and select most likely text given the WFST (HCLG.fst), vocabulary (words.txt), and TID/PID mapping (final.mdl):

mdeisher commented 4 years ago

@fujunwei, in the current release only one LoadNetwork at a time is supported by the GNA plugin. This issue is already fixed (in the 2020 branch), but the fix depends on a new version of gna.dll/gna.so that is not yet open source and not yet publicly available.

Sorry about the missing final.mdl. I just attached it to this message and will try to get it uploaded to https://download.01.org/openvinotoolkit/models_contrib/speech/kaldi/ Note that it had to be renamed to final_mdl.txt so that GitHub would allow it to be attached.

final_mdl.txt
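Until that fix ships, the limitation behaves like a device with a single handle: a second load fails unless the first ExecutableNetwork has already been destroyed. A toy C++ model of this (illustrative names only, not the real GNA plugin API) showing why the concurrent load fails and why scoping the first instance works:

```cpp
#include <stdexcept>

// Toy model of the limitation described above: the device exposes a single
// handle, so opening it again while the first handle is still held fails
// with an invalid-handle error. GnaDevice is a hypothetical stand-in, not
// the real plugin class.
class GnaDevice {
    static bool opened;
public:
    GnaDevice() {
        if (opened)
            throw std::runtime_error("GNA_INVALIDHANDLE: device already open");
        opened = true;
    }
    ~GnaDevice() { opened = false; }
};
bool GnaDevice::opened = false;

// Two LoadNetwork-style calls with overlapping lifetimes: the second fails,
// matching the GNA_INVALIDHANDLE error in the log above.
bool concurrent_open_fails() {
    GnaDevice first;
    try {
        GnaDevice second;
    } catch (const std::runtime_error&) {
        return true;
    }
    return false;
}

// Workaround sketch: scope the first instance so its handle is released
// before the second open.
bool sequential_open_succeeds() {
    {
        GnaDevice first;
    }  // destructor releases the device handle here
    GnaDevice second;
    return true;
}
```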

fujunwei commented 4 years ago

Thanks so much.

in the current release only one LoadNetwork at a time is supported by the GNA plugin. This issue is fixed already (in the 2020 branch) but the fix depends on a new version of gna.dll/gna.so that is not yet open source and is not yet publicly available.

We will keep an eye on that version; it would be helpful to let me know when it becomes publicly available.

Note that it had to be renamed to final_mdl.txt so that github would allow it to be attached.

I got HCLG.fst and words.txt from kaldi-gstreamer-server, and running inference on dev93_scores_10.ark with speech_sample produced the results below:

4k0c0301 ON OUR LAW
4k0c0302 HA HA HA HA
4k0c0303 HA HA HA HA HA
4k0c0304 HA HA
4k0c0305 ON
4k0c0306 HA HA HA
4k0c0307 HA HA HA
4k0c0308 HA HA HA
4k0c0309 HA
4k0c030a ON

Are these the expected results? I can't find the test_filt.txt for the fourth step: "4. Run the word error rate tool to check accuracy given the vocabulary (words.txt) and reference transcript (test_filt.txt)".

Do I need to train the network with Kaldi if I want to generate a new ark file for inference in the first step? "1. Prepare a speaker-transformed feature set given the feature transform specified in final.feature_transform and the feature files specified in feats.scp".
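For reference, the accuracy check in step 4 boils down to a word-level edit distance between the reference transcript and the decoded hypothesis. A minimal self-contained sketch of that computation (not Kaldi's actual compute-wer tool):

```cpp
#include <algorithm>
#include <sstream>
#include <string>
#include <vector>

// Split a transcript line into whitespace-separated words.
std::vector<std::string> words(const std::string& line) {
    std::istringstream in(line);
    std::vector<std::string> out;
    std::string w;
    while (in >> w) out.push_back(w);
    return out;
}

// Word-level Levenshtein distance: the number of substitutions, insertions,
// and deletions needed to turn the hypothesis into the reference. WER is
// this count divided by the number of reference words.
int word_errors(const std::string& ref_line, const std::string& hyp_line) {
    auto ref = words(ref_line), hyp = words(hyp_line);
    std::vector<std::vector<int>> d(ref.size() + 1,
                                    std::vector<int>(hyp.size() + 1));
    for (size_t i = 0; i <= ref.size(); ++i) d[i][0] = static_cast<int>(i);
    for (size_t j = 0; j <= hyp.size(); ++j) d[0][j] = static_cast<int>(j);
    for (size_t i = 1; i <= ref.size(); ++i)
        for (size_t j = 1; j <= hyp.size(); ++j)
            d[i][j] = std::min({d[i - 1][j] + 1,      // deletion
                                d[i][j - 1] + 1,      // insertion
                                d[i - 1][j - 1] +     // substitution/match
                                    (ref[i - 1] == hyp[j - 1] ? 0 : 1)});
    return d[ref.size()][hyp.size()];
}
```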

mdeisher commented 4 years ago

@fujunwei, the files from different Kaldi recipes are not compatible with one another. The best way is to train from scratch using your favorite Kaldi recipe. Then you will have all the required files.

I compressed all the missing files using 7-zip and then split the result into a multi-part zip archive. You may need to rename the files (e.g., by changing "_001.zip" to ".zip.001") in order for your un-archiver to reconstruct the original archive.

wsj_dnn5b_smbr_001.zip
wsj_dnn5b_smbr_002.zip
wsj_dnn5b_smbr_003.zip
wsj_dnn5b_smbr_004.zip
wsj_dnn5b_smbr_005.zip
wsj_dnn5b_smbr_006.zip
wsj_dnn5b_smbr_007.zip
wsj_dnn5b_smbr_008.zip
wsj_dnn5b_smbr_009.zip
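The renaming step above can be scripted; a sketch using placeholder part files (replace the `touch` lines with the real downloads):

```shell
# Demonstration with empty placeholder parts, named like the attachments above.
cd "$(mktemp -d)"
touch wsj_dnn5b_smbr_001.zip wsj_dnn5b_smbr_002.zip wsj_dnn5b_smbr_003.zip

# Rename "_NNN.zip" to ".zip.NNN" so a standard un-archiver recognizes
# the parts as one split archive.
for f in wsj_dnn5b_smbr_*.zip; do
  part="${f##*_}"        # e.g. "001.zip"
  part="${part%.zip}"    # e.g. "001"
  mv "$f" "wsj_dnn5b_smbr.zip.$part"
done

# With the real parts, extraction would then start from the first piece:
#   7z x wsj_dnn5b_smbr.zip.001
```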

fujunwei commented 4 years ago

Thanks for your help.