yajiemiao / pdnn

PDNN: A Python Toolkit for Deep Learning. http://www.cs.cmu.edu/~ymiao/pdnntk.html
Apache License 2.0
224 stars 105 forks source link

Error to read PDNN nnet with kaldi #53

Open freyes85 opened 6 years ago

freyes85 commented 6 years ago

Hi all! I'm trying to extract BNFs (steps_pdnn/make_bnf_feat.sh) using a network with the "rectifier" activation function trained with PDNN, and Kaldi's nnet-forward tells me that it does not recognize that component marker.

It seems that a PDNN network with ReLU can't be read by Kaldi. How can I convert the PDNN model to a Kaldi nnet model so that the component type is recognized?

Thanks in advance, Flavio

MaigoAkisame commented 6 years ago

Do you have any error messages?

I'm not familiar with the Kaldi nnet model, but I may be able to make something out of the error message...

freyes85 commented 6 years ago

First of all, thanks for your fast answer! I ran one of PDNN's recipes, run_DNN.py, using the rectifier activation function and the Kaldi output file format, and everything went OK. Then, when I try to dump BNFs with pdnn/kaldipdnn/steps_pdnn/make_bnf_feat.sh, I get this error message:

nnet-forward --apply-log=true /Datos/freyes/Flavio/ProyectoTeorico/NuevaLinea/Experimentacion/Codigos_Python/Prueba_Inicios/pdnn/working_dir/dnn/bnf.nnet scp:/Datos/freyes/Flavio/ProyectoTeorico/NuevaLinea/Experimentacion/Codigos_Python/Prueba_Inicios/pdnn/working_dir/test/split1/1/feats.scp ark,scp:/Datos/freyes/Flavio/ProyectoTeorico/NuevaLinea/Experimentacion/Codigos_Python/Prueba_Inicios/pdnn/working_dir/bnf_feat/feats_bnf_test_bnf.1.ark,/Datos/freyes/Flavio/ProyectoTeorico/NuevaLinea/Experimentacion/Codigos_Python/Prueba_Inicios/pdnn/working_dir/bnf_feat/feats_bnf_test_bnf.1.scp

LOG (nnet-forward[5.2]:SelectGpuId():cu-device.cc:110) Manually selected to compute on CPU.
ERROR (nnet-forward[5.2]:MarkerToType():nnet-component.cc:111) Unknown 'Component' marker : '

Thanks again, and I hope you can help once more. Flavio

MaigoAkisame commented 6 years ago

I see that the ReLU activation function can be called either "rectifier" or "relu" in PDNN. But in Kaldi, neither is a valid name for a nnet component.

I'm surprised that Kaldi nnet doesn't support the ReLU activation function. If you need it, you may have to modify the files nnet-component.cc and nnet-component.h to add support for ReLU. Kaldi already has implementations of other activation functions like sigmoid and tanh. I hope it's not hard to implement ReLU as well.
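For what it's worth, the ReLU forward pass is just an elementwise max with zero, which is why it should be roughly as easy to implement as the sigmoid and tanh components Kaldi already has. A minimal NumPy sketch for illustration only (this is not Kaldi code):

```python
import numpy as np

# Elementwise activations: each maps a vector/matrix of layer
# pre-activations to outputs of the same shape.
def relu(x):
    return np.maximum(0.0, x)   # negatives clipped to zero

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))
```

For example, `relu(np.array([-2.0, 1.5]))` gives `[0.0, 1.5]`; the backward pass is equally simple (the gradient is 1 where the input was positive, 0 elsewhere).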

freyes85 commented 6 years ago

I agree that the component markers relu and rectifier are not valid for Kaldi. But Kaldi's nnet does have an implementation of the ReLU activation function, identified as ParametricRelu (nnet/nnet-parametric-relu.h), which is a generalization of the ReLU function. On the other hand, nnet2 has an implementation of ReLU known as RectifiedLinearComponent, but PDNN works with nnet. Is there some way or tool in PDNN to modify the component names in the Kaldi-format output? I also wonder if you have any implementation to dump BNFs without using Kaldi?
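Since the Kaldi nnet model PDNN writes out is plain text, one stopgap could be to rewrite the offending component marker with a small script. This is only a sketch of the idea: the marker names `<rectifier>` and `<ParametricRelu>` below are assumptions (check the exact spelling against your actual bnf.nnet and your Kaldi version), and simply renaming the marker is probably not enough by itself, since ParametricRelu also expects its own parameters (the alpha/beta vectors; alpha=1, beta=0 would reproduce plain ReLU).

```python
# Hypothetical helper: rename a component marker in a text-format
# Kaldi nnet file. The old/new marker names are assumptions, not
# verified against any Kaldi version.
def rename_marker(nnet_text, old="<rectifier>", new="<ParametricRelu>"):
    # A plain textual substitution; the rest of the file is untouched.
    return nnet_text.replace(old, new)
```

Usage would be reading bnf.nnet, passing the text through `rename_marker`, and writing the result to a new file; again, the result is unlikely to load until the extra parameters the new component expects are also inserted.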

Thanks again, Flavio

MaigoAkisame commented 6 years ago

Sorry, I don't have a ready implementation...