FrancoisLasson / Temporal_DBN

A Temporal Deep Belief Network implementation using Theano

Finetuning : RBMs ? #10

Open FrancoisLasson opened 8 years ago

FrancoisLasson commented 8 years ago

Should we finetune the whole TDBN, or just the CRBM+LogReg?

At the moment (13 April 2016), I am trying to put all my models together. In other words, until now I train the RBMs and use them to generate a new dataset (train, validation, test). Then I use this dataset to pretrain the CRBM, and to finetune the CRBM and the logistic regression (LogReg) with respect to a supervised criterion.

My idea is: could we improve the recognition rate by introducing the RBMs into the finetuning phase? To do that, I have to connect the RBMs to the CRBM+LogReg stack; in other words, I have to do away with the intermediate dataset. That way, the CRBM would be influenced by the RBMs, and the RBMs could in turn be finetuned.
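A minimal numpy sketch of the joint forward pass I have in mind (all sizes, the three-RBM split, and the missing CRBM history term are hypothetical stand-ins; in the real model this chain would be built symbolically in Theano so the gradient can reach the RBM weights):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.RandomState(0)

# Hypothetical sizes: 3 RBMs of 10 visible / 8 hidden units each,
# a CRBM-like hidden layer of 20 units, and 5 output classes.
n_rbms, n_vis, n_hid, n_crbm, n_classes = 3, 10, 8, 20, 5
batch = 4

# Pretrained RBM weights (random stand-ins here).
rbm_W = [rng.randn(n_vis, n_hid) * 0.1 for _ in range(n_rbms)]
rbm_b = [np.zeros(n_hid) for _ in range(n_rbms)]

# One input chunk per RBM, taken straight from the raw dataset.
inputs = [rng.randn(batch, n_vis) for _ in range(n_rbms)]

# 1) RBM hidden activations, derived on the fly
#    (no intermediate dataset written to disk).
hiddens = [sigmoid(x.dot(W) + b) for x, W, b in zip(inputs, rbm_W, rbm_b)]

# 2) Concatenate them to form the CRBM input.
crbm_x = np.concatenate(hiddens, axis=1)   # (batch, n_rbms * n_hid)

# 3) CRBM hidden layer (history term omitted for brevity).
crbm_W = rng.randn(n_rbms * n_hid, n_crbm) * 0.1
crbm_h = sigmoid(crbm_x.dot(crbm_W))

# 4) Logistic regression on top: the class probabilities now depend
#    on every rbm_W, which is what makes joint finetuning possible.
logreg_W = rng.randn(n_crbm, n_classes) * 0.1
scores = crbm_h.dot(logreg_W)
probs = np.exp(scores - scores.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)

print(probs.shape)
```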

FrancoisLasson commented 8 years ago

Problem n°1: how do I concatenate the RBMs' hidden layers in the Theano model description? Numpy concatenation raises an error because, in a Theano graph, variables are symbolic and not yet associated with a value. Do I have to write my own concatenation process?

Solution : https://groups.google.com/forum/#!topic/theano-users/MsoO76BE-DI
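For the record, the fix is to stay inside the symbolic graph: Theano provides `theano.tensor.concatenate(tensor_list, axis=...)`, which builds a Join node instead of asking numpy for concrete values. A quick numpy illustration of the intended shape semantics (the three 8-unit hidden layers are hypothetical; the symbolic call mirrors `np.concatenate`):

```python
import numpy as np

# Hidden-layer activations of three hypothetical RBMs, 8 units each,
# for a batch of 4 frames.
h1, h2, h3 = (np.random.rand(4, 8) for _ in range(3))

# Joining them along the feature axis gives the CRBM input. In the
# Theano graph the same line is T.concatenate([h1, h2, h3], axis=1),
# which accepts symbolic variables that carry no value yet.
crbm_x = np.concatenate([h1, h2, h3], axis=1)
print(crbm_x.shape)  # (4, 24)
```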

FrancoisLasson commented 8 years ago

The idea is to derive the values of crbm_x and crbm_hist, the inputs of the CRBM, from the RBM parameters and the dataset. That is, to reuse the principle of the LogReg, whose input is derived from the output of the CRBM. This way, the output of the LogReg will depend on the RBM parameters and the dataset, and it will therefore be possible to finetune the whole stack.
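To check that this chained construction really makes the supervised cost sensitive to the RBM parameters (and hence finetunable end to end), here is a hedged numpy sketch using a finite-difference probe; the layer sizes, dummy labels, and cross-entropy cost are illustrative stand-ins for the real model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.RandomState(1)

# Tiny stand-in stack: one RBM layer -> CRBM-like layer -> softmax.
W_rbm  = rng.randn(6, 4) * 0.1
W_crbm = rng.randn(4, 3) * 0.1
W_out  = rng.randn(3, 2) * 0.1
x = rng.randn(5, 6)
y = np.array([0, 1, 0, 1, 1])  # dummy labels

def cost(W_rbm):
    h_rbm  = sigmoid(x.dot(W_rbm))     # crbm_x derived from RBM params
    h_crbm = sigmoid(h_rbm.dot(W_crbm))
    s = h_crbm.dot(W_out)
    p = np.exp(s - s.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(len(y)), y]).mean()

# Perturb a single RBM weight: the supervised cost moves, so the
# gradient w.r.t. W_rbm is non-zero and backprop can finetune it.
eps = 1e-4
W_pert = W_rbm.copy()
W_pert[0, 0] += eps
grad_00 = (cost(W_pert) - cost(W_rbm)) / eps
print(abs(grad_00) > 0.0)
```

In the Theano version this is exactly what `T.grad(cost, rbm.W)` gives for free once crbm_x is expressed symbolically from the RBM parameters.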