jiangds518 opened this issue 4 years ago
Hello, I am training RethinkNet on an image dataset, and I want to know how to visualize the memory transformation matrix W.
Find the RNN unit in the Keras model (https://github.com/yangarbiter/multilabel-learn/blob/master/mlearn/models/rethinknet/rethinkNet.py#L126).
When the RNN unit is a SimpleRNN, you can simply plot its recurrent kernel; that is the memory transformation matrix W. (https://www.tensorflow.org/api_docs/python/tf/keras/layers/SimpleRNN)
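Something like the following should work; this is a minimal sketch, not code from the repo, where model stands for the trained Keras model and the layer lookup and plotting choices are assumptions:

import matplotlib.pyplot as plt
import tensorflow as tf

def plot_recurrent_matrix(model):
    # Locate the first SimpleRNN layer in the trained model.
    rnn = next(layer for layer in model.layers
               if isinstance(layer, tf.keras.layers.SimpleRNN))
    # get_weights() returns [kernel, recurrent_kernel, bias];
    # the (units x units) recurrent kernel is the memory transformation W.
    W = rnn.get_weights()[1]
    plt.imshow(W, cmap='viridis')
    plt.colorbar()
    plt.title('SimpleRNN recurrent matrix W')
    plt.show()

With the default architecture this will show a 128 x 128 matrix, since the RNN has 128 units.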
Thanks for your reply :). I am writing a paper and want to cite yours. I have two other questions: 1) why is the number of units in the RNN 128? 2) If my dataset has 6 labels and I want to visualize the memory transformation matrix W, should I set the number of units to 6? Thanks again!!
Hi,
Previously we found that adding another layer between the RNN and the labels improves performance, which is why the number of units in the RNN is set to 128. If you want to plot a memory transformation matrix whose size equals the number of labels, you will need to tweak the architecture a bit: changing the number of units in the RNN from 128 to 6 and removing the dense layer right after it should work in your case.
x = get_rnn_unit(rnn_unit, n_labels, x, activation='sigmoid', l2w=regularizer,
                 recurrent_dropout=0.25)
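For illustration, here is a hedged sketch of what the tweaked architecture could look like, written with plain Keras layers rather than the repo's get_rnn_unit helper; the input shape, timestep count, and n_labels = 6 are assumed for the example:

import tensorflow as tf

n_labels = 6
timesteps, n_features = 3, 100  # hypothetical values for illustration

inputs = tf.keras.Input(shape=(timesteps, n_features))
# units == n_labels, so the recurrent kernel W is 6 x 6 and can be read
# as a label-to-label memory transformation.
x = tf.keras.layers.SimpleRNN(n_labels, activation='sigmoid',
                              recurrent_dropout=0.25,
                              return_sequences=True)(inputs)
# No dense layer after the RNN: its sigmoid output is taken directly
# as the 6 label predictions at each rethinking step.
model = tf.keras.Model(inputs, x)
model.summary()

With units equal to the number of labels, the recurrent kernel extracted as shown above is directly a 6 x 6 matrix.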
Let me know if you have any questions. Thanks.