mpapadomanolaki / UNetLSTM

GNU General Public License v2.0
23 stars 10 forks

About the Parameters #4

Open Tiacy opened 4 years ago

Tiacy commented 4 years ago

Hello, thanks for your work! I have a question: does setting the 'dropout' parameter in the LSTM to 0.7 give the optimal effect? If not, what dropout value gives the best results? Thanks very much!

mpapadomanolaki commented 4 years ago

Hello,

The paper results were produced when the dropout value was set to 0.7, so this was the optimal value for my experiments.
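For readers unfamiliar with the parameter being discussed: dropout randomly zeroes activations during training and rescales the survivors, which regularises the recurrent features. Below is a minimal NumPy sketch of inverted dropout (not the repo's actual implementation, which relies on its deep-learning framework); `p=0.7` is the drop probability the author reports as optimal here.

```python
import numpy as np

def dropout(x, p=0.7, training=True, rng=None):
    """Inverted dropout sketch: zero each activation with probability p,
    then rescale the kept ones by 1/(1-p) so the expected value is unchanged.
    p=0.7 is the value reported as optimal in this thread."""
    if not training or p == 0.0:
        return x                          # identity at evaluation time
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(x.shape) >= p       # keep with probability 1 - p
    return x * mask / (1.0 - p)
```

At evaluation time the function is the identity, so no rescaling is needed at inference.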

Best, Maria

Tiacy commented 4 years ago

OK. I see. Thanks for your help!

Tiacy commented 4 years ago

Hello, I have another question: why combine the hidden features and the input features as the input to the gates in the LSTM module (lines 46-49 in 'Networkl.py')? What happens if the hidden features are removed from the input gate? Did you try that? Thanks for your help!

mpapadomanolaki commented 4 years ago

Hello,

This is how LSTMs work: they combine current information with previous information, which is why I do that. Otherwise, the temporal relationship between the different time steps would not be modeled properly.
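To make the point concrete, here is a minimal fully-connected LSTM step in NumPy (a sketch only; the repo's `Networkl.py` uses convolutional gates). The key line is the concatenation of `x` and `h_prev` before the gate computation: every gate, including the input gate, sees both the current input and the previous hidden state. The weight layout (`W` of shape `(4*hidden, input+hidden)`) is an assumption for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step. W maps the concatenated [x, h_prev]
    to all four gates stacked along the first axis."""
    hid = h_prev.shape[0]
    combined = np.concatenate([x, h_prev])   # current + previous information
    z = W @ combined + b
    i = sigmoid(z[0*hid:1*hid])              # input gate
    f = sigmoid(z[1*hid:2*hid])              # forget gate
    o = sigmoid(z[2*hid:3*hid])              # output gate
    g = np.tanh(z[3*hid:4*hid])              # candidate cell state
    c = f * c_prev + i * g                   # mix past memory with new input
    h = o * np.tanh(c)
    return h, c
```

Dropping `h_prev` from the concatenation would make each gate depend only on the current frame, so the cell could no longer weigh new evidence against what it has already seen.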

Tiacy commented 4 years ago

Hello, thanks for your work! I found only one pooling operation in the UNet-LSTM model. Why isn't there a pooling operation immediately after each convolution operation during the downsampling? Thanks for your reply.

mpapadomanolaki commented 4 years ago

Hello,

There are 4 max pooling operations during the downsampling. Look at the encoder function.
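A quick NumPy sketch of what four successive 2x2 max poolings do to the spatial resolution in a U-Net-style encoder (the convolution blocks between poolings are omitted, and the function names are hypothetical, not taken from the repo):

```python
import numpy as np

def max_pool2x2(x):
    """2x2 max pooling on an array of shape (C, H, W) with even H, W."""
    c, h, w = x.shape
    return x.reshape(c, h // 2, 2, w // 2, 2).max(axis=(2, 4))

def encoder_spatial_sizes(x, levels=4):
    """Track the feature-map size through a 4-level encoder: each level
    applies a conv block (omitted here) followed by one 2x2 max pooling,
    so four poolings halve the resolution four times."""
    sizes = [x.shape[1:]]
    for _ in range(levels):           # 4 max pooling operations in total
        x = max_pool2x2(x)
        sizes.append(x.shape[1:])
    return sizes
```

For a 256x256 input this yields 128, 64, 32, and finally 16 pixels per side, which is why the encoder contains four poolings rather than one.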