Chenpuh opened this issue 3 years ago
Hi there, sorry for the late reply; GitHub didn't notify me about this issue. I don't remember exactly, but it was perhaps about 10 runs or more. Aside from the GPU, I think running in a Jupyter notebook also contributed to the result I obtained.
I now stay away from Jupyter notebooks to get reproducible results. And yes, when I run it again, the accuracy is now a little lower than what I reported in the paper (the notebook still shows the higher results).
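For anyone hitting the same issue, a minimal sketch of the usual seed-fixing setup (the function name and the drawn values are illustrative, not from this repo's code). Note that seeding only pins the CPU-side randomness; with TensorFlow/Keras you would also call `tf.random.set_seed(seed)`, and GPU kernels such as cuDNN LSTMs can remain nondeterministic regardless:

```python
import random
import numpy as np

def seeded_run(seed=42):
    # Fix the CPU-side seeds before any model code runs.
    random.seed(seed)
    np.random.seed(seed)
    # For TensorFlow/Keras one would also call tf.random.set_seed(seed);
    # even then, GPU ops (e.g. cuDNN LSTM) may stay nondeterministic.
    return np.random.rand(3)  # stand-in for model weight initialization

# Two runs with the same seed give identical CPU-side numbers.
a = seeded_run(42)
b = seeded_run(42)
print(np.allclose(a, b))  # True
```

This is why results can match exactly on CPU but still drift between GPU runs.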
Excuse me. I trained the LSTM+Dense model 20 times, but the highest accuracy I obtained is only 68.42%. Your paper says that, due to the random initialization on the GPU and CPU used for computation, training needs to be repeated several times to obtain a similar result, even though a fixed random seed is set at the top of the program.
How many runs are needed to achieve similar results?
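The repeat-and-take-the-best procedure described above can be sketched as below. `train_once` is a hypothetical placeholder for one full build/train/evaluate cycle of the LSTM+Dense model; here it only simulates the run-to-run variance attributed to random initialization, and the accuracy range is made up for illustration:

```python
import random

def train_once(seed):
    # Hypothetical stand-in for one full training run; real code would
    # build, train, and evaluate the network here. We simulate the
    # run-to-run spread caused by random initialization.
    rng = random.Random(seed)
    return 0.60 + 0.10 * rng.random()  # accuracy in [0.60, 0.70)

n_runs = 20
accuracies = [train_once(seed) for seed in range(n_runs)]
print(f"best accuracy over {n_runs} runs: {max(accuracies):.4f}")
```

In practice one would also report the mean and standard deviation across runs, not just the maximum, so the variance itself is visible.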