Closed xisnu closed 7 years ago
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs, but feel free to re-open it if needed.
Is it possible that you have to set the initial epoch (`initial_epoch` in `fit()`) to prevent it from starting from scratch?
@Cerno-b what do you mean? I do not understand.
I was thinking it could be related to this question: https://stackoverflow.com/questions/52476191/what-does-initial-epoch-in-keras-mean/52478034
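For reference, a minimal sketch of what `initial_epoch` does, using a hypothetical toy model: it only resumes the epoch counter, it does not restore any weights by itself.

```python
import numpy as np
from tensorflow import keras

# Hypothetical toy model; only the epoch bookkeeping matters here.
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

x = np.random.rand(8, 4).astype("float32")
y = np.random.rand(8, 1).astype("float32")

# After reloading saved weights, resuming with initial_epoch=10 makes the
# epoch counter (and anything keyed on it, e.g. learning-rate schedules)
# continue from 10 instead of restarting at 0. epochs=12 means "train
# until epoch 12", so two more epochs run here.
history = model.fit(x, y, epochs=12, initial_epoch=10, verbose=0)
print(history.epoch)  # two epochs run, numbered 10 and 11
```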
I am trying to implement a simple BLSTM-CTC model in Keras (TensorFlow backend). I am testing it on a small dataset of online handwriting samples (316 training samples covering 10 distinct characters and 4 words). Each sample has 401 timesteps with 16 features per timestep, so the input is a NumPy array of shape [316, 401, 16]. My network is implemented successfully as suggested by this example. My code is as follows
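The code block itself did not survive the copy; as a placeholder, here is a minimal sketch of the BLSTM-CTC wiring the question describes. Layer sizes and names are assumptions, loosely following the Keras image-OCR example the question links to:

```python
from tensorflow import keras
from tensorflow.keras import layers, backend as K

NUM_TIMESTEPS, NUM_FEATURES = 401, 16
NUM_CLASSES = 11  # 10 distinct characters + 1 CTC blank

def ctc_lambda_func(args):
    # Wraps the built-in batched CTC cost. Keras cannot serialize this
    # function automatically, which is the root of the load_model() KeyError.
    y_pred, labels, input_length, label_length = args
    return K.ctc_batch_cost(labels, y_pred, input_length, label_length)

features = layers.Input(shape=(NUM_TIMESTEPS, NUM_FEATURES), name="features")
blstm = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(features)
y_pred = layers.Dense(NUM_CLASSES, activation="softmax", name="softmax")(blstm)

labels = layers.Input(shape=(None,), dtype="float32", name="labels")
input_length = layers.Input(shape=(1,), dtype="int64", name="input_length")
label_length = layers.Input(shape=(1,), dtype="int64", name="label_length")

ctc_loss = layers.Lambda(ctc_lambda_func, output_shape=(1,), name="ctc")(
    [y_pred, labels, input_length, label_length])

model = keras.Model([features, labels, input_length, label_length], ctc_loss)
# The real loss is computed inside the Lambda layer, so compile with a
# dummy loss that simply forwards the Lambda's output.
model.compile(optimizer="adam", loss={"ctc": lambda y_true, y_pred: y_pred})
```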
The network compiles successfully. I am now training it and saving checkpoints with
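The saving snippet is also missing here; a typical per-epoch checkpoint setup, with hypothetical file names and a toy model standing in for the real network, looks like this:

```python
import os
import tempfile

import numpy as np
from tensorflow import keras

# Hypothetical toy model standing in for the BLSTM-CTC network.
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

x = np.random.rand(8, 4).astype("float32")
y = np.random.rand(8, 1).astype("float32")

ckpt_dir = tempfile.mkdtemp()
# One weights file per epoch, e.g. weights.01.weights.h5
ckpt_path = os.path.join(ckpt_dir, "weights.{epoch:02d}.weights.h5")

# Save the weights after every epoch, as the question describes.
checkpoint = keras.callbacks.ModelCheckpoint(ckpt_path, save_weights_only=True)
model.fit(x, y, epochs=2, callbacks=[checkpoint], verbose=0)
print(sorted(os.listdir(ckpt_dir)))  # one checkpoint file per epoch
```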
During training everything works fine and the CTC error decreases as expected. But when I load the model from a previous state, it does not resume from the last saved state. Say I have run 10 epochs
But when I load it again
Clearly something is wrong, because the loaded model does not resume from an error around 10.4657. I also tried saving the whole model with `save()` and loading it with `load_model()`, which gave me the error "KeyError: CTC Lambda Func not found". I am totally in the dark. Is the Lambda layer causing the problem? Please help if possible. Thank you for your time.
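Two things commonly explain these symptoms: restoring only the weights leaves the optimizer state (e.g. Adam's moment estimates) uninitialized, so the loss can jump even though the weights are correct; and `load_model()` fails on custom Lambda layers unless every custom function is passed via `custom_objects`. A common workaround for the Lambda-serialization `KeyError`, sketched here with a placeholder architecture, is to rebuild the model in code and restore only the weights:

```python
import os
import tempfile

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

def build_model():
    # Stand-in for the real BLSTM-CTC builder: reconstructing the graph
    # in code side-steps deserializing the CTC Lambda layer entirely.
    inp = layers.Input(shape=(401, 16))
    h = layers.Bidirectional(layers.LSTM(8, return_sequences=True))(inp)
    out = layers.Dense(11, activation="softmax")(h)
    return keras.Model(inp, out)

weights_path = os.path.join(tempfile.mkdtemp(), "blstm.weights.h5")

model = build_model()
model.save_weights(weights_path)      # what a checkpoint callback produces

restored = build_model()              # identical architecture, fresh weights
restored.load_weights(weights_path)   # now layer-for-layer identical

matches = all(np.array_equal(a, b)
              for a, b in zip(model.get_weights(), restored.get_weights()))
print("weights restored:", matches)
```

After restoring the weights this way, compile the rebuilt model again before resuming training, and use `initial_epoch` so the epoch counter continues where it left off.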