Closed SumukhaNadig closed 4 years ago
Hi, you can set the custom_load parameter to True and provide the path to the saved weights in custom_load_path in the DeepQLearning.cfg file. This will initialize the NN weights from the saved weights.
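For illustration, here is a minimal, stdlib-only sketch of how such a config flag might be read. The section name and the exact .cfg layout below are assumptions for the sake of the example; only the custom_load and custom_load_path key names come from this thread, and the actual config parser in the project may differ.

```python
import configparser

# Hypothetical excerpt of DeepQLearning.cfg. The key names mirror the ones
# discussed in this thread; the section name and layout are illustrative.
cfg_text = """
[simulation_params]
custom_load = True
custom_load_path = models/trained/drone0/drone0_user
"""

cfg = configparser.ConfigParser()
cfg.read_string(cfg_text)

custom_load = cfg.getboolean("simulation_params", "custom_load")
custom_load_path = cfg.get("simulation_params", "custom_load_path")

if custom_load:
    # In the training code, this is where the saved weights would be
    # restored, e.g. saver.restore(sess, custom_load_path) in TF1.
    print(f"Would restore weights from: {custom_load_path}")
```

Note that getboolean accepts True/true/yes/1 interchangeably, so the flag is robust to how it is written in the file.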
Thank you. I just wanted to know which file the weights are stored in. I couldn't see any .npy files, and the files that are there seem to use a different encoding.
The weights are not stored as a .npy file; rather, they are stored as TensorFlow checkpoints.
This might help: https://stackoverflow.com/questions/41265035/tensorflow-why-there-are-3-files-after-saving-the-model
Thank you. Just to confirm: it's enough to have ../../drone0/drone0_user in the load path, right? In the given config file it was ../../drone0/drone0. I also assume I don't need to load the index and meta files explicitly, and that drone0/drone0_user alone will load all three required files.
Also, I wanted to know what DeepReinforce.cfg does; I couldn't find any documentation on it.
Yes, it is enough. If you are using saver.restore(), you don't need to load the other files explicitly.
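To illustrate the point above: a TF1 checkpoint path like ../../drone0/drone0_user is a filename prefix, not an actual file. saver.save() writes several files sharing that prefix, and saver.restore() resolves all of them from the prefix alone. The stdlib-only sketch below just simulates that file layout (the directory and filenames are fabricated for the demo) to show how one prefix maps to the three files.

```python
import glob
import os
import tempfile

# Simulate the files tf.train.Saver would produce for one checkpoint prefix.
# The checkpoint "path" passed to saver.restore() is this shared prefix.
ckpt_dir = tempfile.mkdtemp()
prefix = os.path.join(ckpt_dir, "drone0_user")

for suffix in (".index", ".meta", ".data-00000-of-00001"):
    open(prefix + suffix, "w").close()

# saver.restore(sess, prefix) finds all three files from the prefix alone,
# which is why listing only drone0/drone0_user in the config is sufficient.
matches = sorted(os.path.basename(p) for p in glob.glob(prefix + "*"))
print(matches)
```

This is also why you never see a literal file named drone0_user on disk, only the .index, .meta, and .data-* files carrying that prefix.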
That is a work in progress; I apparently pushed the file to git by mistake. It will implement the Policy Gradient RL algorithm.
Thank you so much! :)
Hi, while training, my system sometimes crashes for unknown reasons, so I was wondering whether there is any way to reload the weights saved at some point during a previous training session instead of starting all over.
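The save-and-resume workflow this question is asking about can be sketched generically. The code below is a stdlib-only stand-in (pickle instead of tf.train.Saver, and all names are illustrative, not from the project): checkpoint the training state every N steps, and on startup load the latest checkpoint if one exists, so a crash only loses the work done since the last save.

```python
import os
import pickle
import tempfile

# Save the training state every N steps so a crash loses at most N steps.
SAVE_EVERY = 5

def train(total_steps, ckpt_path):
    # Resume from the last checkpoint if one exists; otherwise start fresh.
    if os.path.exists(ckpt_path):
        with open(ckpt_path, "rb") as f:
            state = pickle.load(f)
    else:
        state = {"step": 0, "weights": 0.0}

    while state["step"] < total_steps:
        state["step"] += 1
        state["weights"] += 0.1  # stand-in for a gradient update
        if state["step"] % SAVE_EVERY == 0:
            with open(ckpt_path, "wb") as f:
                pickle.dump(state, f)
    return state

ckpt = os.path.join(tempfile.mkdtemp(), "drone0_user.pkl")
state = train(12, ckpt)    # pretend a crash happened after step 12
resumed = train(12, ckpt)  # restart: resumes from step 10 (last save), not 0
print(resumed["step"])     # -> 12
```

In the actual project, the same pattern would use the custom_load / custom_load_path mechanism described earlier in the thread, pointing custom_load_path at the checkpoint prefix saved by the previous run.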