TengliEd opened this issue 6 years ago
The adam file has been uploaded along with notes on how to compile it into the Torch framework. The adam_state function was developed for fine-tuning the trained model with existing Adam parameters, which Torch had not implemented before; it does not affect the training process itself. So you can either incorporate our adam file or simply use the built-in adam.
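For context, here is a minimal sketch of what resuming with a saved Adam state looks like. This is not the repo's exact code: it assumes optim.adam_state takes the same arguments as stock optim.adam, i.e. (opfunc, x, config, state), and that `adam_state.t7` (a hypothetical file name) holds the Adam moments (state.t, state.m, state.v) saved after the initial training.

```lua
-- Sketch only: assumes optim.adam_state mirrors the stock optim.adam interface.
require 'nn'
require 'optim'

local model = nn.Linear(10, 1)                 -- stand-in for the trained network
local criterion = nn.MSECriterion()
local params, gradParams = model:getParameters()
local input, target = torch.randn(10), torch.randn(1)

local adamConfig = { learningRate = 1e-4 }
local adamState  = torch.load('adam_state.t7') -- restored optimizer state (hypothetical file)

local function feval(x)
  gradParams:zero()
  local loss = criterion:forward(model:forward(input), target)
  model:backward(input, criterion:backward(model.output, target))
  return loss, gradParams
end

-- Continues Adam from the saved moments instead of reinitializing them to zero.
optim.adam_state(feval, params, adamConfig, adamState)
```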
Thanks for your question.
Hi, to train the model I followed the instructions step by step: I put the given adam_state.lua in ./torch/pkg/optim, edited init.lua by adding a new line require("optim.adam_state"), then ran make. But when I tried to train the ECNN, I got this error:

```
luajit: ...er/torch/install/share/lua/5.1/optim/init.lua:35: module 'optim.adam_state.lua' not found:
no field package.preload['optim.adam_state.lua'] ...
```
Did I do something wrong?
@yutseng318 You may need to add the new line at the end of the original script.
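For reference, a sketch of what the tail of ./torch/pkg/optim/init.lua should look like after the edit. One thing worth checking: the argument to require() is a module name with no `.lua` extension, while the error above looks up 'optim.adam_state.lua', which Lua's require treats as a different (nonexistent) module.

```lua
-- Sketch of the end of ./torch/pkg/optim/init.lua after the edit.

-- ... existing requires, e.g. require('optim.adam') ...
require('optim.adam_state')   -- new line, appended at the end of the script

return optim
```

After editing, rebuild as in the instructions so the updated package is copied into torch/install/share/lua/5.1/optim/, which is where the error message shows Lua loading it from.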
@TengliEd Thanks
At line 230 of training_reflection_ecnn.lua, optim.adam_state is a nil value when I call it. Do you mean optim.adam?
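If the patched optimizer still fails to load, one stopgap consistent with the first comment (the built-in adam also works when you are not resuming from saved Adam parameters) is a fallback at the call site. A hypothetical guard, with feval, params, adamConfig, and adamState as in the sketch above:

```lua
-- Hypothetical fallback: use the patched optimizer when the optim package
-- exports it, otherwise the stock one. adam_state is only needed when
-- resuming from saved Adam moments.
local adamStep = optim.adam_state or optim.adam
adamStep(feval, params, adamConfig, adamState)
```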