For the function model:getParameters(), here's what the documentation (https://github.com/torch/nn/blob/master/doc/module.md#flatparameters-flatgradparameters-getparameters) says:

> Since the storage of every weight and gradWeight is changed, this function should be called only once on a given network.

However, in train.lua, this function gets called once every epoch. I'm not entirely sure how this affects the model or the results, though. In any case, just putting this out there.
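For reference, here's a minimal sketch of the pattern the documentation implies: hoist the `getParameters()` call out of the epoch loop. The model and loop below are placeholders, not the actual contents of train.lua.

```lua
require 'nn'

local model = nn.Sequential()
model:add(nn.Linear(10, 5))

-- Call once, BEFORE the training loop: this flattens every module's
-- weight/gradWeight storage into two contiguous tensors and rebases the
-- modules onto them.
local params, gradParams = model:getParameters()

local nEpochs = 10  -- placeholder
for epoch = 1, nEpochs do
  -- train using params/gradParams here; do NOT call model:getParameters()
  -- again inside the loop, since each call reallocates the flat storage,
  -- and anything still holding the old tensors (e.g. optimizer state such
  -- as momentum buffers) would then point at stale memory.
end
```

If the optimizer only ever sees the `params`/`gradParams` returned from a single call, its state stays consistent across epochs.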