Closed: jsjason closed this pull request 9 years ago.
@beomyeol, would you kindly take a look at this?
@jsjason I'll take a look at this.
@jsjason This looks good. With this change, we should check isLearnable() when LocalNeuralNetParameterProvider generates parameter updates for each layer. Could you please modify LocalNeuralNetParameterProvider accordingly?
@beomyeol, I did a search and found that NeuralNetwork was the class actually calling Layer.setParameters(). I pushed a commit that fixes that part to use LayerBase.isLearnable(). Please check, thanks.

Also, we have a new issue: Layer.getParameter() is being called in NeuralNetwork.backPropagateFromTo() for back propagation. Adding an isLearnable() check there wasn't trivial, so I left it as is. Let's have an offline discussion about this later.
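For context, the guard described above might look roughly like the sketch below. All class and method names here are illustrative stand-ins, not the actual project API:

```java
// Minimal sketch of the isLearnable() guard discussed above; names are
// illustrative, not the actual dolphin API.
import java.util.ArrayList;
import java.util.List;

public class LearnableGuard {
  interface Layer {
    boolean isLearnable();
    String name();
  }

  // Stand-in for the parameter-update loop: only learnable layers would
  // actually receive a setParameters() call.
  static List<String> updateTargets(Layer[] layers) {
    List<String> updated = new ArrayList<>();
    for (Layer l : layers) {
      if (l.isLearnable()) { // the guard this change adds
        updated.add(l.name());
      }
    }
    return updated;
  }

  public static void main(String[] args) {
    Layer fc = new Layer() {
      public boolean isLearnable() { return true; }
      public String name() { return "fullyConnected"; }
    };
    Layer pool = new Layer() {
      public boolean isLearnable() { return false; }
      public String name() { return "pooling"; }
    };
    // Only the learnable layer is selected for an update.
    System.out.println(updateTargets(new Layer[]{fc, pool})); // [fullyConnected]
  }
}
```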
@jsjason Layer parameters are needed to compute error gradients for the fully connected layer, so we need an offline discussion on this issue. Other things are fine.
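As context for this point: propagating the error backward through a fully connected layer requires its weight matrix, which is why the backward pass cannot simply skip getParameter(). A minimal sketch (illustrative names, not the actual project code):

```java
// Sketch: back-propagation through a fully connected layer needs the
// layer's weights W, since the previous layer's error is W^T * delta.
public class FcBackprop {
  /** deltaPrev[j] = sum_i w[i][j] * delta[i] */
  static double[] backPropagate(double[][] w, double[] delta) {
    int in = w[0].length;
    double[] deltaPrev = new double[in];
    for (int i = 0; i < w.length; ++i) {
      for (int j = 0; j < in; ++j) {
        deltaPrev[j] += w[i][j] * delta[i];
      }
    }
    return deltaPrev;
  }

  public static void main(String[] args) {
    double[][] w = {{1.0, 2.0}, {3.0, 4.0}}; // 2 outputs, 2 inputs
    double[] delta = {1.0, 1.0};             // error from the next layer
    double[] prev = backPropagate(w, delta);
    System.out.println(prev[0] + " " + prev[1]); // 4.0 6.0
  }
}
```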
Good.
@beomyeol and I discussed offline. We decided to tackle the remaining issue (the getParameter() call in NeuralNetwork.backPropagateFromTo()) in another PR. In the meantime, this PR is sufficient for FullyConnectedLayers, so @beomyeol agreed to merge it.
This PR changes the Layer interface to an abstract class, minimizing the code that subclasses need to write. It uses a single abstract class to handle both 'LearnableLayers' and 'UnlearnableLayers'. As a result, 'UnlearnableLayers' will have an unused layerParameter field, but they need not care about it. Closes #109.
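A minimal sketch of the design described above, with illustrative names rather than the actual project classes: one abstract base class serves both kinds of layers, and unlearnable subclasses simply ignore the inherited field.

```java
// Sketch of the single-abstract-class design: unlearnable layers inherit
// layerParameter but never use it, and write no extra code for it.
public class LayerHierarchy {
  abstract static class LayerBase {
    protected float[] layerParameter; // unused by unlearnable layers

    public abstract boolean isLearnable();

    // Shared by all subclasses; callers are expected to check
    // isLearnable() before invoking this on a layer.
    public void setParameters(float[] parameter) {
      this.layerParameter = parameter;
    }
  }

  static final class FullyConnectedLayer extends LayerBase {
    public boolean isLearnable() { return true; }
  }

  static final class PoolingLayer extends LayerBase {
    public boolean isLearnable() { return false; } // ignores layerParameter
  }

  public static void main(String[] args) {
    System.out.println(new FullyConnectedLayer().isLearnable()); // true
    System.out.println(new PoolingLayer().isLearnable());        // false
  }
}
```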