snuspl / dolphin


Change interface Layer to abstract class LayerBase #111

Closed jsjason closed 9 years ago

jsjason commented 9 years ago

This PR changes the Layer interface to an abstract class, minimizing the code that subclasses need to write. It uses a single abstract class to handle both 'LearnableLayers' and 'UnlearnableLayers'. As a result, 'UnlearnableLayers' will have an unused layerParameter field, but they can simply ignore it.

Closes #109.
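A minimal sketch of the design described above, assuming names from this thread (LayerBase, layerParameter, isLearnable); the actual Dolphin signatures may differ:

```java
// Sketch: a single abstract class covers learnable and unlearnable layers.
abstract class LayerBase {
  // Unlearnable layers inherit this field but never touch it.
  protected float[] layerParameter;

  // Unlearnable layers get this default for free.
  public boolean isLearnable() {
    return false;
  }

  public float[] getLayerParameter() {
    return layerParameter;
  }
}

// A learnable layer: carries parameters and reports isLearnable() == true.
class FullyConnectedLayer extends LayerBase {
  FullyConnectedLayer(final float[] weights) {
    this.layerParameter = weights;
  }

  @Override
  public boolean isLearnable() {
    return true;
  }
}

// An unlearnable layer: writes no extra code for the unused parameter field.
class PoolingLayer extends LayerBase {
}
```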

jsjason commented 9 years ago

@beomyeol, would you kindly take a look at this?

beomyeol commented 9 years ago

@jsjason I'll take a look at this.

beomyeol commented 9 years ago

@jsjason This looks good. With this change, we should check isLearnable() when LocalNeuralNetParameterProvider generates parameter updates for each layer. Could you please modify LocalNeuralNetParameterProvider accordingly?

jsjason commented 9 years ago

@beomyeol, I did a search and found that NeuralNetwork was the class actually calling Layer.setParameters(). I pushed a commit that fixes that part to use LayerBase.isLearnable(). Please check, thanks.
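A sketch of the control flow in that commit, with stand-in class names since I'm only illustrating the guard (the real classes are NeuralNetwork and LayerBase):

```java
import java.util.List;

// Stand-in for LayerBase, just enough to show the guard.
abstract class SketchLayer {
  protected float[] layerParameter;

  public boolean isLearnable() {
    return false;
  }

  public void setLayerParameter(final float[] newParameter) {
    this.layerParameter = newParameter;
  }
}

// Stand-in for NeuralNetwork: only learnable layers receive parameter
// updates; unlearnable layers are skipped instead of writing into their
// unused field. Returns the number of layers updated, for illustration.
final class SketchNetwork {
  private final List<SketchLayer> layers;

  SketchNetwork(final List<SketchLayer> layers) {
    this.layers = layers;
  }

  int setParameters(final float[][] parameters) {
    int updated = 0;
    for (int i = 0; i < layers.size(); ++i) {
      if (layers.get(i).isLearnable()) {
        layers.get(i).setLayerParameter(parameters[i]);
        updated++;
      }
    }
    return updated;
  }
}
```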

Also, we have a new issue: Layer.getParameter() is called in NeuralNetwork.backPropagateFromTo() for back propagation. Adding an isLearnable() check there wasn't trivial, so I left it as is. Let's have an offline discussion about this later.

beomyeol commented 9 years ago

@jsjason Layer parameters are needed to compute error gradients for the fully connected layer, so we need an offline discussion on this issue.

Other things are fine.
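For context on why the backward pass must read the parameters: in a fully connected layer, the error passed to the previous layer is W transposed times delta, so back propagation needs the weights themselves. A standalone sketch (the array-based shapes are my assumption, not Dolphin's actual types):

```java
final class BackPropSketch {
  // prevError[i] = sum over o of weights[o][i] * delta[o], i.e. W^T * delta.
  // This is why backPropagateFromTo() must fetch a fully connected layer's
  // parameters, regardless of how the isLearnable() check is arranged.
  static float[] backPropagate(final float[][] weights, final float[] delta) {
    final int numInputs = weights[0].length;  // weights is [outputs][inputs]
    final float[] prevError = new float[numInputs];
    for (int o = 0; o < weights.length; ++o) {
      for (int i = 0; i < numInputs; ++i) {
        prevError[i] += weights[o][i] * delta[o];
      }
    }
    return prevError;
  }
}
```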

dongjoon-hyun commented 9 years ago

Good.

jsjason commented 9 years ago

@beomyeol and I discussed offline. We decided to tackle the remaining issue (the getParameter() call in NeuralNetwork.backPropagateFromTo()) in a separate PR. In the meantime, this PR is sufficient for FullyConnectedLayers, so @beomyeol agreed to merge it.