nitishsrivastava / deepnet

Implementation of some deep learning algorithms.
BSD 3-Clause "New" or "Revised" License

Linear Layer error positive phase #23

Closed rdevon closed 11 years ago

rdevon commented 11 years ago

Train Step: 0
Traceback (most recent call last):
  File "../../../deepnet/deepnet/trainer.py", line 56, in <module>
    main()
  File "../../../deepnet/deepnet/trainer.py", line 51, in main
    model.Train()
  File "/na/homes/dhjelm/CUDANET/deepnet/deepnet/neuralnet.py", line 646, in Train
    losses = self.TrainOneBatch(step)
  File "/na/homes/dhjelm/CUDANET/deepnet/deepnet/dbm.py", line 270, in TrainOneBatch
    losses1 = self.PositivePhase(train=True, step=step)
  File "/na/homes/dhjelm/CUDANET/deepnet/deepnet/dbm.py", line 131, in PositivePhase
    self.ComputeUp(node, train=train)
  File "/na/homes/dhjelm/CUDANET/deepnet/deepnet/dbm.py", line 108, in ComputeUp
    if layer.pos_phase:
AttributeError: 'LinearLayer' object has no attribute 'pos_phase'
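The crash happens because ComputeUp reads layer.pos_phase on a LinearLayer instance that never had that attribute set. A minimal reproduction of the failure mode, plus a common defensive pattern (getattr with a default) — this is a sketch with a stand-in class, not deepnet's actual fix:

```python
class LinearLayer:
    """Minimal stand-in for the real class: pos_phase is never set."""
    pass

layer = LinearLayer()

# Direct attribute access raises exactly the error in the traceback:
try:
    if layer.pos_phase:
        pass
except AttributeError as e:
    print(e)  # 'LinearLayer' object has no attribute 'pos_phase'

# Defensive read: treat a missing pos_phase flag as False.
if getattr(layer, 'pos_phase', False):
    pass
```

The actual fix in the repo presumably initializes the attribute on the base layer class instead, so every layer type carries the flag.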

ghost commented 11 years ago

fixed now.

rdevon commented 11 years ago

Hey Nitish, sorry another question.

I'm slowly digging deeper into the code. Does l1decay need to be negative? On the surface it appears so, but I'm not sure what "edge.temp" holds at that point (I'm worried it's a catch-all variable). I assume (the CUDA code isn't clear on this detail) that if there is a target, the sign is relative to that. Why would I need to decay to anything but 0, and under what situations would I expect that to happen here?

thanks

-devon
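For context on the sign question: with the standard L1 penalty, the decay term subtracts l1_decay * sign(w) from each weight, so a positive coefficient already shrinks weights toward 0 (or toward a target, if one is given). A small sketch under that convention — NumPy, hypothetical names, not deepnet's actual edge.temp bookkeeping:

```python
import numpy as np

def apply_l1_decay(weights, l1_decay, learning_rate, target=0.0):
    """One L1 decay step: shrink each weight toward `target`.

    The penalty l1_decay * |w - target| has subgradient
    l1_decay * sign(w - target), which gradient descent subtracts,
    so a positive l1_decay moves weights toward the target from
    either side; a weight already at the target stays put.
    """
    return weights - learning_rate * l1_decay * np.sign(weights - target)

w = np.array([0.5, -0.5, 0.0])
print(apply_l1_decay(w, l1_decay=0.1, learning_rate=1.0))  # [ 0.4 -0.4  0. ]
```

Under this convention a negative l1_decay would push weights away from the target rather than toward it, which is why the sign in the update code matters.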

On Apr 3, 2013, at 3:22 AM, Nitish Srivastava notifications@github.com wrote:

fixed now.