Closed by dnth 9 years ago
I'm going to go ahead and close this issue -- in my opinion the HF optimizer is dead, as more recent techniques (RMSProp, ESGD, Adam) have surpassed its performance in terms of both speed and accuracy. The HF code is also not compatible with Python 3.
The HF optimizer does not work when run on xor.py in the examples folder.
Below is the error returned by the interpreter:

```
Traceback (most recent call last):
  File "/home/camaro/workspace/theanets/xor.py", line 14, in <module>
    e.train([X, Y], optimize='hf', patience=5000, batchsize=4)
  File "/home/camaro/theanets/theanets/main.py", line 258, in train
    for _ in self.itertrain(*args, **kwargs):
  File "/home/camaro/theanets/theanets/main.py", line 321, in itertrain
    opt = self.create_trainer(opt, **kwargs)
  File "/home/camaro/theanets/theanets/main.py", line 213, in create_trainer
    return factory(*args, **kw)
  File "/home/camaro/theanets/theanets/trainer.py", line 681, in __init__
    None)
  File "/tmp/hf.py", line 66, in __init__
    Gv = gauss_newton_product(costs[0], p, v, s)
  File "/tmp/hf.py", line 14, in gauss_newton_product
    Jv = T.Rop(s, p, v)
  File "/home/camaro/Theano/theano/gradient.py", line 292, in Rop
    elif seen_nodes[out.owner][out.owner.outputs.index(out)] is None:
KeyError: None
```
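For context on what is failing: the `gauss_newton_product` in hf.py uses Theano's R-operator (`T.Rop`) to form the Gauss-Newton matrix-vector product Gv = JᵀHJv, where J is the Jacobian of the network outputs with respect to the parameters and H is the Hessian of the loss with respect to the outputs. Here is a minimal NumPy sketch of that product, with explicit matrices standing in for the symbolic `Rop`/`Lop` calls; the function name mirrors hf.py but the arguments and setup are illustrative assumptions, not the actual theanets code.

```python
import numpy as np

def gauss_newton_product(jac, hess_out, v):
    # G v = J^T H (J v): the curvature-vector product used by HF.
    # jac:      Jacobian of model outputs w.r.t. parameters (n_out x n_params)
    # hess_out: Hessian of the loss w.r.t. the outputs (identity for squared error)
    # v:        the vector being multiplied (n_params,)
    return jac.T @ (hess_out @ (jac @ v))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))   # linear model outputs = X @ p, so J = X
v = rng.normal(size=3)
H = np.eye(5)                 # squared-error loss => H is the identity

Gv = gauss_newton_product(X, H, v)
print(np.allclose(Gv, X.T @ X @ v))  # for a linear model, G = X^T X
```

In the symbolic version, `J v` is computed without materializing J at all, via the forward-mode R-operator; the `KeyError` above is raised inside that `Rop` traversal.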