lmjohns3 / theanets

Neural network toolkit for Python
http://theanets.rtfd.org
MIT License

RNN not working #53

Closed. majidaldo closed this issue 9 years ago.

majidaldo commented 9 years ago

Commit 2e4d72557ca19a60141a53d298ac03ba29477088 broke it; training an RNN now fails with the traceback below.

```
C:\Anaconda\lib\site-packages\theano\tensor\subtensor.py:114: FutureWarning: comparison to `None` will result in an elementwise object comparison in the future.
  stop in [None, length, maxsize] or
C:\Anaconda\lib\site-packages\theano\scan_module\scan_perform_ext.py:85: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility
  from scan_perform.scan_perform import *
I 2014-12-16 14:11:01 theanets.trainer:142 compiling RmsProp learning function
Traceback (most recent call last):
  File "testrnn.py", line 65, in <module>
    xp.train(ecgb_trn, ecgb_val)
  File "c:\users\majid\documents\github\theano-nets\theanets\main.py", line 252, in train
    for _ in self.itertrain(*args, **kwargs):
  File "c:\users\majid\documents\github\theano-nets\theanets\main.py", line 315, in itertrain
    opt = self.create_trainer(opt, **kwargs)
  File "c:\users\majid\documents\github\theano-nets\theanets\main.py", line 207, in create_trainer
    return factory(*args, **kw)
  File "c:\users\majid\documents\github\theano-nets\theanets\trainer.py", line 354, in __init__
    super(RmsProp, self).__init__(network, **kwargs)
  File "c:\users\majid\documents\github\theano-nets\theanets\trainer.py", line 146, in __init__
    updates=list(network.updates) + list(self.learning_updates()))
  File "C:\Anaconda\lib\site-packages\theano\compile\function.py", line 223, in function
    profile=profile)
  File "C:\Anaconda\lib\site-packages\theano\compile\pfunc.py", line 490, in pfunc
    no_default_updates=no_default_updates)
  File "C:\Anaconda\lib\site-packages\theano\compile\pfunc.py", line 217, in rebuild_collect_shared
    raise TypeError(err_msg, err_sug)
TypeError: ('An update must have the same type as the original shared variable (shared_var=W_xh_0_g1, shared_var.type=CudaNdarrayType(float32, matrix), update_val=Elemwise{add,no_inplace}.0, update_val.type=TensorType(float64, matrix)).', 'If the difference is related to the broadcast pattern, you can call the tensor.unbroadcast(var, axis_to_unbroadcast[, ...]) function to remove broadcastable dimensions.')
```
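For context: the TypeError says the RmsProp update expression for the float32 shared variable `W_xh_0_g1` evaluates to float64. A common cause of this class of error is a float64 value (for example, an accumulator or constant created without an explicit dtype) leaking into the update graph and promoting the whole expression. The Theano internals aren't shown here, but the promotion behavior itself can be sketched in plain NumPy; the variable names `w` and `acc` are illustrative, not from theanets:

```python
import numpy as np

# A float32 parameter, standing in for the W_xh_0_g1 shared variable
# from the traceback.
w = np.zeros((2, 2), dtype=np.float32)

# A float64 quantity entering the update expression -- e.g. an
# accumulator allocated without an explicit dtype defaults to float64.
acc = np.ones((2, 2))
assert acc.dtype == np.float64

# Mixing float32 and float64 promotes the whole update to float64:
# the same float32-vs-float64 mismatch the Theano error reports.
bad_update = w - 0.01 * acc
assert bad_update.dtype == np.float64

# Keeping every operand at float32 (or casting the result back)
# keeps the update type consistent with the parameter.
good_update = (w - 0.01 * acc).astype(np.float32)
assert good_update.dtype == np.float32
```

When running on a GPU (CudaNdarrayType implies one here), Theano requires every update to stay float32, so a likely workaround is setting `floatX = float32` in the Theano configuration and ensuring any accumulators or constants in the offending commit are created with that dtype.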