tensorflow / fold

Deep learning with dynamic computation graphs in TensorFlow

Gradient and Adam Optimiser error #67

Open akigmm opened 7 years ago

akigmm commented 7 years ago

I am developing a neural network to analyze around 100,000 sequences and I am getting the following error:

raise type(e)(node_def, op, message)

InvalidArgumentError: Input 0 of node Adam/update_Variable_67/ApplyAdam was passed float from _recv_Variable_67_0:0 incompatible with expected float_ref.

Since I am new to TensorFlow, I do not understand what this error means or how to fix it.
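
For reference, the training setup is structured roughly like the following plain-TensorFlow 1.x sketch (layer sizes, placeholder names, and learning rate are illustrative stand-ins, not the actual Fold model); the failure is raised when `session.run()` executes the Adam update ops:

```python
import numpy as np
import tensorflow as tf

# Simplified stand-in for the sequence model (placeholder shapes and names,
# not the actual TensorFlow Fold graph).
inputs = tf.placeholder(tf.float32, [None, 10], name="sequence_features")
targets = tf.placeholder(tf.float32, [None, 1], name="targets")

hidden = tf.layers.dense(inputs, 32, activation=tf.nn.relu)
outputs = tf.layers.dense(hidden, 1)
loss = tf.reduce_mean(tf.square(outputs - targets))

# The ApplyAdam ops created by minimize() are the kind of node named in the
# "passed float ... incompatible with expected float_ref" error, which shows
# up when session.run() executes the update.
train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    x = np.random.rand(8, 10).astype(np.float32)
    y = np.random.rand(8, 1).astype(np.float32)
    _, loss_value = sess.run([train_op, loss],
                             feed_dict={inputs: x, targets: y})
    print(loss_value)
```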