02 Problem with CudaNdarray in the proposed solution.
Traceback (most recent call last):
  File "ex_02_detect_negative.py", line 63, in <module>
    f(0.)
  File "/home/danielcanelhas/workspace/Theano/theano/compile/function_module.py", line 588, in __call__
    outputs = self.fn()
  File "/home/danielcanelhas/workspace/Theano/theano/gof/link.py", line 761, in f
    raise_with_op(node, thunk)
  File "/home/danielcanelhas/workspace/Theano/theano/gof/link.py", line 759, in f
    wrapper(i, node, *thunks)
  File "/home/danielcanelhas/workspace/Theano/theano/gof/link.py", line 774, in wrapper
    f(*args)
  File "ex_02_detect_negative.py", line 41, in neg_check
    do_check_on(x, node, fn)
  File "ex_02_detect_negative.py", line 32, in do_check_on
    if var.min() < 0:
AttributeError: 'CudaNdarray' object has no attribute 'min'
Apply node that caused the error: GpuFromHost(x)
Inputs types: [TensorType(float32, scalar)]
Inputs shapes: ['No shapes']
Inputs strides: ['No strides']
Inputs scalar values: ['not scalar']
HINT: Re-running with most Theano optimization disabled could give you a back-traces when this node was created. This can be done with by setting the Theano flags optimizer=fast_compile
HINT: Use the Theano flag 'exception_verbosity=high' for a debugprint of this apply node.
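The check itself is what fails: var.min() works on a numpy.ndarray, but CudaNdarray exposes no min() method. A possible workaround, sketched below, is to copy the value back to a host array before the comparison (the do_check_on signature follows the exercise; the numpy.asarray conversion is my assumption, not a verified fix):

import numpy as np

def do_check_on(var, node, fn):
    # Hypothetical adjustment: CudaNdarray has no .min(), so copy the value
    # back to host memory as a numpy.ndarray before checking its sign.
    value = np.asarray(var)
    if value.min() < 0:
        print("negative value detected in the output of:", node)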
03 Problem with GpuSoftmax in the proposed solution.
A workaround may be to check whether the string representation of the fgraph contains "Softmax", though I'm not sure how much you will like it:
s = str(f.maker.fgraph)
if "Softmax" in s:
    return True
return False
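For context, here is a minimal sketch of how that string-based check could be used end to end (the uses_softmax wrapper and the small test graph are mine, just for illustration):

import theano
import theano.tensor as T

def uses_softmax(f):
    # Crude test: look for "Softmax" in the textual dump of the compiled graph.
    return "Softmax" in str(f.maker.fgraph)

x = T.matrix("x")
f = theano.function([x], T.nnet.softmax(x))
print(uses_softmax(f))  # expected: True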
Alternatively:
import theano.tensor as T
import theano.sandbox.cuda as CUDA
...
if isinstance(app.op, T.nnet.Softmax) or isinstance(app.op, CUDA.nnet.GpuSoftmax):
    return True
return False
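As with the previous snippet, here is a sketch of where that isinstance test would typically sit, walking the apply nodes of the compiled graph (the graph_has_softmax wrapper and the toposort() loop are my assumption about how app is obtained):

import theano.tensor as T
import theano.sandbox.cuda as CUDA

def graph_has_softmax(f):
    # Walk the apply nodes of the optimized graph and test each op's type,
    # covering both the CPU Softmax op and its GPU counterpart.
    for app in f.maker.fgraph.toposort():
        if isinstance(app.op, (T.nnet.Softmax, CUDA.nnet.GpuSoftmax)):
            return True
    return False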