Open mattm401 opened 7 years ago
Tried running with `core.DeviceScope(device)` set to CPU and it produced the same output, so this is potentially unrelated to the GPU.
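Roughly, the CPU attempt looked like the sketch below. This is only a sketch: `img` is the preprocessed input image from the tutorial, and the fallback of `FeedBlob` to the current device scope is my understanding of the API.

```python
from caffe2.python import core, workspace
from caffe2.proto import caffe2_pb2

# Sketch of the CPU attempt: same code path, but with a CPU DeviceOption.
# `img` is the preprocessed input image from the tutorial.
device = core.DeviceOption(caffe2_pb2.CPU, 0)
with core.DeviceScope(device):
    # FeedBlob falls back to the current device scope when no explicit
    # device_option is passed (my understanding of the API).
    workspace.FeedBlob('data', img)
```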
Okay, I was able to capture the error:
```
ERROR main Incompatible constructor arguments. The following argument types are supported:
```
Running in CPU mode, the output is still the same, but I cannot capture the error...
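For anyone else hitting this pybind error: it may or may not be the cause here, but the tutorial itself passes the raw bytes read from the two .pb files to `workspace.Predictor`, not parsed `NetDef` protos, so the argument types are worth checking. A minimal sketch of the tutorial-style call, where `INIT_NET` and `PREDICT_NET` are the same paths as in the original post:

```python
from caffe2.python import workspace

# Tutorial-style Predictor construction: pass the serialized protobuf bytes
# of the two nets rather than parsed NetDef objects.
with open(INIT_NET, 'rb') as f:
    init_net_bytes = f.read()
with open(PREDICT_NET, 'rb') as f:
    predict_net_bytes = f.read()

p = workspace.Predictor(init_net_bytes, predict_net_bytes)
```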
Worked this out in CPU mode: I scrapped the .pb files I had and re-pulled them from the GitHub models repo (and updated init to exec in your code). However, I get an error when switching to GPU/CUDA:
`blob->template IsType<TensorCPU>(). Blob is not a CPU Tensor: data`
Does this suggest that Predictor only works on the CPU?
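For what it's worth, the error suggests the Predictor bindings expect CPU tensors. A workaround that avoids Predictor entirely is to drive the CUDA-tagged nets directly through the workspace. A rough sketch, reusing `device_opts`, `init_def`, `net_def`, and `img` from the tutorial code in the original post:

```python
from caffe2.python import workspace

# Rough sketch of a Predictor-free run; device_opts/init_def/net_def/img are
# the same objects as in the tutorial code from the original post.
workspace.FeedBlob('data', img, device_option=device_opts)    # feed the input on the GPU
workspace.RunNetOnce(init_def.SerializeToString())            # load weights (net carries CUDA device options)
workspace.CreateNet(net_def.SerializeToString(), overwrite=True)
workspace.RunNet(net_def.name)                                # run the predict net by name
result = workspace.FetchBlob(net_def.external_output[-1])     # copy the output back to host memory
```

`FetchBlob` copies the result back to host memory, so the returned array can be inspected with NumPy just as in the CPU path.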
Hi @mattm401, were you able to solve the "Blob is not a CPU Tensor: data" issue?
I'm facing the same situation
I haven't had a chance to look into this particular issue. I ended up getting my application running in CPU mode, and speed hasn't been a problem for my use case.
Similar to #799, I am trying to run through the Loading Pre-Trained Models tutorial with GPU/CUDA on Windows 10: https://caffe2.ai/docs/tutorial-loading-pre-trained-models.html
The following code executes:
```python
device_opts = core.DeviceOption(caffe2_pb2.CUDA, 0)
workspace.FeedBlob('data', img, device_option=device_opts)

init_def = caffe2_pb2.NetDef()
with open(INIT_NET, 'rb') as f:
    init_def.ParseFromString(f.read())
    init_def.device_option.CopyFrom(device_opts)
    workspace.RunNetOnce(init_def.SerializeToString())

net_def = caffe2_pb2.NetDef()
with open(PREDICT_NET, 'rb') as f:
    net_def.ParseFromString(f.read())
    net_def.device_option.CopyFrom(device_opts)
    workspace.CreateNet(net_def.SerializeToString())

print 'Running net...'
p = workspace.Predictor(init_def, net_def)
```
However, the system outputs a bunch of information about the model/predictor and then immediately exits, without running the rest of the code or providing any indication of why it exited. Has anyone seen this behavior before?
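In case it helps with debugging the silent exit, one hypothetical approach is to turn up Caffe2's logging and wrap the inference call so that any Python-level failure is at least printed. This is only a sketch: it assumes `p` and `img` are the Predictor and preprocessed image from the code above, that `GlobalInit` is called at the very start of the script, and that the usual Caffe2/GLOG flags are available in the Windows build.

```python
from caffe2.python import workspace

# Hypothetical debugging sketch. GlobalInit should run before anything else
# touches the workspace; --caffe2_log_level=0 enables verbose logging so a
# hard exit at least leaves a message behind (flag availability depends on
# how the Windows build was configured).
workspace.GlobalInit(['caffe2', '--caffe2_log_level=0'])

try:
    results = p.run([img])                    # the tutorial's inference call
    print 'output shape:', results[0].shape
except Exception as e:
    print 'Predictor.run failed:', e
```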