EvGe22 opened this issue 6 years ago
You may need to change the batch size of the placeholder node to be consistent with what you provide to trt.create_inference_graph. This can be done at the GraphDef level.
Could you try the following before calling trt.create_inference_graph:
for node in frozen_graph.node:
    if 'Placeholder' in node.op:
        node.attr['shape'].shape.dim[0].size = 12  # or whatever you set max_batch_size to
Thanks!
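For anyone who wants to see the patching loop end to end without a real frozen graph on hand, here is a minimal, TensorFlow-free sketch. The dataclasses below are stand-ins for the relevant slice of tf.GraphDef / NodeDef / TensorShapeProto (the field names mirror the protobufs); with an actual frozen graph, the loop body is the same as in the comment above.

```python
from dataclasses import dataclass
from typing import Dict, List

# Stand-ins mimicking the protobuf structure of a frozen tf.GraphDef.
@dataclass
class Dim:          # TensorShapeProto.Dim
    size: int

@dataclass
class Shape:        # TensorShapeProto
    dim: List[Dim]

@dataclass
class ShapeAttr:    # AttrValue holding a `shape` field
    shape: Shape

@dataclass
class Node:         # NodeDef
    name: str
    op: str
    attr: Dict[str, ShapeAttr]

@dataclass
class GraphDef:
    node: List[Node]

# A toy graph: a placeholder frozen at batch size 1 (-1 marks unknown dims).
frozen_graph = GraphDef(node=[
    Node('input', 'Placeholder',
         {'shape': ShapeAttr(Shape([Dim(1), Dim(-1), Dim(-1), Dim(3)]))}),
    Node('conv1', 'Conv2D', {}),
])

MAX_BATCH_SIZE = 24  # keep consistent with max_batch_size in trt.create_inference_graph

# The same loop as suggested above: rewrite dim 0 of every placeholder.
for node in frozen_graph.node:
    if 'Placeholder' in node.op:
        node.attr['shape'].shape.dim[0].size = MAX_BATCH_SIZE

print(frozen_graph.node[0].attr['shape'].shape.dim[0].size)  # 24
```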
Yeah, I figured it out eventually. Thanks!
I actually changed the shape of the placeholder in the build_detection_graph function.
Great. I will work on adding this feature in.
I am running a detection example with ssd_inception_v2 and I changed max_batch_size to 24, but when I try to compute a batch of any size other than 1 I get this error:

ValueError: Cannot feed value of shape (12, 300, 300, 3) for Tensor 'input:0', which has shape '(1, ?, ?, 3)'
Is there anything else that needs to be changed?
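The error itself is just TensorFlow's feed-shape check rejecting the mismatch: each known dimension of the placeholder must equal the corresponding dimension of the fed array, and only unknown (None/?) dimensions act as wildcards. A small illustrative check (the function name is made up for this sketch, not a TensorFlow API) shows why a batch of 12 fails against a placeholder still frozen at (1, ?, ?, 3):

```python
def feed_compatible(placeholder_shape, feed_shape):
    """Mimic TF's feed check: known dims must match, None dims are wildcards."""
    if len(placeholder_shape) != len(feed_shape):
        return False
    return all(p is None or p == f
               for p, f in zip(placeholder_shape, feed_shape))

# Placeholder still frozen at batch 1 -> feeding a batch of 12 is rejected,
# which is exactly the ValueError quoted above.
print(feed_compatible((1, None, None, 3), (12, 300, 300, 3)))   # False

# After patching dim 0 to match the batch you feed, the check passes.
print(feed_compatible((12, None, None, 3), (12, 300, 300, 3)))  # True
```

So on top of passing max_batch_size to trt.create_inference_graph, the placeholder's batch dimension in the GraphDef has to be updated as described earlier in the thread.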