NVIDIA-AI-IOT / tf_trt_models

TensorFlow models accelerated with NVIDIA TensorRT
BSD 3-Clause "New" or "Revised" License

Unable to change batch_size #16

Open EvGe22 opened 6 years ago

EvGe22 commented 6 years ago

I am running the detection example with ssd_inception_v2 and have changed max_batch_size to 24, but when I try to run inference on a batch of any size other than 1, I get this error:

    ValueError: Cannot feed value of shape (12, 300, 300, 3) for Tensor 'input:0', which has shape '(1, ?, ?, 3)'

Is there anything else that needs to be changed?
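For reference, I'm feeding the batch roughly like this (simplified; the tensor names are just how I fetched them on my end):

    import numpy as np

    batch = np.zeros((12, 300, 300, 3), dtype=np.float32)  # a batch of 12 images
    scores, boxes, classes = tf_sess.run(
        [tf_scores, tf_boxes, tf_classes],
        feed_dict={tf_input: batch}  # fails: 'input:0' still has batch dim 1
    )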

ghost commented 6 years ago

You may need to change the batch size of the placeholder node to be consistent with what you provide to trt.create_inference_graph. This can be done at the GraphDef level.

Could you try the following before calling trt.create_inference_graph:

for node in frozen_graph.node:
    if 'Placeholder' in node.op:
        # Fix the placeholder's batch dimension so it matches the batches
        # you feed at inference time (and your max_batch_size setting).
        node.attr['shape'].shape.dim[0].size = 12  # or whatever you set max_batch_size to
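The conversion call should then line up with the patched shape. A rough sketch (the output names, workspace size, and precision mode here are illustrative, not required values):

    from tensorflow.contrib import tensorrt as trt

    trt_graph = trt.create_inference_graph(
        input_graph_def=frozen_graph,      # the patched GraphDef from above
        outputs=output_names,              # your model's output node names
        max_batch_size=12,                 # consistent with the placeholder batch dim
        max_workspace_size_bytes=1 << 25,
        precision_mode='FP16'
    )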

Thanks!

EvGe22 commented 6 years ago

Yeah, I figured it out eventually, thanks! I actually changed the shape of the placeholder in the build_detection_graph function.
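For anyone else hitting this, the change amounts to something like the following (a sketch; the actual variable names inside build_detection_graph may differ):

    import tensorflow as tf

    # Inside build_detection_graph: give the input placeholder a fixed
    # batch dimension instead of the hard-coded 1.
    batch_size = 24
    tf_input = tf.placeholder(tf.float32, [batch_size, None, None, 3], name='input')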

ghost commented 6 years ago

Great. I will work on adding this feature in.