NVIDIA-AI-IOT / tf_trt_models

TensorFlow models accelerated with NVIDIA TensorRT
BSD 3-Clause "New" or "Revised" License

jetson nano not enough memory #85

Open giahuytn opened 3 years ago

giahuytn commented 3 years ago

Hello all, I tried to run detection.ipynb on a Jetson Nano model B1 (JetPack 4.5 b129, TF 1.15). It threw an out-of-memory error when creating the inference graph:

# TF 1.x TF-TRT API used by the notebook
import tensorflow.contrib.tensorrt as trt

trt_graph = trt.create_inference_graph(
    input_graph_def=frozen_graph,      # frozen detection graph built earlier in the notebook
    outputs=output_names,
    max_batch_size=1,
    max_workspace_size_bytes=1 << 25,  # 32 MB workspace per engine
    precision_mode='FP16',
    minimum_segment_size=50
)

How can I fix this? Thanks.
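
For reference, a mitigation often suggested on memory-constrained boards like the Nano (not part of the original notebook) is to keep TensorFlow from pre-allocating most of the shared 4 GB before the TensorRT engines are built. A minimal sketch, assuming the trt_graph produced above; the memory fraction is an illustrative value:

import tensorflow as tf

# Illustrative settings: cap TensorFlow's GPU allocation so TensorRT
# has headroom in the Nano's shared memory.
gpu_options = tf.GPUOptions(
    per_process_gpu_memory_fraction=0.5,  # illustrative; tune for your model
    allow_growth=True                     # allocate memory lazily instead of all at once
)
tf_config = tf.ConfigProto(gpu_options=gpu_options)

# Use this config for the session that imports and runs trt_graph.
with tf.Session(config=tf_config) as tf_sess:
    tf.import_graph_def(trt_graph, name='')
    # ... run detection as in the rest of detection.ipynb ...

People also report that adding swap and lowering max_workspace_size_bytes (or raising minimum_segment_size) reduces peak memory during conversion on the Nano, at the cost of fewer or smaller TensorRT segments.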