NVIDIA-AI-IOT / nanoowl

A project that optimizes OWL-ViT for real-time inference with NVIDIA TensorRT.
Apache License 2.0

Torch2TRT not being found #14

Closed. costasvav closed this issue 7 months ago.

costasvav commented 7 months ago

I have done a fresh install of JetPack 5.1.2 on my Seeed Orin NX 8GB with a J401 carrier board, and installed Torch 2.1.0, Torchvision 0.16, and torch2trt. I can import torch2trt in Python 3 when I first start up the device. However, when I attempt to start NanoOWL, it cannot find torch2trt (ModuleNotFoundError: No module named 'torch2trt'). The command that triggers the error is predictor = OwlPredictor(args.model, image_encoder_engine=args.image_encoder_engine). After I call this, torch2trt can no longer be imported in Python 3, even though it could be before. Is OwlPredictor somehow changing the library search paths? Thoughts?
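
For context, here is a minimal sketch of the kind of startup script I am running. The import path, flag names, and default values are my best reconstruction and may differ from the actual demo script, so treat them as assumptions:

# Sketch of a NanoOWL startup script; exact flags and engine path are illustrative.
import argparse
from nanoowl.owl_predictor import OwlPredictor

parser = argparse.ArgumentParser()
parser.add_argument("--model", type=str, default="google/owlvit-base-patch32")
parser.add_argument("--image_encoder_engine", type=str, default="data/owl_image_encoder_patch32.engine")
args = parser.parse_args()

# This is the call that raises ModuleNotFoundError: No module named 'torch2trt'.
predictor = OwlPredictor(args.model, image_encoder_engine=args.image_encoder_engine)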

jaybdub commented 7 months ago

Hi @costasvav ,

Thanks for reaching out.

Does running the following fail at the last line?

import torch2trt
predictor = OwlPredictor(args.model, image_encoder_engine=args.image_encoder_engine)
import torch2trt

Best, John

costasvav commented 7 months ago

I found my issue; it was a beginner mistake on my part. I was running from a folder that contained a torch2trt subfolder, so Python could import a module by that name, but it was not the actual library. I still have trouble installing torch2trt itself, but I will raise that on that repo.
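
In case it helps anyone else who hits this, a quick way to spot this kind of shadowing is to check where Python actually resolves the module from. A minimal sketch (nothing here is nanoowl-specific):

import os
import sys
import torch2trt

# In a script, sys.path[0] is the script's directory; in an interactive session it is
# the current directory, so a local ./torch2trt folder shadows the installed package.
print("cwd:", os.getcwd())
print("sys.path[0]:", sys.path[0])
print("torch2trt resolved from:", getattr(torch2trt, "__file__", None))
# If that path points into the current directory instead of site-packages/dist-packages,
# the import picked up the local folder, not the installed library.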