Closed: elementdl closed this issue 4 years ago
Hi @elementdl, the SSD-Mobilenet-v2 model from Hello AI World is from TensorFlow and was converted using this tool: https://github.com/AastaNV/TRT_object_detection
Hi @dusty-nv,
Thank you for your response. I was wondering: is the jetson.inference library a general-purpose library for the Jetson Nano, or is it a library specific to the Hello AI World project?
I ask because, in your tutorial, it's very easy to choose a model for object detection or recognition, but you have to pick from a specific set of models. Let's say I would like to use the YOLOv3-tiny model. Is there a Python library like jetson.inference that would let me run this model on the Jetson Nano?
Sincerely,
The jetson.inference library uses TensorRT underneath for accelerated inferencing on Jetson platforms, including Nano/TX1/TX2/Xavier. TensorRT can load models trained with Caffe, TensorFlow, or PyTorch, as well as models in ONNX format.
That said, there is also typically some pre-/post-processing code required to support the models. Pre-processing typically includes conversion from RGBA to the planar NCHW format that DNNs expect, in addition to mean pixel subtraction. Post-processing is interpretation of the DNN outputs, for example bounding-box clustering for detection models.
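The pre-processing step described above can be sketched in plain numpy. This is a hypothetical helper for illustration, not the actual jetson.inference implementation, and the mean values shown are placeholders that depend on how the model was trained:

```python
import numpy as np

def preprocess(rgba, mean=(104.0, 117.0, 123.0)):
    """Convert an HWC uint8 RGBA image into the planar NCHW float32
    tensor layout that detection DNNs typically expect."""
    rgb = rgba[..., :3].astype(np.float32)       # drop the alpha channel
    rgb -= np.asarray(mean, dtype=np.float32)    # mean pixel subtraction
    chw = np.transpose(rgb, (2, 0, 1))           # HWC -> CHW (planar)
    return chw[np.newaxis, ...]                  # add batch dim -> NCHW
```

Note that some networks also expect BGR channel ordering or pixel values scaled to [0, 1]; those details come from the training pipeline of the specific model.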
The pre-/post-processing support for YOLO isn't included in jetson.inference; however, there is a TensorRT sample for YOLO included in JetPack 4.3. You can find it on your Jetson at /usr/src/tensorrt/samples/python/yolov3_onnx
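As an illustration of the post-processing step mentioned earlier, here is a minimal non-maximum-suppression sketch in numpy. This is a generic NMS, not the exact clustering code used by jetson.inference or the TensorRT YOLO sample:

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Keep the highest-scoring boxes, dropping any box that overlaps
    an already-kept box by more than iou_threshold.
    boxes: (N, 4) array of [x1, y1, x2, y2]; scores: (N,) array."""
    order = np.argsort(scores)[::-1]   # indices sorted by score, descending
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # intersection of box i with each remaining box
        x1 = np.maximum(boxes[i, 0], boxes[order[1:], 0])
        y1 = np.maximum(boxes[i, 1], boxes[order[1:], 1])
        x2 = np.minimum(boxes[i, 2], boxes[order[1:], 2])
        y2 = np.minimum(boxes[i, 3], boxes[order[1:], 3])
        inter = np.maximum(0.0, x2 - x1) * np.maximum(0.0, y2 - y1)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        areas = (boxes[order[1:], 2] - boxes[order[1:], 0]) * \
                (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_i + areas - inter)
        # keep only the boxes that don't overlap box i too much
        order = order[1:][iou <= iou_threshold]
    return keep
```

Raw detector outputs (many overlapping candidate boxes per object) go in; a short list of indices of the surviving boxes comes out.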
Hi @dusty-nv ,
I'm new to AI and computer vision. I trained myself on YOLOv3 with the darknet framework. I just finished your Hello AI World tutorial for the Jetson Nano, and I was wondering: which framework am I using when I run SSD-MobileNet-V2?
The reason I ask is that I'm trying to use the Jetson Nano as part of an autonomous car (a prototype, so much smaller) doing real-time object detection, and I'm currently deciding which framework to use on the Jetson and which model to pick (YOLOv3-tiny or SSD-MobileNet-V2).
Thanks for your response.