GeekAlexis / FastMOT

High-performance multiple object tracking based on YOLO, Deep SORT, and KLT 🚀
MIT License

Provide trt model #236

Closed cross-hello closed 2 years ago

cross-hello commented 2 years ago

Your issue may already be reported! Please search the issues before creating one.

Current Behavior

I am converting an ONNX model to TRT using the code below (cloned from here):

import os
import sys

import tensorrt as trt

def convert(model_path, engine_file_path):
    TRT_LOGGER = trt.Logger()
    # ONNX models require an explicit-batch network definition
    EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)

    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network(EXPLICIT_BATCH) as network, \
         trt.OnnxParser(network, TRT_LOGGER) as parser:
        builder.max_workspace_size = 1 << 28  # 256 MiB (TensorRT 7 API)
        builder.max_batch_size = 1
        if not os.path.exists(model_path):
            print('ONNX file {} not found.'.format(model_path))
            sys.exit(1)
        print('Loading ONNX file from path {}...'.format(model_path))
        with open(model_path, 'rb') as model:
            print('Beginning ONNX file parsing')
            if not parser.parse(model.read()):
                print('ERROR: Failed to parse the ONNX file.')
                for error in range(parser.num_errors):
                    print(parser.get_error(error))
                sys.exit(1)  # abort instead of building from a broken network

        network.get_input(0).shape = [1, 1, 28, 28]
        print('Completed parsing of ONNX file')
        engine = builder.build_cuda_engine(network)  # returns None on failure
        if engine is None:
            print('ERROR: Failed to build the engine.')
            sys.exit(1)
        with open(engine_file_path, 'wb') as f:
            f.write(engine.serialize())

if __name__ == '__main__':
    if len(sys.argv) < 3:
        print(f'Usage: {sys.argv[0]} input_onnx_filename output_trt_filename')
        sys.exit(1)
    convert(sys.argv[1], sys.argv[2])

The attached screenshot shows that parsing the ONNX model fails. (screenshot attached)

How to Reproduce

Describe what you want to do

  1. What input videos you will provide, if any:
  2. What outputs you are expecting:
  3. Ask your questions here, if any:
    • If you can, please provide the TRT model directly. (The models downloaded with FastMOT/scripts/download_models.sh do not include a native TRT model.)
    • Otherwise, please show me a way to convert. Thanks in advance!
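For reference, one possible cause of such conversion failures is a TensorRT version mismatch: `builder.max_workspace_size` and `build_cuda_engine` used in the script above were removed in TensorRT 8. A minimal sketch of the same conversion under the TensorRT 8+ builder-config API (untested here; the function name and shape are illustrative, not from FastMOT):

```python
# Hedged sketch: assumes TensorRT 8+ is installed; not runnable without a GPU.
import tensorrt as trt

def convert_trt8(model_path, engine_file_path):
    logger = trt.Logger()
    builder = trt.Builder(logger)
    # ONNX models still require an explicit-batch network
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
    parser = trt.OnnxParser(network, logger)
    with open(model_path, 'rb') as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))
    # Workspace size now lives on a builder config, not the builder itself
    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 28  # 256 MiB
    # build_serialized_network replaces build_cuda_engine; it returns
    # serialized engine bytes directly (or None on failure)
    serialized = builder.build_serialized_network(network, config)
    if serialized is None:
        raise RuntimeError('Engine build failed')
    with open(engine_file_path, 'wb') as f:
        f.write(serialized)
```

The serialized engine written this way can then be deserialized at runtime with `trt.Runtime(logger).deserialize_cuda_engine(...)` on the same machine, since engines are not portable across platforms.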

Your Environment

Common issues

  1. GStreamer warnings are normal
  2. If you have issues with GStreamer on Desktop, disable GStreamer and build FFMPEG instead in Dockerfile
  3. TensorRT plugin and engine files have to be built on the target platform and cannot be copied from a different architecture
  4. Reinstalled OpenCV is usually not as optimized as the one shipped in Jetpack
cross-hello commented 2 years ago

Alternatively, if I use the repository's built-in ONNX-to-TRT conversion code, this error is thrown. (screenshot attached)

cross-hello commented 2 years ago

Strangely enough, the problem was solved after re-downloading the corresponding modules.

tgbaoo commented 6 months ago

Strangely enough, the problem was solved after re-downloading the corresponding modules.

Hey, could you give more detail about re-downloading the corresponding modules? Which modules were they? Thanks.

cross-hello commented 6 months ago

Sorry, I'm unable to help.

That file was from my work at a previous company.

And quite some time has passed since then.