Issues · maggiez0138/Swin-Transformer-TensorRT
This project explores the deployment of Swin Transformer with TensorRT, including FP16 and INT8 test results (a build sketch follows below).
MIT License · 161 stars · 29 forks
Issues, newest first:
#16 · How to deploy Swin-T on the Xavier? · GeneralJing · closed 1 year ago · 1 comment
#15 · RuntimeError: shape '[1, 5, 7, 5, 7, 1]' is invalid for input of size 1369 · iPengXPro · opened 1 year ago · 0 comments
#14 · Does exporting the ONNX file require the --quantize param? · zhu2bowen · opened 1 year ago · 0 comments
#13 · SwinIR FP16 mode outputs black images · zhu2bowen · opened 1 year ago · 0 comments
#12 · Cannot deal with input images where w != h? · zhu2bowen · closed 1 year ago · 0 comments
#11 · Swin does not support dynamic input shape after tracing the module · fatemebafghi · opened 1 year ago · 1 comment
#10 · Hello, I have not reproduced the batch gains you mentioned! · tensorflowt · closed 1 year ago · 1 comment
#9 · Support Swin Transformer object detection · manhtd98 · opened 2 years ago · 1 comment
#8 · ModuleNotFoundError: No module named 'trt.engine' · tensorflowt · closed 1 year ago · 1 comment
#7 · No improvement using FP16 mode · zjujh1995 · opened 2 years ago · 2 comments
#6 · The link (torch/onnx/symbolic_opset9.py) is broken · linuxmi · closed 1 year ago · 2 comments
#5 · Exporting the operator roll to ONNX · linuxmi · closed 2 years ago · 0 comments
#4 · Where are the diagrams below obtained? · tensorflowt · closed 1 year ago · 1 comment
#3 · Average FPS of each model · Linaom1214 · closed 1 year ago · 2 comments
#2 · Does it support dynamic batch inference? · wangjingg · closed 1 year ago · 5 comments
#1 · Question about CUDA version · PigBroA · closed 2 years ago · 2 comments