thesword53 opened this issue 1 year ago
I don't see the torch-tensorrt code in the link you shared.
@bowang007 Keep an eye on this, might be related to some of your PRs
I'm also having this issue
I also noticed that a simple sum of two fp16 tensors is implicitly cast to an fp32 tensor.
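A quick eager-mode check (a minimal sketch; the shapes are arbitrary) shows that plain PyTorch keeps the sum in fp16, so an fp32 result points at the TensorRT conversion step rather than at PyTorch itself:

```python
import torch

# Eager-mode baseline: summing two fp16 tensors stays fp16.
# If the TensorRT-compiled graph returns fp32 for the same op,
# the implicit upcast described above happened during conversion.
a = torch.ones(4, dtype=torch.float16)
b = torch.ones(4, dtype=torch.float16)
out = a + b
print(out.dtype)  # torch.float16
```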
I'm also having this issue. How can it be solved?
I am encountering the same issue.
This PR can help resolve the above issue. Thanks!
@bowang007 Is there any update on your commit? It seems to fail a few checks. Eagerly looking forward to your update.
Also having this issue!
This PR can help resolve the above issue. Thanks!
There is a new error with this PR. Is there any update?
Hi @johnzlli , can you try using dynamo path instead? We are now supporting Dynamo since Torchscript path is being deprecated. Thanks!
Thanks for your reply! Dynamo is great work, but there is no way to export the compiled model, so we still have to use TorchScript.
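For context, the TorchScript workflow being relied on here does support standalone serialization. A minimal sketch (`Blend` is a hypothetical stand-in for the real model, using element-wise ops only so it runs on CPU):

```python
import io
import torch

class Blend(torch.nn.Module):
    # Hypothetical stand-in for the real model (element-wise ops only).
    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        return a * 0.5 + b * 0.5

model = Blend().eval()
example = (torch.randn(4), torch.randn(4))
traced = torch.jit.trace(model, example)

# A TorchScript module can be saved and reloaded on its own,
# which is the export path the commenter says Dynamo lacks.
buf = io.BytesIO()
torch.jit.save(traced, buf)
buf.seek(0)
reloaded = torch.jit.load(buf)
assert torch.equal(reloaded(*example), traced(*example))
```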
Bug Description
TensorRT throws an error about fp32 tensor inputs even though I am using fp16 tensors as input.
I attached the file IFRNet.py, adapted from https://github.com/ltkong218/IFRNet/blob/main/models/IFRNet.py

To Reproduce
Steps to reproduce the behavior:
Expected behavior
Environment
How you installed PyTorch (conda, pip, libtorch, source): Arch Linux AUR

Additional context
IFRNet.py.gz