Closed: zx1292982431 closed this issue 7 months ago
Add:
Thank you for your interest in our work. Actually, `inference=True` is only used for measuring the FLOPs in our experiments, and `inference=True` or `inference=False` will theoretically produce the same result. So it doesn't matter if you set `inference=False` in training and evaluation.
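For reference, here is a minimal sketch of how one might check that the two settings agree numerically. It is not the repository's actual test code: `ToyModel` and `check_equivalence` are hypothetical stand-ins, and the real OnlineSpatialNet constructor and forward signature may differ.

```python
# Minimal sketch: verify that forward(..., inference=True) and
# forward(..., inference=False) produce the same output.
# ToyModel is a hypothetical stand-in for OnlineSpatialNet; the real
# model's constructor and forward signature may differ.
import torch


class ToyModel(torch.nn.Module):
    def __init__(self, dim: int = 16):
        super().__init__()
        self.proj = torch.nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, inference: bool = False) -> torch.Tensor:
        # In this toy example the flag changes nothing, mirroring the claim
        # that it only matters for FLOPs measurement.
        return self.proj(x)


def check_equivalence(model: torch.nn.Module, x: torch.Tensor) -> bool:
    model.eval()
    with torch.no_grad():
        y_train_path = model(x, inference=False)
        y_infer_path = model(x, inference=True)
    return torch.allclose(y_train_path, y_infer_path, atol=1e-5)


if __name__ == "__main__":
    model = ToyModel()
    x = torch.randn(2, 10, 16)  # (batch, time, feature); shape is illustrative
    print("outputs match:", check_equivalence(model, x))
```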
Thank you!
Awesome job! I encountered some problems when trying to reproduce the OnlineSpatialNet Mamba version, and I hope to get your help. When I set `inference=False`, the model can run forward normally, but when I set `inference=True`, it doesn't work. Here is the traceback:

Additionally, I ran this on a single V100 (32 GB) GPU, and here is my environment configuration:
My WeChat ID is zx1292982431, in case it makes our communication more convenient.