Closed: eeyrw closed this issue 4 hours ago
We only conduct testing on Linux. Please consult the LMDeploy team (https://github.com/InternLM/lmdeploy) for this issue.
Sure. I just did not notice this repo is for InternVL rather than LMDeploy.
For those who care about this issue: https://github.com/InternLM/lmdeploy/issues/2684
Motivation
I tried to deploy InternVL-1B with LMDeploy, which reported that this model is not supported by TurboMind, so I added `--backend pytorch`, but then I got this kind of error:
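For context, a deployment attempt along these lines would look like the sketch below. The exact model path (`OpenGVLab/InternVL2-1B`) is an assumption, since the original report only says "InternVL-1B":

```shell
# Serve the model with LMDeploy's PyTorch backend instead of TurboMind.
# Model path is an assumption; substitute the actual checkpoint you use.
lmdeploy serve api_server OpenGVLab/InternVL2-1B --backend pytorch
```

On Windows this is where the failure surfaces, because the PyTorch backend pulls in Triton, which has no Windows build.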
Triton does not support Windows. Does this mean the PyTorch backend of LMDeploy is not supported on Windows at all? Is there any workaround for this?
Related resources
No response
Additional context
No response