System Info
Running with the Docker method.
Information
Tasks
Reproduction
I want to start the LoRAX server with Docker. The shell script is:
I have a llama3-awq-int4 model in my local /data directory. When I start the script, it connects to Hugging Face to download files. Because I have a network issue, it raises requests.exceptions.ConnectionError, but I wonder why it can't load the model from the local path. How can I restart it using only the local files?
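For reference, a minimal sketch of how this might be launched fully offline. The image name, tag, and launcher flags below are assumptions based on typical Hugging Face-style serving containers, not taken from this issue; HF_HUB_OFFLINE is the huggingface_hub environment variable that disables network calls, and the volume mount makes the local /data model visible inside the container:

```shell
# Hypothetical sketch (image name, tag, and flags are assumptions):
# run the container against a local model directory only, no network fetch.
MODEL_DIR=/data/llama3-awq-int4   # assumed local model path

docker run --gpus all --shm-size 1g -p 8080:80 \
  -v /data:/data \
  -e HF_HUB_OFFLINE=1 \
  ghcr.io/predibase/lorax:main \
  --model-id "$MODEL_DIR"
```

Passing a local directory as the model ID generally skips the Hub download, provided the directory contains the full set of model files (config, tokenizer, and weights).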
Expected behavior
Run the local model with Docker, without needing network access to Hugging Face.