LucienShui / huggingface-vscode-endpoint-server

starcoder server for huggingface-vscode custom endpoint
Apache License 2.0

could I load the GPTQ-for-SantaCoder/starcoder-GPTQ-8bit-128g model? #5

Closed. heber closed this issue 1 year ago.

heber commented 1 year ago

Could I load the GPTQ-for-SantaCoder/starcoder-GPTQ-8bit-128g model offline? When I run the command `python main.py --pretrained {the path to starcoder-GPTQ-8bit-128g}`, there is an error:

```
File ".../transformers/pipelines/__init__.py", line 779, in pipeline
    framework, model = infer_framework_load_model(
File ".../transformers/pipelines/base.py", line 271, in infer_framework_load_model
    raise ValueError(f"Could not load model {model} with any of the following classes: {class_tuple}.")
ValueError: Could not load model starcoder-GPTQ-8bit-128g with any of the following classes: (<class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>, <class 'transformers.models.gpt2.modeling_gpt2.GPT2LMHeadModel'>).
```

LucienShui commented 1 year ago

I couldn't find any configuration file under mayank31398/starcoder-GPTQ-8bit-128g, so I don't think this model can be loaded directly through transformers.
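To illustrate the diagnosis above: `transformers.pipeline()` infers the model class from a `config.json` in the checkpoint directory, and a GPTQ checkpoint exported without one fails with the `ValueError` shown in the traceback. The following is a minimal sketch of a local pre-check one could run before pointing the server at a directory; the helper name `transformers_loadable` is hypothetical and not part of this repository.

```python
import json
import os


def transformers_loadable(model_dir: str) -> bool:
    """Rough pre-check (illustrative, not part of this server):
    transformers' pipeline() needs a config.json in the model
    directory to pick a model class, so a checkpoint without one
    cannot be loaded directly."""
    config_path = os.path.join(model_dir, "config.json")
    if not os.path.isfile(config_path):
        return False
    with open(config_path) as f:
        cfg = json.load(f)
    # A usable config declares at least a model_type.
    return "model_type" in cfg
```

If this check returns `False` for a GPTQ checkpoint, it would need a loader that understands the quantized format rather than plain `transformers.pipeline()`.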