defog-ai / sqlcoder

SoTA LLM for converting natural language questions to SQL queries
Apache License 2.0
3.44k stars · 219 forks

How to inference in the model 'SQLCoder-7B' #27

Closed robinji0 closed 1 year ago

robinji0 commented 1 year ago

I used the inference code to load 'SQLCoder-7B', but it doesn't work. What is the inference code for 'SQLCoder-7B'? Could you add an example to the README?

rishsriv commented 1 year ago

Can you share the code snippet you used to run the model? To load the 7B-parameter model, you should change the model name in the code to defog/sqlcoder-7b. It seems to be working for me.
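The suggestion above can be sketched as a minimal loading helper. The generation-related kwargs (`torch_dtype`, `device_map`, `use_cache`) mirror common Transformers usage and are assumptions about a typical GPU setup, not an official recipe:

```python
def load_sqlcoder(model_name="defog/sqlcoder-7b"):
    """Sketch: load the 7B model from the Hugging Face Hub.

    Note there is no from_tf=True here -- the published checkpoint
    is a PyTorch checkpoint. Imports are deferred so the function can
    be defined even before torch/transformers are installed.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        trust_remote_code=True,
        torch_dtype=torch.float16,  # half precision to fit the 7B weights
        device_map="auto",          # spread layers across available devices
        use_cache=True,
    )
    return tokenizer, model
```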

robinji0 commented 1 year ago

> Can you share the code snippet you used to run the model? To load the 7B-parameter model, you should change the model name in the code to defog/sqlcoder-7b. It seems to be working for me.

I used the code below to load the model; the path is a local path on my machine. I added the parameter from_tf=True because the error said 'If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.'

```python
tokenizer = AutoTokenizer.from_pretrained("/workspace/sqlcoder-7b",
                                          trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("/workspace/sqlcoder-7b",
                                             trust_remote_code=True,
                                             torch_dtype=torch.float16,
                                             load_in_8bit=True,
                                             from_tf=True,
                                             device_map="auto",
                                             use_cache=True)
```

After adding that parameter, another error happens:

```
Traceback (most recent call last):
  model = AutoModelForCausalLM.from_pretrained("/workspace/sqlcoder-7b",
  File "/home/pai/envs/sqlcoder/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 565, in from_pretrained
    return model_class.from_pretrained(
  File "/home/pai/envs/sqlcoder/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3265, in from_pretrained
    if resolved_archive_file.endswith(".index"):
AttributeError: 'list' object has no attribute 'endswith'
```

rishsriv commented 1 year ago

Ah, it seems you've installed the TensorFlow build of Hugging Face Transformers. Mistral-7B (the underlying model for sqlcoder-7b) does not yet work with the TensorFlow build.

You'll have to reinstall Transformers with a PyTorch build to run it.
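Before reinstalling, it can help to confirm which backend is actually importable in the environment. A minimal stdlib-only check (the function name is illustrative):

```python
import importlib.util

def backend_status():
    """Report which deep-learning backends are importable here.

    If "torch" is False and "tensorflow" is True, Transformers will
    fall back to its TensorFlow code paths, which is consistent with
    the from_tf error seen above.
    """
    return {
        "torch": importlib.util.find_spec("torch") is not None,
        "tensorflow": importlib.util.find_spec("tensorflow") is not None,
    }
```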

zhouenxian commented 1 year ago

I use the inference code to load 'SQLCoder-7B', but an error occurs. Here is the traceback:

```
Traceback (most recent call last):
  File "/datas/work/zex/llm/sqlcoder-7b/spider_sqlcoder_7b.py", line 103, in <module>
    model = AutoModelForCausalLM.from_pretrained(
  File "/home/linewell/anaconda3/envs/SQLCoder/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 482, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/home/linewell/anaconda3/envs/SQLCoder/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 1022, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/linewell/anaconda3/envs/SQLCoder/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 723, in __getitem__
    raise KeyError(key)
KeyError: 'mistral'
```
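A `KeyError: 'mistral'` from `CONFIG_MAPPING` usually means the installed transformers version predates Mistral support, which was added in v4.34.0. A stdlib-only sketch of that version check (the function name is illustrative):

```python
import importlib.metadata

def transformers_supports_mistral():
    """Return True if the installed transformers is new enough for
    the 'mistral' model type (added in transformers v4.34.0)."""
    try:
        version = importlib.metadata.version("transformers")
    except importlib.metadata.PackageNotFoundError:
        return False  # transformers is not installed at all
    # Compare only major.minor; tolerates suffixes like "4.34.0.dev0".
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (4, 34)
```

If this returns False, upgrading transformers should resolve the `KeyError`.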