Open testercell opened 3 months ago
What is your OS? I set the following:
MODEL_ID = "TheBloke/Mistral-7B-Instruct-v0.1-GPTQ"
MODEL_BASENAME = "model.safetensors"
and got this:
logging.INFO("GPTQ models will NOT work on Mac devices. Please choose a different model.")
TypeError: 'int' object is not callable
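That `TypeError: 'int' object is not callable` comes from the `logging.INFO(...)` call shown above: `logging.INFO` is the integer level constant (20), not a logging function, so calling it fails. A minimal sketch of the difference (the message string is taken from the traceback; everything else is standard-library behavior):

```python
import logging

# logging.INFO is an integer level constant, not a callable,
# so logging.INFO("...") raises TypeError: 'int' object is not callable.
print(logging.INFO)  # prints 20

# The intended call is the lower-case module-level function:
logging.basicConfig(level=logging.INFO)
logging.info("GPTQ models will NOT work on Mac devices. Please choose a different model.")
```

So the crash is a bug in the logging line itself (upper-case `INFO` where lower-case `info` was meant), separate from the GPTQ-on-Mac warning it was trying to print.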
If you are using CUDA, use a GPTQ model; if you are on a Mac, use a GGUF model: https://youtu.be/ASpageg8nPw?t=74
I'm trying to use the following as the model ID and basename:
MODEL_ID = "TheBloke/Mistral-7B-Instruct-v0.1-GPTQ"
MODEL_BASENAME = "wizardLM-7B-GPTQ-4bit.compat.no-act-order.safetensors"
But when running run_localgpt.py I get the following error:
\miniconda3\Lib\site-packages\auto_gptq\modeling_utils.py", line 147, in check_and_get_model_type
raise TypeError(f"{config.model_type} isn't supported yet.")
TypeError: mistral isn't supported yet.
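This second error means the installed auto-gptq release does not list `mistral` among its supported model types, so its `check_and_get_model_type` lookup fails; newer auto-gptq releases add Mistral support, so upgrading the package is the usual fix. A hedged sketch of how that check behaves (the whitelist contents here are illustrative, not the library's actual table):

```python
# Illustrative reproduction of auto_gptq's model-type gate.
# The real library keeps a mapping of supported architectures;
# this subset is an assumption for demonstration only.
SUPPORTED_MODEL_TYPES = {"llama", "gptj", "opt", "bloom"}

def check_and_get_model_type(model_type: str) -> str:
    # config.model_type comes from the model's config.json
    # (for Mistral-7B-Instruct it is "mistral")
    if model_type not in SUPPORTED_MODEL_TYPES:
        raise TypeError(f"{model_type} isn't supported yet.")
    return model_type
```

With an older whitelist like the one above, `check_and_get_model_type("mistral")` raises exactly the `TypeError: mistral isn't supported yet.` seen in the traceback, while `"llama"` passes through.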
Any help is super appreciated!!