KoboldAI / KoboldAI-Client

https://koboldai.com
GNU Affero General Public License v3.0

model type issue #439

Open HaarishIrfan2171 opened 3 months ago

HaarishIrfan2171 commented 3 months ago

I am using the Qwen1.5-0.5B-Chat-AWQ model, and I set the model path to /home/administrator/KoboldAI-Client/models/qwen/Qwen1.5-0.5B-Chat-AWQ on a Linux server.

It throws an error like this:

```
Traceback (most recent call last):
  File "aiserver.py", line 10290, in <module>
    load_model(initial_load=True)
  File "aiserver.py", line 2251, in load_model
    model_config = AutoConfig.from_pretrained(vars.custmodpth, revision=args.revision, cache_dir="cache")
  File "/home/administrator/KoboldAI-Client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 796, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/home/administrator/KoboldAI-Client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/models/auto/configuration_auto.py", line 503, in __getitem__
    raise KeyError(key)
KeyError: 'qwen2'
```

How can I resolve it?
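For context on what this traceback means: `AutoConfig.from_pretrained` reads the `model_type` field from the checkpoint's `config.json` and looks it up in a registry of known architectures, so a `KeyError: 'qwen2'` typically indicates that the installed `transformers` release predates Qwen2 support and simply has no entry for that key. Below is a minimal self-contained sketch of that lookup, using a hypothetical miniature of the registry (the real `CONFIG_MAPPING` in `transformers` is much larger and maps to config classes, not strings):

```python
import json

# Hypothetical miniature of transformers' architecture registry: an older
# release only knows the model types that existed when it shipped.
CONFIG_MAPPING = {"gpt2": "GPT2Config", "llama": "LlamaConfig"}

def load_model_type(config_json: str) -> str:
    """Mimics AutoConfig.from_pretrained reading model_type from config.json."""
    model_type = json.loads(config_json)["model_type"]
    if model_type not in CONFIG_MAPPING:
        # Same failure mode as the traceback above: the library version
        # predates the architecture, so the key is absent from the mapping.
        raise KeyError(model_type)
    return CONFIG_MAPPING[model_type]

# Qwen1.5 checkpoints declare model_type "qwen2", which an older
# registry does not recognize.
try:
    load_model_type('{"model_type": "qwen2"}')
except KeyError as e:
    print("unsupported model_type:", e)
```

Under that assumption, the usual remedy is to upgrade `transformers` in KoboldAI's environment to a version that registers `qwen2` (support was added around `transformers` 4.37), though whether KoboldAI-Client's bundled runtime can accept that upgrade is a separate question.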