Closed. otakurg closed this issue 10 months ago.
Try editing start-webui.bat so that the launch line reads: call python server.py --auto-devices --chat --wbits 4 --groupsize 128
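For reference, here is a minimal sketch of what the end of start-webui.bat could look like after that edit. Only the final call line comes from the suggestion above; the @echo off and environment-activation lines are assumptions about the one-click installer's layout and may differ on your machine, so keep whatever activation lines your file already has.

@echo off
:: Activate the bundled conda environment first (this path is an assumption
:: based on the one-click installer layout; do not replace your existing
:: activation lines if they differ).
call "%~dp0installer_files\conda\condabin\conda.bat" activate "%~dp0installer_files\env"

:: Launch the web UI with 4-bit GPTQ loading enabled for a 128-group model.
call python server.py --auto-devices --chat --wbits 4 --groupsize 128

The point of --wbits 4 --groupsize 128 is to route loading through the web UI's GPTQ code path instead of the plain transformers loader; without those flags, transformers looks for a standard pytorch_model.bin and fails with the OSError quoted in the issue below.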
Try following this Reddit thread: https://www.reddit.com/r/Oobabooga/comments/12b448p/loading_gpt4_x_alpaca_13b_native_4bit_128g_could/
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Hi,
I was trying out the oobabooga web UI with the gpt4-x-alpaca-13b-native-4bit-128g model, following the steps from this video: https://www.youtube.com/watch?v=nVC9D9fRyNU&t=191s
I ran into the following error:
Traceback (most recent call last):
  File "F:\Ai trepreneur\AI\IMAGE AI\oobabooga-windows\oobabooga-windows\text-generation-webui\server.py", line 302, in <module>
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "F:\Ai trepreneur\AI\IMAGE AI\oobabooga-windows\oobabooga-windows\text-generation-webui\modules\models.py", line 170, in load_model
    model = AutoModelForCausalLM.from_pretrained(checkpoint, **params)
  File "F:\Ai trepreneur\AI\IMAGE AI\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 471, in from_pretrained
    return model_class.from_pretrained(
  File "F:\Ai trepreneur\AI\IMAGE AI\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2349, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory models\gpt4-x-alpaca-13b-native-4bit-128g.