nomic-ai / pygpt4all

Officially supported Python bindings for llama.cpp + gpt4all
https://nomic-ai.github.io/pygpt4all/
MIT License

GPT4ALL + MPT ---> Bad Magic error ? #110

Open gykung opened 1 year ago

gykung commented 1 year ago

I am trying to run the new MPT models by MosaicML with pygpt4all. When loading the model as shown below, I get a "bad magic" error. How do I overcome it? I've checked https://github.com/ggerganov/llama.cpp/issues and there are no similar issues reported for the MPT models.

Code:


from pygpt4all.models.gpt4all_j import GPT4All_J

model = GPT4All_J('./models/ggml-mpt-7b-chat.bin')

Error:


runfile('C:/Data/gpt4all/gpt4all_cpu2.py', wdir='C:/Data/gpt4all')

gptj_model_load: invalid model file './models/ggml-mpt-7b-chat.bin' (bad magic)

Windows fatal exception: int divide by zero
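For anyone debugging a "bad magic" error like this: ggml model files begin with a 4-byte magic number that identifies the container format, and a loader built for one architecture (here the GPT-J loader behind `GPT4All_J`) rejects files whose magic it does not recognize. A minimal sketch to inspect a file's magic bytes, assuming the classic ggml magic constant `0x67676d6c` ("ggml") used by the llama.cpp family of loaders:

```python
import os
import struct
import tempfile

# Assumption: the classic ggml container magic, the ASCII bytes "ggml"
# read as a little-endian uint32 (value taken from llama.cpp sources).
GGML_MAGIC = 0x67676D6C

def read_magic(path):
    """Return the first 4 bytes of a model file as a little-endian uint32."""
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))
    return magic

if __name__ == "__main__":
    # Demo on a tiny synthetic file carrying the ggml magic;
    # point read_magic() at a real .bin to see what your file starts with.
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(struct.pack("<I", GGML_MAGIC))
    print(hex(read_magic(path)))  # prints 0x67676d6c
    os.remove(path)
```

If the magic printed for `ggml-mpt-7b-chat.bin` differs from what the loader expects, the file is not corrupt; it is simply a different architecture or container version than the loader supports.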