I am trying to run the new MPT models from MosaicML with pygpt4all. When loading the model as shown below, I get a "bad magic" error. How do I overcome it? I've checked https://github.com/ggerganov/llama.cpp/issues, and no similar issues have been reported for the MPT models.
Code:
from pygpt4all.models.gpt4all_j import GPT4All_J
model = GPT4All_J('./models/ggml-mpt-7b-chat.bin')  # raises "bad magic" on load
Error:
runfile('C:/Data/gpt4all/gpt4all_cpu2.py', wdir='C:/Data/gpt4all')
gptj_model_load: invalid model file './models/ggml-mpt-7b-chat.bin' (bad magic)
Windows fatal exception: int divide by zero
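For reference, would loading the file through an MPT-aware backend such as ctransformers work instead? The snippet below is only a sketch, not something I've verified: I'm assuming ctransformers can read this exact GGML file, and the model_type='mpt' argument is my guess at the right setting.

from ctransformers import AutoModelForCausalLM

# Hypothetical alternative: ctransformers has its own GGML backends,
# so it would not go through the gptj_model_load magic-number check
# that fails above. model_type='mpt' is an assumption on my part.
llm = AutoModelForCausalLM.from_pretrained('./models/ggml-mpt-7b-chat.bin',
                                           model_type='mpt')
print(llm('Once upon a time'))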