simonw / llm-gpt4all

Plugin for LLM adding support for the GPT4All collection of models
Apache License 2.0

Error: Model filename not in model list #8

Open dv0zn3r opened 1 year ago

dv0zn3r commented 1 year ago

Unexpected behaviour: for the models ggml-mpt-7b-base, ggml-mpt-7b-instruct, and ggml-gpt4all-j-v1, the model is not downloaded after the first prompt; instead the error Error: Model filename not in model list: <model name> is raised.
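For reference, the error seems to come from the Python gpt4all bindings that llm-gpt4all wraps: when the requested name is not found in the package's model list, the download is skipped and an exception is raised instead. A minimal sketch of the failure, using one of the model names above and the gpt4all package directly (the exact exception type and internals are assumptions):

# Hedged reproduction of the reported failure via the gpt4all Python
# bindings, which llm-gpt4all delegates model loading to.
from gpt4all import GPT4All

try:
    # "ggml-mpt-7b-base" is one of the model names from the report above.
    model = GPT4All("ggml-mpt-7b-base")
except Exception as err:
    # Expected to print something like:
    #   Model filename not in model list: ggml-mpt-7b-base.bin
    print(err)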

ryanfb commented 1 year ago

With Python gpt4all 1.0.8, I currently get the following models working out of the box with llm + llm-gpt4all:

The following models give me the "Model filename not in model list" error out of the box:

However, I've found a workaround that works for at least some (possibly all) of them. If you drop into a Python shell and run, e.g.:

from gpt4all import GPT4All
model = GPT4All("ggml-vicuna-7b-1.1-q4_2.bin")

Using the filename from the exception, subsequent runs of llm seem to be able to use the cached model.
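As a quick check that the workaround took effect, something like the following via llm's Python API should work (the model id shown, i.e. the filename without the .bin suffix, is an assumption about how llm-gpt4all names its models):

# Hedged sketch: after caching the model with the snippet above, ask llm
# to use it. The id "ggml-vicuna-7b-1.1-q4_2" (filename minus ".bin") is
# an assumption about llm-gpt4all's model naming.
import llm

model = llm.get_model("ggml-vicuna-7b-1.1-q4_2")
response = model.prompt("Say hello in one short sentence.")
print(response.text())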

The Python source for the gpt4all package is available here: https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/python
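The "model list" in the error message presumably refers to the registry of downloadable models that the gpt4all package consults. A hedged way to inspect it, assuming GPT4All.list_models() is available in this gpt4all version and that each entry carries a "filename" field:

# Hedged sketch: print the filenames the gpt4all package considers valid,
# so a model name can be checked against the list before downloading.
# Assumes GPT4All.list_models() exists and returns dicts with "filename".
from gpt4all import GPT4All

for entry in GPT4All.list_models():
    print(entry.get("filename"))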