Issue Summary: Llama 3.1 via gpt4all returns blank responses. Llama 3.0 works fine.
Possible Fix: According to this gpt4all issue, the Llama 3.1 model requires gpt4all version 3.1.1, whereas llm-gpt4all only requires `gpt4all>=2.5.1`. This is a problem for anyone installing gpt4all via pip, as the latest PyPI release of gpt4all is 2.7.0.
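To see which gpt4all build is actually in play, a quick check is (a minimal sketch, assuming gpt4all was installed via pip into the same environment llm runs in):

```bash
# List installed llm plugins and their versions
llm plugins

# Show the gpt4all package pip resolved for llm-gpt4all's gpt4all>=2.5.1 requirement
pip show gpt4all
```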
More Details:
When running llm (v0.15) and llm-gpt4all (v0.4), I can download the gpt4all Llama 3 and Llama 3.1 models.
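For reference, the setup steps were the usual ones (a sketch; `llm install` is the standard way to add plugins, and the gpt4all models download the first time they are used):

```bash
# Install the gpt4all plugin into the environment llm uses
llm install llm-gpt4all
```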
The version of gpt4all installed on my machine (by way of llm-gpt4all) was 2.6.0, but I can manually upgrade to 2.7.0.
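The manual upgrade is just a pip call, though it still tops out well below the 3.1.1 the gpt4all issue asks for (a sketch, same environment assumption as above):

```bash
# Upgrade to the newest gpt4all on PyPI (2.7.0 at the time of writing)
pip install --upgrade gpt4all
```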
Output from `llm models`:

This works:
This returns a blank response:
Log shows:
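For anyone trying to reproduce this, the two prompts would look roughly like the following; the model IDs are assumptions based on the gpt4all model names, not copied from the commands above, so check `llm models` for the exact IDs on your install:

```bash
# Hypothetical model IDs -- verify against the `llm models` output
# Llama 3 responds normally:
llm -m Meta-Llama-3-8B-Instruct "Say hello"

# Llama 3.1 returns a blank response:
llm -m Meta-Llama-3.1-8B-Instruct-128k "Say hello"
```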