simonw / llm-gpt4all

Plugin for LLM adding support for the GPT4All collection of models
Apache License 2.0

fix: allow local models to work without internet connection #18

Closed: hydrosquall closed this pull request 10 months ago

hydrosquall commented 1 year ago

Motivation

Currently, the library tries to download the model even if it already exists locally, which prevents offline use.

Fixes https://github.com/simonw/llm-gpt4all/issues/10, applying a code hint and investigation from @rotterb.

Changes

Avoid attempting to download a model when that model's file already exists locally.
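The change amounts to guarding the download call with a local-file check. A minimal sketch of that idea follows; the function name `ensure_model` and the `download` callable are illustrative stand-ins, not the plugin's actual API.

```python
import os


def ensure_model(models_dir: str, filename: str, download) -> str:
    """Return the local path to a model file, downloading only when missing.

    `download` is a placeholder for whatever helper fetches the model file.
    """
    path = os.path.join(models_dir, filename)
    if os.path.exists(path):
        # The model is already on disk: skip the network call entirely,
        # which is what makes offline use possible.
        return path
    download(path)
    return path
```

With a guard like this, turning off the network after the first download no longer breaks prompt execution.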

Testing

I tried running the plugin with Wi-Fi disabled, using a model that I had already downloaded.

```shell
llm uninstall llm-gpt4all
# changed into this project's directory
pip install -e '.[test]'
# turn Wi-Fi off
llm -m mistral-7b-instruct-v0 "hello world"
# received proper output
```
learning4life commented 1 year ago

This fix saved my presentation of llm-gpt4all to my colleagues when our internet connection failed 👍

Code used:

`Dockerfile`:

```dockerfile
FROM python:3.11
WORKDIR /code
COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
# Download mistral-7b-instruct-v0
RUN llm -m mistral-7b-instruct-v0 "Fun fact about AI?" --no-log --no-stream
# Set default model
RUN llm models default mistral-7b-instruct-v0
# Fix no-internet bug using https://github.com/simonw/llm-gpt4all/pull/18
COPY llm_gpt4all.py /usr/local/lib/python3.11/site-packages/
```

`requirements.txt`:

```
datasette
llm
llm-gpt4all
```