hydrosquall closed this pull request 10 months ago.
This fix saved my presentation of llm-gpt4all to my colleagues when our internet connection failed 👍
Code used:

`Dockerfile`:

```dockerfile
FROM python:3.11

WORKDIR /code

COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

# Download mistral-7b-instruct-v0
RUN llm -m mistral-7b-instruct-v0 "Fun fact about AI?" --no-log --no-stream

# Set default model
RUN llm models default mistral-7b-instruct-v0

# Fix no-internet bug using https://github.com/simonw/llm-gpt4all/pull/18
COPY llm_gpt4all.py /usr/local/lib/python3.11/site-packages/
```
`requirements.txt`:

```
datasette
llm
llm-gpt4all
```
### Motivation
Currently, the library tries to download the model even when it already exists locally, which prevents offline use.

Fixes https://github.com/simonw/llm-gpt4all/issues/10, applying a code hint and investigation from @rotterb.
### Changes
Avoid trying to download a model when that model's file already exists locally.
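The change above amounts to a local-file check guarding the download path. A minimal sketch of that pattern, with illustrative names (`ensure_model`, `cache_dir`, `download_model` are hypothetical and not the plugin's actual API):

```python
from pathlib import Path

def ensure_model(cache_dir: str, model_filename: str, download_model) -> Path:
    """Return the path to the model file, downloading only if it is missing.

    Hypothetical sketch of the fix: names here are illustrative, not the
    plugin's real internals. `download_model` is any callable that fetches
    the model to the given path (and needs network access).
    """
    path = Path(cache_dir) / model_filename
    if path.exists():
        # Model already on disk: skip the download, so offline use works.
        return path
    # Only reach the network when the file is genuinely absent.
    download_model(path)
    return path
```

With this guard, rebuilding or restarting offline reuses the cached file instead of failing on the download attempt.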
### Testing
I tested by running the plugin with Wi-Fi disabled, using a model I had already downloaded.