mattcurf / ollama-intel-gpu


Upstream docker image? #10

Open pepijndevos opened 1 week ago

pepijndevos commented 1 week ago

Hi, this Docker image has been working great for me. It's of course totally valid to want to manage your own software, but I just wanted to point out that there is an upstream Docker image that ships with everything preinstalled.

I adjusted your Dockerfile to the following to launch Ollama from their image:

```dockerfile
FROM intelanalytics/ipex-llm-inference-cpp-xpu:latest

# Enable Level Zero Sysman so Ollama can query the Intel GPU
ENV ZES_ENABLE_SYSMAN=1
ENV OLLAMA_HOST=0.0.0.0:11434

# init-ollama (provided by the base image) sets up the ollama binary
RUN mkdir -p /llm/ollama && \
    cd /llm/ollama && \
    init-ollama

WORKDIR /llm/ollama

ENTRYPOINT ["./ollama", "serve"]
```
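For anyone trying this, a build-and-run invocation might look like the following sketch. The image tag and volume name are placeholders of my own choosing, not from this repo; `--device /dev/dri` is the usual way to pass an Intel GPU's render nodes into a container:

```shell
# Build the image from the Dockerfile above (tag is illustrative)
docker build -t ollama-intel-gpu .

# Run it, exposing the GPU and the Ollama API port declared in OLLAMA_HOST
docker run -d \
  --device /dev/dri \
  -p 11434:11434 \
  -v ollama-models:/root/.ollama \
  ollama-intel-gpu
```

The named volume keeps downloaded models across container restarts.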
mattcurf commented 1 week ago

The original idea of this repo was to understand the "base pieces" required to enable Ollama on an Intel GPU from scratch. Having said that, given the increased interest, it would make sense to apply the change above. I'd probably branch the top of tree and keep that as an alternate path for those interested.

mattcurf commented 1 week ago

See https://github.com/mattcurf/ollama-intel-gpu/pull/11