av / harbor

Effortlessly run LLM backends, APIs, frontends, and services with one command.
https://github.com/av/harbor

Harbor How Error - Will not successfully build the container on first use #108

Closed: ColumbusAI closed this issue 2 months ago

ColumbusAI commented 2 months ago

I am using Harbor in a WSL2 container, and when I try to use harbor how for the first time, I get this on several attempts:

$ harbor how
[+] Building 2.2s (7/11)                                                              docker:default
 => [cmdh internal] load build definition from Dockerfile                                       0.0s
 => => transferring dockerfile: 389B                                                            0.0s
 => [cmdh internal] load metadata for docker.io/pkgxdev/pkgx:latest                             0.4s
 => [cmdh internal] load .dockerignore                                                          0.0s
 => => transferring context: 2B                                                                 0.0s
 => [cmdh 1/7] FROM docker.io/pkgxdev/pkgx:latest@sha256:3ec497e47fa662eef62718bfd92908cc969a4928eba8bbf3ba064ed5  0.0s
 => => resolve docker.io/pkgxdev/pkgx:latest@sha256:3ec497e47fa662eef62718bfd92908cc969a4928eba8bbf3ba064ed571e31  0.0s
 => [cmdh internal] load build context                                                          0.0s
 => => transferring context: 31B                                                                0.0s
 => CACHED [cmdh 2/7] WORKDIR /app                                                              0.0s
 => ERROR [cmdh 3/7] RUN pkgx install node@20 npm git                                           1.7s

[cmdh 3/7] RUN pkgx install node@20 npm git:
1.584 install: target 'git': No such file or directory

failed to solve: process "/bin/bash -c pkgx install node@20 npm git" did not complete successfully: exit code: 1

av commented 2 months ago

Thank you for the report, I can reproduce it with:

harbor build cmdh

Using pkgx was one of the earlier choices; there was another instance where images based on it stopped working like this.

I switched the image to use node:lts instead and made some extra updates that should make it more compatible with smaller models and newer Ollama versions.
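
For illustration, here is a minimal sketch of that kind of base-image swap (not the actual Dockerfile from the repo, which may differ in detail; node:lts already bundles node and npm, so only git needs installing):

# Before (pkgx-based; the pkgx install step started failing):
# FROM pkgxdev/pkgx:latest
# WORKDIR /app
# RUN pkgx install node@20 npm git

# After (sketch): start from node:lts and install git via apt
FROM node:lts
WORKDIR /app
RUN apt-get update && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*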

The changes are in main and will go live as part of the next release.

ColumbusAI commented 2 months ago

Here's the new result based on the latest commit. Does harbor how only work with the llama3.1 model?

As a test, I installed llama3.1 through Ollama and I still get the same error.

$ harbor how
(node:17) [DEP0040] DeprecationWarning: The `punycode` module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
✖ Retrieving command...
file:///app/cmdh/node_modules/ollama/dist/utils.js:58
        throw new ResponseError(message, response.status);
              ^

ResponseError: model "llama3.1" not found, try pulling it first
    at file:///app/cmdh/node_modules/ollama/dist/utils.js:58:15
    at Generator.next (<anonymous>)
    at fulfilled (file:///app/cmdh/node_modules/ollama/dist/utils.js:4:58)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5) {
  error: 'model "llama3.1" not found, try pulling it first',
  status_code: 404
}

Node.js v22.13.1

av commented 2 months ago

llama3.1 is the default for the cmdh service behind harbor how; it can be configured with harbor cmdh model. I'm hesitant to include any larger LLMs in the default downloads, as the user's system might not be ready to accommodate them. There's definitely room for improvement in detecting whether the model has been pulled before trying to run the service.
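
To illustrate the workflow (a sketch; it assumes harbor cmdh model with no argument prints the current value and with an argument sets it, and that the model is pulled with the standard Ollama CLI):

$ harbor cmdh model            # show the model cmdh is configured to use
$ harbor cmdh model llama3.1   # point cmdh at a specific model
$ ollama pull llama3.1         # make sure that model is actually available locally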

av commented 2 months ago

A check that the configured harbor cmdh model is available is now in main.
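
For context, a pre-flight check along those lines could look like this hypothetical shell sketch (not the actual implementation in main; Ollama's /api/tags endpoint lists locally pulled models on its default port 11434):

# Hypothetical: fail fast if the configured cmdh model has not been pulled
MODEL="$(harbor cmdh model)"
if ! curl -s http://localhost:11434/api/tags | grep -q "\"name\":\"$MODEL"; then
  echo "Model '$MODEL' not found locally; try: ollama pull $MODEL" >&2
  exit 1
fi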

ColumbusAI commented 2 months ago

It's working! I'm gonna close this out. Thanks @av, you rock man!