nilsherzig / LLocalSearch

LLocalSearch is a completely locally running search aggregator using LLM Agents. The user can ask a question and the system will use a chain of LLMs to find the answer. The user can see the progress of the agents and the final answer. No OpenAI or Google API keys are needed.
Apache License 2.0

Exiting chain with error: model 'all-minilm' not found, try pulling it first #40

Closed Claudioappassionato closed 5 months ago

Claudioappassionato commented 5 months ago

Exiting chain with error: model 'all-minilm' not found, try pulling it first

I give up. Too many hours without a solution. When you make a simple and functional project, I will be happy.

```
backend-1 | 2024/04/04 13:43:49 Error adding document: error adding document: model 'all-minilm' not found, try pulling it first
backend-1 | 2024/04/04 13:43:49 error from evaluator: Error adding document: error adding document: model 'all-minilm' not found, try pulling it first
backend-1 | 2024/04/04 13:43:49 Error adding document: error adding document: model 'all-minilm' not found, try pulling it first
backend-1 | 2024/04/04 13:43:49 error from evaluator: Error adding document: error adding document: model 'all-minilm' not found, try pulling it first
```

```
backend-1 | 2024/04/04 13:43:27 INFO Creating new session session=9b2eea7b-c5eb-4662-908c-2435f77d3c75
backend-1 | 2024/04/04 13:43:27 INFO Starting agent chain session=9b2eea7b-c5eb-4662-908c-2435f77d3c75 userQuery="{Prompt:how much do OpenAI and Microsoft plan to spend on their new datacenter? MaxIterations:30 ModelName:knoopx/hermes-2-pro-mistral:7b-q8_0 Session:9b2eea7b-c5eb-4662-908c-2435f77d3c75}" startTime=2024-04-04T13:43:27.128Z
searxng-1 | 2024-04-04 13:43:48,208 WARNING:searx.engines.qwant: ErrorContext('searx/engines/qwant.py', 191, 'raise SearxEngineAPIException(f"{msg} ({error_code})")', 'searx.exceptions.SearxEngineAPIException', None, ('unknown (27)',)) False
searxng-1 | 2024-04-04 13:43:48,209 ERROR:searx.engines.qwant: exception : unknown (27)
searxng-1 | Traceback (most recent call last):
searxng-1 |   File "/usr/local/searxng/searx/search/processors/online.py", line 163, in search
searxng-1 |     search_results = self._search_basic(query, params)
searxng-1 |                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
searxng-1 |   File "/usr/local/searxng/searx/search/processors/online.py", line 151, in _search_basic
searxng-1 |     return self.engine.response(response)
searxng-1 |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
searxng-1 |   File "/usr/local/searxng/searx/engines/qwant.py", line 151, in response
searxng-1 |     return parse_web_api(resp)
searxng-1 |            ^^^^^^^^^^^^^^^^^^^
searxng-1 |   File "/usr/local/searxng/searx/engines/qwant.py", line 191, in parse_web_api
searxng-1 |     raise SearxEngineAPIException(f"{msg} ({error_code})")
searxng-1 | searx.exceptions.SearxEngineAPIException: unknown (27)
backend-1 | 2024/04/04 13:43:49 Search found 44 Results
backend-1 | 2024/04/04 13:43:49 error from evaluator: no content found
```

nilsherzig commented 5 months ago

when you make a simple and functional project I will be happy

Ey I'm doing a lot of free work over here :(.

Looks like you're still running on an old version of this repo. Please get the current one.

cartergrobinson commented 5 months ago

@Claudioappassionato you can also run `ollama pull all-minilm` on your Ollama server. This will download the model manually. Then restart LLocalSearch and try again. @nilsherzig thanks for all your hard work. Loving this.
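As a sketch, the manual workaround looks like this. It assumes the stack was started with Docker Compose, as the `backend-1` / `chromadb-1` container names in the logs suggest; your service names and deployment may differ.

```shell
# On the machine running the Ollama server: fetch the missing embedding model.
ollama pull all-minilm

# Then restart the LLocalSearch stack so the backend picks the model up.
docker compose down
docker compose up -d
```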

Claudioappassionato commented 5 months ago

when you make a simple and functional project I will be happy

Hey, I'm doing a lot of free work over here :(.

Looks like you're still using an old version of this repository. Please get the current one.

I'm actually using the one that's on display on the page

```
backend-1 | 2024/04/04 18:38:24 error from evaluator: Error adding document: error adding document: model 'all-minilm' not found, try pulling it first
chromadb-1 | INFO: [04-04-2024 18:38:34] 172.19.0.2:41844 - "GET /api/v1/heartbeat HTTP/1.1" 200
chromadb-1 | INFO: [04-04-2024 18:38:34] 172.19.0.2:41844 - "GET /api/v1/version HTTP/1.1" 200
chromadb-1 | INFO: [04-04-2024 18:38:34] 172.19.0.2:41844 - "GET /api/v1/tenants/default_tenant HTTP/1.1" 200
chromadb-1 | INFO: [04-04-2024 18:38:34] 172.19.0.2:41844 - "GET /api/v1/databases/default_database?tenant=default_tenant HTTP/1.1" 200
chromadb-1 | INFO: [04-04-2024 18:38:34] 172.19.0.2:41844 - "GET /api/v1/pre-flight-checks HTTP/1.1" 200
chromadb-1 | INFO: [04-04-2024 18:38:34] 172.19.0.2:41844 - "POST /api/v1/collections HTTP/1.1" 200
backend-1 | Exiting chain with error: model 'all-minilm' not found, try pulling it first
```

```go
package utils

import (
	"context"
	"fmt"
	"log/slog"
	"os"

	"github.com/google/uuid"
	"github.com/ollama/ollama/api"
	"github.com/tmc/langchaingo/llms/ollama"
)

// NewOllamaEmbeddingLLM returns the embedding model used for the vector store.
func NewOllamaEmbeddingLLM() (*ollama.LLM, error) {
	modelName := "all-minilm:v2"
	return NewOllama(modelName)
}

// NewOllama builds a client for the given model against OLLAMA_HOST.
func NewOllama(modelName string) (*ollama.LLM, error) {
	return ollama.New(
		ollama.WithModel(modelName),
		ollama.WithServerURL(os.Getenv("OLLAMA_HOST")),
		ollama.WithRunnerNumCtx(16000),
	)
}

func GetSessionString() string {
	return uuid.New().String()
}

// CheckIfModelExistsOrPull pulls the model if it is not already present locally.
func CheckIfModelExistsOrPull(modelName string) error {
	if err := CheckIfModelExists(modelName); err != nil {
		slog.Warn("Model does not exist, pulling it", "model", modelName)
		if err := OllamaPullModel(modelName); err != nil {
			return err
		}
	}
	return nil
}

// GetOllamaModelList returns the names of all models known to the server.
func GetOllamaModelList() ([]string, error) {
	client, err := api.ClientFromEnvironment()
	if err != nil {
		return nil, err
	}
	models, err := client.List(context.Background())
	if err != nil {
		return nil, err
	}
	modelNames := make([]string, 0)
	for _, model := range models.Models {
		modelNames = append(modelNames, model.Name)
	}
	return modelNames, nil
}

// CheckIfModelExists errors unless requestName exactly matches an installed model.
func CheckIfModelExists(requestName string) error {
	modelNames, err := GetOllamaModelList()
	if err != nil {
		return err
	}
	for _, mn := range modelNames {
		if requestName == mn {
			return nil
		}
	}
	return fmt.Errorf("Model %s does not exist", requestName)
}

func OllamaPullModel(modelName string) error {
	pullReq := api.PullRequest{
		Model:    modelName,
		Insecure: false,
		Name:     modelName,
	}
	client, err := api.ClientFromEnvironment()
	if err != nil {
		return err
	}
	return client.Pull(context.Background(), &pullReq, pullProgressHandler)
}

var lastProgress string

// pullProgressHandler logs pull progress, skipping repeated percentages.
func pullProgressHandler(progress api.ProgressResponse) error {
	percentage := progressPercentage(progress)
	if percentage != lastProgress {
		slog.Info("Pulling model", "progress", percentage)
		lastProgress = percentage
	}
	return nil
}

// progressPercentage computes an integer percentage; the +1 avoids dividing by
// zero before the total size is known.
func progressPercentage(progress api.ProgressResponse) string {
	return fmt.Sprintf("%d", (progress.Completed*100)/(progress.Total+1))
}
```
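Note that `CheckIfModelExists` requires an exact string match against the server's model list, so `all-minilm` and `all-minilm:v2` are different models to this code. Below is a minimal offline sketch of that matching and of the progress arithmetic; the helper names and the model list are stand-ins for illustration, not the real `api` package:

```go
package main

import "fmt"

// checkModelExists mirrors CheckIfModelExists, but takes the model list
// directly so it can run without an Ollama server.
func checkModelExists(requestName string, modelNames []string) error {
	for _, mn := range modelNames {
		if requestName == mn {
			return nil // exact match, tag included
		}
	}
	return fmt.Errorf("Model %s does not exist", requestName)
}

// progressPercentage mirrors the handler's arithmetic; the +1 in the
// denominator guards against division by zero while total is still 0.
func progressPercentage(completed, total int64) string {
	return fmt.Sprintf("%d", (completed*100)/(total+1))
}

func main() {
	installed := []string{"all-minilm:v2", "knoopx/hermes-2-pro-mistral:7b-q8_0"}
	// Fails: the tag must match exactly, "all-minilm" != "all-minilm:v2".
	fmt.Println(checkModelExists("all-minilm", installed))
	fmt.Println(progressPercentage(100, 199))
}
```

This is one reason a stale checkout can trigger the error in this issue: if the backend embeds with a model name the server does not hold under that exact name, every document add fails with "model not found".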