andrewnguonly / Lumos

A RAG LLM co-pilot for browsing the web, powered by local LLMs
MIT License

Preload Ollama models #174

Closed: andrewnguonly closed this issue 3 months ago

andrewnguonly commented 3 months ago

https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-pre-load-a-model-to-get-faster-response-times
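The linked FAQ entry explains that a model can be preloaded into memory by sending the generate (or chat) endpoint a request containing only the model name, so the first real prompt doesn't pay the model-load cost. A minimal sketch of how Lumos might do this, assuming the default `http://localhost:11434` endpoint; the helper names and the `"5m"` keep-alive default are illustrative, not the extension's actual code:

```typescript
// Default Ollama endpoint; an assumption for this sketch.
const OLLAMA_URL = "http://localhost:11434";

interface PreloadRequest {
  url: string;
  body: string;
}

// Build a preload request: per the Ollama FAQ, a /api/generate call with
// no prompt loads the model without generating tokens. The optional
// keep_alive parameter controls how long the model stays in memory
// (a duration string like "5m", or -1 to keep it loaded indefinitely).
function buildPreloadRequest(model: string, keepAlive: string = "5m"): PreloadRequest {
  return {
    url: `${OLLAMA_URL}/api/generate`,
    body: JSON.stringify({ model, keep_alive: keepAlive }),
  };
}

// Fire-and-forget preload: errors are logged rather than thrown, so the
// caller keeps working even if the Ollama server isn't running yet.
async function preloadModel(model: string): Promise<void> {
  const req = buildPreloadRequest(model);
  try {
    await fetch(req.url, { method: "POST", body: req.body });
  } catch (err) {
    console.warn(`Failed to preload ${model}:`, err);
  }
}
```

Calling something like `preloadModel("llama2")` when the extension's side panel opens would warm the model before the user submits a prompt.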