ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://parisneo.github.io/lollms-webui/
Apache License 2.0

Unable to switch to any models installed outside of LOLLMS #529

Open profucius opened 2 months ago

profucius commented 2 months ago

Expected Behavior

Ability to change models that were installed to Ollama outside of LOLLMS.

Current Behavior

I have installed models into my Ollama instance from other sources: some directly within Ollama, some from another AI chatbot app. LOLLMS sees that these are installed, but labels them as "custom model"s and does not let me select any of them to switch to.

When I click any of these "custom model"s, the UI has a faded overlay (as if it is trying to load). When I look at the CMD window, it says this:

INFO: ::1:6408 - "POST /update_setting HTTP/1.1" 400 Bad Request

I can't tell if this is one or two separate issues.
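For anyone triaging this, the failing request can be reproduced outside the web UI. This is only a minimal sketch: it assumes the default lollms-webui port (9600) and a JSON body with "setting_name"/"setting_value" keys, which is an assumption about the payload shape and not confirmed anywhere in this thread.

```python
# Hypothetical reproduction of the failing request, outside the web UI.
# Port 9600 and the payload keys are assumptions, not confirmed here.
import requests

resp = requests.post(
    "http://localhost:9600/update_setting",
    json={"setting_name": "model_name", "setting_value": "llama3:latest"},
    timeout=10,
)
print(resp.status_code)  # 400 Bad Request when the model name is rejected
print(resp.text)         # the response body may say why it was rejected
```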

Steps to Reproduce

I don't know how to explain how to reproduce it beyond what I did: a fresh install, then installing some models outside of LOLLMS. Try installing any specific model directly within Ollama, such as a "Llama 3" variant. You could also try the "Msty" app, which is here on GitHub; it's a Windows chat manager similar to LOLLMS. I installed some models using that app, with the same result as installing directly within Ollama.
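If it helps with triage, the exact model names Ollama exposes (including the colon-separated tags that LOLLMS shows as "custom model") can be listed through Ollama's local HTTP API. A small sketch, assuming Ollama is running on its default port 11434:

```python
# List the model names Ollama reports, e.g. "mistral:latest", "llama3:latest".
# Assumes a local Ollama instance on the default port 11434.
import requests

tags = requests.get("http://localhost:11434/api/tags", timeout=10).json()
for model in tags.get("models", []):
    print(model["name"])
```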

Possible Solution

I don't know what to suggest, other than perhaps providing more context in the error message.

Context

This is a fresh install of LOLLMS; I configured nothing. The one model (mistral) that the UI shows as properly installed works fine when I start a chat with it. I installed mistral outside of LOLLMS as well; none of the models were installed from within LOLLMS.

Screenshots

(Two screenshots attached to the original issue.)

ParisNeo commented 2 months ago

This should have been fixed by now. It was a problem with my sanitization function, which forbade : in the model name.
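For context: Ollama model names usually carry a colon-separated tag (e.g. llama3:latest), so a sanitizer that rejects : will refuse every externally installed model. The following is an illustrative sketch of that behavior and the relaxed alternative, not the actual lollms sanitization code:

```python
import re

# Illustrative only -- not the real lollms function.
def sanitize_model_name_strict(name: str) -> bool:
    # Behavior described above: any colon makes the name invalid,
    # so Ollama-style names like "llama3:latest" are rejected.
    return re.fullmatch(r"[A-Za-z0-9._-]+", name) is not None

def sanitize_model_name_relaxed(name: str) -> bool:
    # Allow a single optional ":tag" suffix, as Ollama uses.
    return re.fullmatch(r"[A-Za-z0-9._-]+(:[A-Za-z0-9._-]+)?", name) is not None

print(sanitize_model_name_strict("llama3:latest"))   # False -> shown as "custom model"
print(sanitize_model_name_relaxed("llama3:latest"))  # True
```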

profucius commented 2 months ago

This should have been fixed by now. It was a problem with my sanitization function, which forbade : in the model name.

Interesting. Should I try reinstalling? I'll try what you suggest. I'm capable of debugging if needed.

hostolis commented 1 month ago

I'm getting similar problems. Fresh install, nothing configured manually. (Screenshot attached to the original comment.)

ParisNeo commented 1 month ago

Hey, did you try refreshing your page so the cache gets updated? lollms uses hard caching, so each time you update it you need to refresh the webui.