ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://lollms.com
Apache License 2.0

webui.bat requesting default model and binding #308

Open HsuChe opened 1 year ago

HsuChe commented 1 year ago

Expected Behavior

The installer should finish and start the web UI at a localhost address, as shown in the YouTube video.

Current Behavior

When I run webui.bat as suggested in the YouTube video, I do not get the expected behavior shown there. Instead, I am prompted for a default model:

Screenshot 2023-06-23 235307

When I press 3 to back out, hoping that it will take me to a localhost address, the following appears. Screenshot 2023-06-23 235400

Steps to Reproduce

Download and use the webui.bat on a windows system.

  1. Download webui.bat for Windows from https://github.com/ParisNeo/lollms-webui/releases/tag/v0.0.9
  2. Run webui.bat from a folder where Python 3.10+ and git are available
  3. Arrive at the current behavior
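Step 2 above implies two prerequisites before launching the batch file. A minimal sketch of that check, assuming only that `python` 3.10+ and `git` must be on PATH (the function names here are illustrative, not part of the lollms-webui repo):

```python
import shutil
import subprocess
import sys


def has_python_310_plus() -> bool:
    """Return True if the interpreter running this script is Python 3.10 or newer."""
    return sys.version_info >= (3, 10)


def has_git() -> bool:
    """Return True if a `git` executable is available on PATH."""
    return shutil.which("git") is not None


def launch_webui(bat_path: str = "webui.bat") -> None:
    """Run the installer batch file after the prerequisite checks (Windows only)."""
    if not (has_python_310_plus() and has_git()):
        raise RuntimeError("webui.bat needs Python 3.10+ and git on PATH")
    subprocess.run(["cmd", "/c", bat_path], check=True)
```

If either check fails, installing Python 3.10+ and Git for Windows first may avoid the behavior described in this issue.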

Possible Solution

I am not sure, but maybe I am downloading the wrong webui.bat file, or I am using the wrong app.py file that references the wrong model folder?

Context

I have tried various interfaces, including oobabooga's webui and GPT4All from Nomic, and was able to get them to work. I am attempting to get GGML models working on my non-M1 system, but llama.cpp doesn't work for me. I was hoping that lollms-webui might help me get GGML working on my Windows system.


PedzacyKapec commented 1 year ago

I get the same nonsense. It doesn't work; I can't even install it. Going back to GPT4All by Nomic AI. Sorry, I can't waste time on this.

ParisNeo commented 1 year ago

I definitely understand. As I said, this is not a commercial product; it is a personal project made for fun, so it is hacky and that's normal. I'm not competing with Nomic AI's tool. They have many devs, with people doing quality checks and so on. I'm basically working with one guy who helps from time to time, and I only work at night or on weekends, so there is no time to check every possible configuration.

Henrique-Miranda commented 1 year ago

Same issue here; I tried to install on Arch Linux. I'm stuck at the menu with the options 1 - Install model and 2 - Change binding, and the server does not start. Trying to solve this issue here.

Henrique-Miranda commented 1 year ago

Just giving some feedback: using the PR for #296, the installer script runs very well but gets stuck at this part, asking for a model. After pressing 3 to back out, the server runs and it is possible to change the model. Using Arch Linux. Screenshot_20230625_151446 Screenshot_20230625_152404

ParisNeo commented 1 year ago

OK, now you can go to the settings view, select a binding (for example pyllamacpp, or any other), then download a model, and apply and save. Then it should work.
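The binding and model selected in the settings view end up in lollms' local configuration. As a sketch only: the file path, key names (`binding_name`, `model_name`), and the model filename below are assumptions, not confirmed by this thread, so verify them against your own install before editing anything by hand:

```python
from pathlib import Path

# Hypothetical config location -- verify against your lollms-webui install.
CONFIG_PATH = Path("configs/local_config.yaml")


def binding_and_model_fragment(binding: str, model: str) -> str:
    """Build a minimal YAML fragment selecting a binding and a model.

    Returns the YAML text; writing it to CONFIG_PATH is left to the caller
    so this sketch stays side-effect free.
    """
    return f"binding_name: {binding}\nmodel_name: {model}\n"


# Example values from this thread (binding) and an illustrative GGML filename.
fragment = binding_and_model_fragment("pyllamacpp", "ggml-model-q4_0.bin")
print(fragment)
```

If the server starts but the menu keeps asking for a model, checking that both keys are set to an installed binding and a downloaded model file is a reasonable first step.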