Krisseck / Phrasing-Bot

A bot that checks your grammar and phrasing using an LLM of your choice
MIT License

[error] There is no model to select #1

Open · andreiramani opened 5 months ago

andreiramani commented 5 months ago

My local LLM server is text-generation-webui from https://github.com/oobabooga/text-generation-webui.

I have already succeeded in connecting phrasing-bot to text-generation-webui. The phrasing-bot frontend starts fine with yarn start, but unfortunately there are no models to select (screenshot: model_missing).

After clicking the 'model' menu, yarn suddenly disconnects and gives the following error (screenshot: yarn_error_model).

For reference, I'm using the model Meta-Llama-3-8B-Instruct.Q5_K_S.gguf, and at the same time this model runs fine in my text-generation-webui.

Any suggestion on how to define/access the model from phrasing-bot (or maybe there is a wrong configuration in my .env file)?

This is the content of my .env file:

PORT=5000
REQUEST_TYPE=custom
CUSTOM_API_BASE_URL=127.0.0.1:5001
CUSTOM_API_KEY=somekey
CUSTOM_API_MODEL=Meta-Llama-3-8B-Instruct.Q5_K_S
CUSTOM_API_PROMPT_TEMPLATE=LLAMA_3

(my setup: Win11 Pro, Node 20.14, Yarn 1.22.22, text-generation-webui, Nvidia RTX 3050 6GB, RAM 32GB, Ryzen 7 gen7)

Krisseck commented 5 months ago

CUSTOM_API_BASE_URL=127.0.0.1:5001

The CUSTOM_API_BASE_URL field needs to be a full URL ending in /v1. Try this:

CUSTOM_API_BASE_URL=http://127.0.0.1:5001/v1
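With that change, your full .env would look roughly like this (a sketch assuming the other values stay exactly as you posted them, and that somekey is whatever key your backend expects, if any):

# Phrasing-Bot frontend/backend port (unchanged)
PORT=5000
# use the custom / OpenAI-compatible request type (unchanged)
REQUEST_TYPE=custom
# full URL including the scheme and the /v1 path
CUSTOM_API_BASE_URL=http://127.0.0.1:5001/v1
CUSTOM_API_KEY=somekey
CUSTOM_API_MODEL=Meta-Llama-3-8B-Instruct.Q5_K_S
CUSTOM_API_PROMPT_TEMPLATE=LLAMA_3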

And make sure you start text-generation-webui with the --api flag, as described here: https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API
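As a quick sanity check (a sketch, not something from the repo docs): once text-generation-webui is running with --api and a model is loaded, its OpenAI-compatible endpoint should answer a plain HTTP request, for example:

# hypothetical check; the port 5001 is assumed to match CUSTOM_API_BASE_URL
curl http://127.0.0.1:5001/v1/models

If that returns a JSON list that includes Meta-Llama-3-8B-Instruct.Q5_K_S, the API side is reachable and phrasing-bot should be able to pick the model up from the same base URL.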