Are you running Agent-LLM in Docker or local?
Docker.
Sorry, there is some clarification on Docker that I probably need to add to the docs. If you're running Agent-LLM in Docker, it can't reach anything at 127.0.0.1, because that is a loopback address. Docker is like running a computer inside your computer: the container isn't aware of your host machine and treats 127.0.0.1 as referring only to itself.
TL;DR: You need to set your ooba URI in your .env to your machine's local IP instead of 127.0.0.1. For example, if your local IP was 192.168.1.5:
AI_PROVIDER_URI=http://192.168.1.5:7860
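If you want to double-check that the container can actually reach that address, something like this from the host should work (the container name agent-llm is just an example, use whatever docker ps shows, and this assumes curl is available in the image):
docker exec agent-llm curl -I http://192.168.1.5:7860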
Here are the commands to find your IP per OS, if it helps (plus a Linux one-liner after the list).
On Linux, you can get your IP in the terminal; it should be listed under "inet":
ip addr
In Windows, it should be listed under "IPv4 Address":
ipconfig
On Mac, the IP should be listed under "inet" for en0:
ifconfig
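And the promised Linux one-liner: if you just want the address without the rest of the output, this should print it directly on most distros:
hostname -I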
Connection refused when trying to use Agent-LLM with oobabooga
Relevant parts of my .env file:
# =========================
# AI PROVIDER CONFIG
# =========================
AI_PROVIDER=oobabooga
AI_MODEL=vicuna
AI_TEMPERATURE=0.2
MAX_TOKENS=2000
# =========================
# AI PROVIDER: CUSTOM (e.g., Oobabooga, Fastchat, etc.)
# =========================
AI_PROVIDER_URI=http://127.0.0.1:7860
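Per the reply above, with Agent-LLM running in Docker that last line needs the host's LAN IP rather than 127.0.0.1, e.g. (192.168.1.5 is just the example IP from the comment; substitute your own):
AI_PROVIDER_URI=http://192.168.1.5:7860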
Command to start Oobabooga:
call python server.py --model MiniGPT-4-LLaMA-13b-4bit-128g --listen --wbits 4 --groupsize 128 --auto-devices --no-stream
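One note on that command: the --listen flag is what should make the server bind to all interfaces instead of only 127.0.0.1, which is what lets the Dockerized Agent-LLM reach it at your LAN IP. Since the "call" suggests a Windows .bat, a quick way to confirm the binding there (adjust for your OS) might be:
netstat -an | findstr 7860
It should show the port LISTENING on 0.0.0.0:7860 (or your LAN IP) rather than 127.0.0.1:7860.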