THUDM / AgentBench

A Comprehensive Benchmark to Evaluate LLMs as Agents (ICLR'24)
https://llmbench.ai
Apache License 2.0

Connection error #124

StupiddCupid closed this issue 8 months ago

StupiddCupid commented 8 months ago

Hi there, I'd like to add Llama2 as an agent to the config file. However, I got a "Connection error" when trying to check if the agent is configured correctly.

[Screenshot: the connection error output]

My fs_agent.yaml is shown below:

[Screenshot: fs_agent.yaml contents]
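Since the screenshot does not reproduce in text, below is a rough sketch of what a FastChat-backed entry in configs/agents/fs_agent.yaml can look like. The key names (module, model_name, controller_address, max_new_tokens) are illustrative assumptions, not copied from the poster's config; check the FastChatAgent parameters under src/client/agents for the exact fields.

# illustrative sketch only; key names are assumptions, verify against the repo
vicuna-7b:
  module: "src.client.agents.FastChatAgent"
  parameters:
    model_name: "vicuna-7b-v1.5"
    controller_address: "http://localhost:21001"
    max_new_tokens: 512
    temperature: 0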

Does anyone have an idea how to solve this? Thank you.

Longin-Yu commented 8 months ago

Have you deployed a server via FastChat before starting this client?

EYH0602 commented 8 months ago

To run a FastChat server, the following steps should help. Install FastChat (the PyPI package is named fschat) and start the controller:

pip install fschat
python -m fastchat.serve.controller --host 0.0.0.0
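Once the controller is running, you can check that it responds (and, after a worker connects, which models are registered) via its list_models endpoint; this assumes the default controller port 21001:

# returns a JSON list of registered model names (empty until a worker registers)
curl -X POST http://localhost:21001/list_models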

In a different terminal, start a model worker (with vicuna-7b as an example):

python -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5 --host 0.0.0.0
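After the worker finishes loading the weights, you can optionally confirm that it registered with the controller using FastChat's test script (the model name assumes the vicuna-7b-v1.5 worker started above and the default controller address):

python -m fastchat.serve.test_message --model-name vicuna-7b-v1.5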

NOTE: to download the model weights locally (the clone needs Git LFS, otherwise only pointer files are fetched):

mkdir lmsys && cd lmsys
git lfs install
git clone https://huggingface.co/lmsys/vicuna-7b-v1.5
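Alternatively, if a recent huggingface_hub is installed, the weights can be fetched without Git LFS; the --local-dir path below is only an assumption matching the layout above:

pip install -U huggingface_hub
huggingface-cli download lmsys/vicuna-7b-v1.5 --local-dir lmsys/vicuna-7b-v1.5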

Then you can run the AgentBench agent test client against the FastChat agent:

> python -m src.client.agent_test --config configs/agents/fs_agent.yaml --agent vicuna-7b 
512
================= USER  ===================
>>> hi
================ AGENT ====================
Hello! How can I help you today? Is there something you would like to talk about or ask me a question? I'm here to assist you with any information or advice you may need.
================= USER  ===================
>>>
StupiddCupid commented 8 months ago

Solved. Thank you!