OpenBMB / ChatDev

Create Customized Software using Natural Language Idea (through LLM-powered Multi-Agent Collaboration)
https://arxiv.org/abs/2307.07924
Apache License 2.0

stop should either be excluded, set to [] or utilized. Setting it to null caused errors #344

Open foxbg opened 5 months ago

foxbg commented 5 months ago

In the request, stop is set to null. stop should either be excluded, set to [], or actually utilized.

{"messages": [{"role": "system", "content": "ChatDev is a software company powered by multiple intelligent agents, such as chief executive officer, chief human resources officer, chief product officer, chief technology officer, etc, with a multi-agent organizational structure and the mission of 'changing the digital world through programming'.\n......"}, {"role": "user", "content": "ChatDev ..."}], "model": "gpt-3.5-turbo-16k", "frequency_penalty": 0.0, "logit_bias": {}, "max_tokens": 15821, "n": 1, "presence_penalty": 0.0, "stop": null, "stream": false, "temperature": 0.2, "top_p": 1.0, "user": ""}HTTP/1.0 200 OK

Ref: https://github.com/LostRuins/koboldcpp/issues/643
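A minimal client-side workaround (a sketch, assuming the request body is built as a Python dict like the one above; this is not ChatDev's actual code) is to drop the key entirely when it would serialize to null:

```python
# Sketch: omit "stop" from the request body when it is None, since some
# OpenAI-compatible backends (koboldcpp here) mishandle a JSON null.
payload = {
    "messages": [{"role": "user", "content": "ChatDev ..."}],
    "model": "gpt-3.5-turbo-16k",
    "max_tokens": 1024,
    "stop": None,  # what currently goes out as "stop": null
}

if payload.get("stop") is None:
    del payload["stop"]  # exclude the key instead of sending "stop": null
```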

LostRuins commented 5 months ago

Also "max_tokens": 15821 is not a good idea. It should ideally be half or less of your maximum context length, 1k is a good value.

thinkwee commented 2 months ago

Could you please try the latest version of ChatDev and provide more background information? I don't know the relationship between ChatDev and koboldcpp.

foxbg commented 1 month ago

Hi,

Here is what I get with the latest version:

```
....
"model": "gpt-3.5-turbo-16k", "frequency_penalty": 0.0, "logit_bias": {}, "max_tokens": 15654, "n": 1, "presence_penalty": 0.0, "stop": null, "stream": false, "temperature": 0.2, "top_p": 1.0, "user": ""}
....
Processing Prompt [BLAS] (729 / 729 tokens)
Generating (6 / 15654 tokens)
(EOS token triggered! ID:2)
CtxLimit: 736/16384, Process:8.03s (11.0ms/T = 90.77T/s), Generate:4.20s (699.8ms/T = 1.43T/s), Total:12.23s (0.49T/s)
must be str, not NoneType
```

The relationship is that koboldcpp is AI text-generation software for GGML and GGUF models, built on llama.cpp. I'm using it to run ChatDev with a local model. As noted above, the issue is most probably caused by "stop": null.
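"must be str, not NoneType" is the kind of TypeError raised when string handling runs on a stop value that is None. A defensive normalization on the receiving side (a sketch of the general fix, not code from either project) would be:

```python
# Sketch: normalize "stop" so downstream string handling never sees None.
def normalize_stop(stop):
    if stop is None:
        return []          # treat null the same as "no stop sequences"
    if isinstance(stop, str):
        return [stop]      # a bare string becomes a one-element list
    return [s for s in stop if isinstance(s, str)]  # drop stray nulls in lists
```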

LostRuins commented 1 month ago

Where is the prompt? I don't see you sending any prompt.