microsoft / TaskWeaver

A code-first agent framework for seamlessly planning and executing data analytics tasks.
https://microsoft.github.io/TaskWeaver/
MIT License

Not able to run TaskWeaver with LLM Qwen1.5-72B-Chat #335

Closed Haxeebraja closed 4 months ago

Haxeebraja commented 4 months ago

Not able to run TaskWeaver with a locally hosted Qwen1.5-72B-Chat. TaskWeaver worked fine with Qwen-72B-Chat.

Getting error: Exception: OpenAI API request was invalid: Error code: 400 - {'object': 'error', 'message': 'top_p must be in (0, 1], got 0.0.', 'type': 'BadRequestError', 'param': None, 'code': 400}

Qwen is hosted using vLLM: python -m vllm.entrypoints.openai.api_server --served-model-name Qwen1.5-72B-Chat --model Qwen/Qwen1.5-72B-Chat

TaskWeaver config file: { "llm.api_base": "http://172.17.0.8:8283/v1", "llm.api_key": "Null", "llm.model": "Qwen1.5-72B-Chat", "execution_service.kernel_mode": "local" }

The following call works fine with ChatOpenAI in another application: model = ChatOpenAI(model_name='Qwen1.5-72B-Chat', base_url="http://172.17.0.8:8283/v1/", api_key="EMPTY", temperature=0)

liqul commented 4 months ago

The error message says that top_p must be in (0, 1], so you need to configure a valid value. In TaskWeaver, you can do this by setting llm.openai.top_p in the config file, as in the sketch below.
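For example, a minimal sketch based on the config posted above (the value 0.9 is only an illustration; any value in (0, 1] should pass the check):

{
  "llm.api_base": "http://172.17.0.8:8283/v1",
  "llm.api_key": "Null",
  "llm.model": "Qwen1.5-72B-Chat",
  "llm.openai.top_p": 0.9,
  "execution_service.kernel_mode": "local"
}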

Haxeebraja commented 4 months ago

Setting llm.openai.top_p, together with setting the response format to text, made it work.
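For reference, a sketch of the resulting config, assuming the response format is controlled by the llm.response_format key and using 0.9 as an illustrative top_p value in (0, 1]:

{
  "llm.api_base": "http://172.17.0.8:8283/v1",
  "llm.api_key": "Null",
  "llm.model": "Qwen1.5-72B-Chat",
  "llm.openai.top_p": 0.9,
  "llm.response_format": "text",
  "execution_service.kernel_mode": "local"
}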