Skylarking opened this issue 2 months ago
Thanks for reaching out. It seems that the action is not generated correctly. The action should be ["open_feishu_assistant_app()"] instead of ["action(open_feishu_assistant_app())"]. Maybe you need to modify the prompt to emphasize this. It seems that Qwen cannot strictly follow the instructions and output the answer in the correct format, which is a common issue for small models.
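For reference, a small post-processing step on the client side can also recover the expected format when the model wraps the call in "action(...)". This is just a minimal sketch; the helper name normalize_action is illustrative and not part of Cradle:

```python
import re

def normalize_action(action: str) -> str:
    """Strip a spurious 'action(...)' wrapper, e.g.
    'action(open_feishu_assistant_app())' -> 'open_feishu_assistant_app()'."""
    match = re.fullmatch(r"action\((.*)\)", action.strip())
    return match.group(1) if match else action.strip()

# Both forms normalize to the expected ["open_feishu_assistant_app()"]
raw_actions = ["action(open_feishu_assistant_app())"]
actions = [normalize_action(a) for a in raw_actions]
print(actions)  # ['open_feishu_assistant_app()']
```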
Because the OpenAI API is banned in CN, I use the Qwen API provided by Alibaba as my LLM provider, but some errors occur on the command line. Is the provider not good enough to reason correctly?
I am also trying to use the Qwen API. I simply changed the api_key and base_url, but it doesn't seem to work. Would you tell me what you did to adapt to the Qwen API?
I am testing Stardew Valley with the 'qwen-vl-max' model, but after the character comes out of the house there are no further actions, and the task of clearing the farm is never completed. It seems the model's capability might be insufficient? (By the way, during the test, 196 queries consumed 170,000 tokens, so cost could also be a significant issue.)
The issues I have run into so far, for reference:
1. Modify api_key and base_url:
```python
self.client = OpenAI(
    api_key="sk-",  # if you have not configured the environment variable, replace this with your API Key
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",  # base_url for the DashScope service
)
```
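Before running the full pipeline, a standalone call through the same OpenAI-compatible endpoint is a quick way to confirm the api_key and base_url work. This is a rough sketch; the model name just follows the qwen-vl-max test mentioned above:

```python
from openai import OpenAI

client = OpenAI(
    api_key="sk-",  # replace with your DashScope API Key
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)

# Minimal sanity check: if this prints a reply, the key and endpoint are fine.
response = client.chat.completions.create(
    model="qwen-vl-max",
    messages=[{"role": "user", "content": "Reply with the single word: ok"}],
)
print(response.choices[0].message.content)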
Because the Qwen API is similar to the OpenAI API, I just copied cradle/provider/llm/openai.py as qwen.py and modified some of the code, then registered qwen.py in llm_factory.py. Note that you need to read the Qwen docs.
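Roughly speaking, the provider ends up as a thin copy of the OpenAI one pointing at the DashScope endpoint. Below is only a minimal sketch: the class name QwenProvider, the method name create_completion, and the factory registration are illustrative guesses and need to be matched to the real interfaces in cradle/provider/llm/openai.py and llm_factory.py:

```python
# qwen.py -- hypothetical, adapted from cradle/provider/llm/openai.py
from openai import OpenAI


class QwenProvider:
    """Qwen via DashScope's OpenAI-compatible endpoint (illustrative)."""

    def __init__(self, api_key: str, model: str = "qwen-vl-max"):
        self.model = model
        self.client = OpenAI(
            api_key=api_key,
            base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
        )

    def create_completion(self, messages: list[dict]) -> str:
        # Same call shape as the OpenAI provider; only base_url and model differ.
        response = self.client.chat.completions.create(
            model=self.model,
            messages=messages,
        )
        return response.choices[0].message.content


# llm_factory.py -- hypothetical registration, mirroring the existing OpenAI entry
# elif provider_name == "qwen":
#     provider = QwenProvider(api_key=config.api_key)
```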
If I have time, I will share my code on GitHub so you can use it as a reference for modifying your own.