allwefantasy / byzer-llm

Easy, fast, and cheap pretraining, fine-tuning, and serving for everyone
Apache License 2.0

[Help] I'm new to LLM applications. Using a domestic relay for the OpenAI API, I hit an error I can't figure out. #32

Closed — wuxianliang closed this 3 months ago

wuxianliang commented 3 months ago

The relay site claims you can simply replace https://api.openai.com with https://4.0.wokaai.com, with support for gpt-4 and claude3. I verified it works with a desktop client. After installing auto-coder, I deployed the model as follows:

byzerllm deploy --pretrained_model_type saas/official_openai --infer_params saas.api_key=sk-6NwOjBp5PVa7Ucyd62354530Df534607B0B3A3C2420****  saas.model=gpt-3.5-turbo-1106 saas.base_url=https://4.0.wokaai.com --model openai

Output received:

Failed to look up actor with name 'openai'. This could because 1. You are trying to look up a named actor you didn't create. 2. The named actor died. 3. You did not use a namespace matching the namespace of the actor.
Model openai deployed successfully
(UDFWorker pid=66771) MODEL[openai] Init Model,It may take a while.
(UDFWorker pid=66771) MODEL[openai] Successful to init model, time taken:0.04230642318725586s

Then I tried querying the deployment:

byzerllm query --model openai --query "你是谁?"

and got this error:

Command Line Arguments:
--------------------------------------------------
command             : query
ray_address         : auto
model               : openai
query               : 你是谁?
template            : auto
file                : None
output_file         : 
--------------------------------------------------
2024-05-28 22:02:19,149 INFO worker.py:1564 -- Connecting to existing Ray cluster at address: 192.168.1.13:6379...
2024-05-28 22:02:19,159 INFO worker.py:1740 -- Connected to Ray cluster. View the dashboard at 127.0.0.1:8265 
Traceback (most recent call last):
  File "/home/wxl/anaconda3/envs/autocoder/bin/byzerllm", line 8, in <module>
    sys.exit(main())
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/byzerllm_command.py", line 86, in main
    resp = llm_client.chat_oai(model=args.model, conversations=[{
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/utils/client/byzerllm_client.py", line 986, in chat_oai
    res = self._query(model,v) 
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/utils/client/byzerllm_client.py", line 1510, in _query
    raise inst
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/utils/client/byzerllm_client.py", line 1502, in _query
    res = ray.get(worker.async_apply.remote(new_input_value))
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/ray/_private/auto_init_hook.py", line 21, in auto_init_wrapper
    return fn(*args, **kwargs)
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/ray/_private/client_mode_hook.py", line 103, in wrapper
    return func(*args, **kwargs)
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/ray/_private/worker.py", line 2623, in get
    values, debugger_breakpoint = worker.get_objects(object_refs, timeout=timeout)
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/ray/_private/worker.py", line 861, in get_objects
    raise value.as_instanceof_cause()
ray.exceptions.RayTaskError(AttributeError): ray::UDFWorker.async_apply() (pid=66771, ip=192.168.1.13, actor_id=681d9acb74d8f22e2cdf1cb614000000, repr=<pyjava.udf.udf_worker.UDFWorker object at 0x7f8f7d0586a0>)
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/pyjava/udf/udf_worker.py", line 65, in async_apply
    resp = await self.apply_func(self.model, v)
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/utils/text_generator.py", line 183, in simple_predict_func
    v = await llm.async_predict(item)
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/utils/text_generator.py", line 161, in async_predict
    response = await self.model.async_stream_chat(self.tokenizer,
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/saas/official_openai/__init__.py", line 366, in async_stream_chat
    raise e
  File "/home/wxl/anaconda3/envs/autocoder/lib/python3.10/site-packages/byzerllm/saas/official_openai/__init__.py", line 352, in async_stream_chat
    generated_text = response.choices[0].message.content
AttributeError: 'str' object has no attribute 'choices'
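The last frame shows the root cause: `async_stream_chat` received a plain string from the relay instead of a chat-completion object, so indexing `.choices` on it raises `AttributeError`. This happens when `base_url` lacks the `/v1` path segment: the client then POSTs to a path the relay doesn't serve, and the relay answers with a plain-text error body. A minimal sketch of the failure (the string value is a hypothetical relay response, not the actual body returned here):

```python
# Sketch of the failing line in byzerllm/saas/official_openai/__init__.py:
# the relay returned a bare string rather than a ChatCompletion object.
response = "404 page not found"  # hypothetical body from a mis-built URL path

err = None
try:
    generated_text = response.choices[0].message.content
except AttributeError as e:
    err = str(e)

print(err)  # 'str' object has no attribute 'choices'
```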
wuxianliang commented 3 months ago

I just got it working with Kimi's API, so the installation itself is fine.

wuxianliang commented 3 months ago

Solved: the correct deployment follows the Kimi pattern exactly — append v1 to the base_url. GPT and Claude are both deployed this way. OpenAI deployment:

byzerllm deploy --pretrained_model_type saas/official_openai --infer_params saas.api_key=sk-6NwOjBp5PVa7Ucyd62354530Df534607B0B3A3C2420****  saas.model=gpt-3.5-turbo-1106 saas.base_url=https://4.0.wokaai.com/v1 --model openai

Kimi deployment:

byzerllm deploy --pretrained_model_type saas/official_openai --infer_params saas.api_key=sk-CdCHfI6o9DCYtUDqnn4I7n2VINnnPGzXHhlBHJR2cj5****  saas.model=moonshot-v1-128k saas.base_url=https://api.moonshot.cn/v1 --model kimi_chat_128k
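Since OpenAI-compatible clients append paths like `/chat/completions` to the base URL, a small helper can normalize the URL before passing it as `saas.base_url`. A minimal sketch (the helper name is hypothetical, not part of byzerllm):

```python
def normalize_base_url(url: str) -> str:
    """Ensure an OpenAI-compatible base URL ends with /v1,
    since the client appends /chat/completions and similar paths to it."""
    url = url.rstrip("/")
    if not url.endswith("/v1"):
        url += "/v1"
    return url

print(normalize_base_url("https://4.0.wokaai.com"))      # https://4.0.wokaai.com/v1
print(normalize_base_url("https://api.moonshot.cn/v1"))  # https://api.moonshot.cn/v1
```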