MetaGLM / glm-cookbook

Examples and guides for using the GLM APIs
https://open.bigmodel.cn/
Apache License 2.0

glm-4v: invalid max_tokens parameter. Please check the documentation. #24

Closed Philemonade closed 4 months ago

Philemonade commented 4 months ago

System Info / 系統信息

glm-4v

Information / 问题信息

Reproduction / 复现过程

```python
# Setup not shown in the original snippet: the ZhipuAI client and
# PROMPT_MESSAGES are defined earlier in video_understanding.py.
from zhipuai import ZhipuAI

client = ZhipuAI(api_key="...")  # key elided

params = {
    "model": "glm-4v",
    "messages": PROMPT_MESSAGES,
    "max_tokens": 20000,
    "temperature": 0.9,
    "top_p": 1,
}

result = client.chat.completions.create(**params)
print(result.choices[0].message.content)
```

top_p: the valid range is the open interval (0.0, 1.0); it cannot be equal to 0 or 1.

```
Traceback (most recent call last):
  File "/data/txy/glm-cookbook/vision/video_understanding.py", line 49, in <module>
    result = client.chat.completions.create(**params)
  File "/apprun/anaconda3/envs/glm4/lib/python3.10/site-packages/zhipuai/api_resource/chat/completions.py", line 73, in create
    return self._post(
  File "/apprun/anaconda3/envs/glm4/lib/python3.10/site-packages/zhipuai/core/_http_client.py", line 595, in post
    return cast(ResponseT, self.request(cast_type, opts, stream=stream, stream_cls=stream_cls))
  File "/apprun/anaconda3/envs/glm4/lib/python3.10/site-packages/zhipuai/core/_http_client.py", line 363, in request
    return self._request(
  File "/apprun/anaconda3/envs/glm4/lib/python3.10/site-packages/zhipuai/core/_http_client.py", line 450, in _request
    raise self._make_status_error(err.response) from None
zhipuai.core._errors.APIRequestFailedError: Error code: 400, with error text {"error":{"code":"1214","message":"max_tokens参数非法。请检查文档。"}}
```

(The error message translates to: "Invalid max_tokens parameter. Please check the documentation.")

Expected behavior / 期待表现

What is the correct max_tokens limit for glm-4v?

zRzRzRzRzRzRzR commented 4 months ago

max_tokens must be less than 8192.
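Putting the maintainer's reply together with the top_p note from the traceback, a small pre-flight check like the one below can catch both problems before the request is sent. The limits encoded here (max_tokens < 8192, top_p strictly inside (0.0, 1.0)) are taken from this thread, not from official documentation, so treat this as a sketch:

```python
def validate_glm4v_params(params: dict) -> list[str]:
    """Return a list of problems with a glm-4v request, based on the
    constraints mentioned in this issue thread:
      - max_tokens must be less than 8192
      - top_p must lie strictly inside the open interval (0.0, 1.0)
    """
    problems = []
    max_tokens = params.get("max_tokens")
    if max_tokens is not None and max_tokens >= 8192:
        problems.append("max_tokens must be less than 8192")
    top_p = params.get("top_p")
    if top_p is not None and not (0.0 < top_p < 1.0):
        problems.append("top_p must be in the open interval (0.0, 1.0)")
    return problems


# The reporter's original parameters fail both checks:
bad = {"model": "glm-4v", "max_tokens": 20000, "temperature": 0.9, "top_p": 1}
print(validate_glm4v_params(bad))

# A corrected version passes (values are illustrative, within the stated limits):
good = {"model": "glm-4v", "max_tokens": 8191, "temperature": 0.9, "top_p": 0.99}
print(validate_glm4v_params(good))
```

Running the check before `client.chat.completions.create(**params)` avoids the round trip that produces error 1214.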