Which parameters exactly are inconsistent? For model inference, the temperature parameter is supported.
Model service URL: http://xxx.com.cn:9316/multimodal/glm4v
Request parameters: { "query": "xxx", "image_path": "/data/ee_22_1714340162.jpeg", "history": [] }
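For reference, a curl check of this service is equivalent to the short Python sketch below; the URL, payload fields, and "xxx" placeholders are copied from the report as-is and are not part of evalscope:

```python
import requests

# Custom glm4v service and payload exactly as described in the report.
url = "http://xxx.com.cn:9316/multimodal/glm4v"
payload = {
    "query": "xxx",
    "image_path": "/data/ee_22_1714340162.jpeg",
    "history": [],
}

# A plain POST confirms the endpoint itself responds, independent of the eval script.
resp = requests.post(url, json=payload, timeout=60)
print(resp.status_code, resp.text)
```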
The endpoint responds fine via curl, but example_eval_vlm_swift.py hangs when I run it:
2024-08-13 16:05:49,460 - evalscope - INFO - *** Run task with config: Arguments(data=['MMBench_TEST_CN'], model=['glm4v'], nframe=8, pack=False, use_subtitle=False, work_dir='output', mode='all', nproc=16, retry=None, judge=None, verbose=False, ignore=False, rerun=False, limit=2, OPENAI_API_KEY='EMPTY', OPENAI_API_BASE=None, LOCAL_LLM=None)
2024-08-13 16:05:49,944 - ChatAPI - INFO - BaseAPI received the following kwargs: {'name': 'CustomAPIModel', 'type': 'glm4v'}
2024-08-13 16:05:49,945 - ChatAPI - INFO - Will try to use them as kwargs for generate
.
2024-08-13 16:05:49,945 - ChatAPI - INFO - Using API Base: http://xxx/multimodal/glm4v; API Key: EMPTY
2024-08-13 16:05:49,945 - ChatAPI - INFO - BaseAPI received the following kwargs: {'name': 'CustomAPIModel', 'type': 'glm4v'}
2024-08-13 16:05:49,945 - ChatAPI - INFO - Will try to use them as kwargs for generate
.
2024-08-13 16:05:49,945 - ChatAPI - INFO - Using API Base: http://xxxn:9316/multimodal/glm4v; API Key: EMPTY
Processing ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ -:--:-- 0/2 0%
Currently, only endpoints that follow the OpenAI API format are supported.
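One way to bridge the gap is a thin adapter that exposes the custom service behind an OpenAI-style /v1/chat/completions route. The sketch below is only an illustration: the upstream field names (query, image_path, history) come from the report, while the route, the "response" field of the upstream reply, and the base64 image handling are assumptions, not evalscope or glm4v APIs.

```python
# Rough adapter sketch (not part of evalscope): wrap the custom glm4v service
# in an OpenAI-compatible chat/completions route so the evaluator can call it.
import base64, tempfile, time

import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
UPSTREAM = "http://xxx.com.cn:9316/multimodal/glm4v"  # custom service from the report


@app.post("/v1/chat/completions")
def chat_completions():
    body = request.get_json(force=True)
    query_parts, image_path = [], None

    # Assumption: the client sends OpenAI-style messages, possibly with base64 image_url parts.
    for msg in body.get("messages", []):
        content = msg.get("content", "")
        parts = content if isinstance(content, list) else [{"type": "text", "text": content}]
        for part in parts:
            if part.get("type") == "text":
                query_parts.append(part["text"])
            elif part.get("type") == "image_url":
                url = part["image_url"]["url"]
                if url.startswith("data:"):  # base64 data URL -> temp file, since upstream wants a path
                    data = base64.b64decode(url.split(",", 1)[1])
                    f = tempfile.NamedTemporaryFile(suffix=".jpeg", delete=False)
                    f.write(data)
                    f.close()
                    image_path = f.name
                else:
                    image_path = url

    upstream_payload = {"query": "\n".join(query_parts), "image_path": image_path, "history": []}
    reply = requests.post(UPSTREAM, json=upstream_payload, timeout=120).json()

    # Assumption: the upstream reply carries its text in a "response" field.
    answer = reply.get("response", str(reply)) if isinstance(reply, dict) else str(reply)
    return jsonify({
        "id": "chatcmpl-adapter",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": body.get("model", "glm4v"),
        "choices": [{"index": 0,
                     "message": {"role": "assistant", "content": answer},
                     "finish_reason": "stop"}],
    })


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```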
The locally deployed model's parameters differ from the defaults; how can I specify those parameters when making the call?
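Assuming the model is reachable through an OpenAI-compatible endpoint (for example via an adapter like the one sketched above), generation parameters such as temperature are simply per-request fields; the base_url and model name below are placeholders:

```python
# Minimal sketch with the OpenAI Python SDK (v1): temperature is passed per request
# and overrides the server-side default for that call.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

resp = client.chat.completions.create(
    model="glm4v",
    messages=[{"role": "user", "content": "Describe the image."}],
    temperature=0.2,
    max_tokens=512,
)
print(resp.choices[0].message.content)
```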