CosmosShadow / gptpdf

Using GPT to parse PDF
MIT License
2.69k stars 207 forks source link

Does not work with `qwen-vl-max`: unsupported param `temperature` is passed #24

Closed youzipi closed 1 month ago

youzipi commented 1 month ago

It does not work with `qwen-vl-max` because an unsupported parameter, `temperature`, is passed.

code:

```python
from GeneralAgent import Agent

models = [
    ('qwen-vl-max', 6000, 'sk-0xxx', 'https://dashscope.aliyuncs.com/compatible-mode/v1'),
]

for model, token_limit, api_key, base_url in models:
    agent = Agent('You are a helpful agent.', model=model, token_limit=token_limit,
                  api_key=api_key, base_url=base_url)
    agent.user_input('介绍一下成都')  # "Tell me about Chengdu"
```

error:

```
openai.BadRequestError: Error code: 400 - {'error': {'code': 'invalid_parameter', 'param': None, 'message': "'temperature' is not support for vl model now", 'type': 'invalid_request_error'}}
```

error location:

`GeneralAgent.skills.llm_inference.llm_inference`

(screenshot attached in the original issue)
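Until the library stops sending `temperature` to vision-language models, one workaround is to filter it out of the request kwargs before calling the OpenAI-compatible endpoint. The sketch below is illustrative only: `build_request_kwargs` and `UNSUPPORTED_VL_PARAMS` are hypothetical names, not part of GeneralAgent or the DashScope API.

```python
# Hypothetical helper: drop parameters that qwen-vl models reject
# (the 400 error above says 'temperature' is not supported for vl models).
UNSUPPORTED_VL_PARAMS = {"temperature"}

def build_request_kwargs(model: str, **kwargs) -> dict:
    """Strip unsupported params for vl models; pass everything else through."""
    if "vl" in model:
        kwargs = {k: v for k, v in kwargs.items() if k not in UNSUPPORTED_VL_PARAMS}
    return {"model": model, **kwargs}

print(build_request_kwargs("qwen-vl-max", temperature=0.2, max_tokens=1000))
# 'temperature' is removed for the vl model; 'max_tokens' is kept
```

The resulting dict could then be splatted into `client.chat.completions.create(**kwargs)`.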
zRzRzRzRzRzRzR commented 1 month ago

Already fixed, in PR https://github.com/CosmosShadow/GeneralAgent/pull/11

CosmosShadow commented 1 month ago

Update to `gptpdf==0.0.13`.

eruca commented 1 month ago

I have upgraded to version 0.0.13; using `parse_pdf` still gives this error:

```
2024-07-12 12:06:53,818 - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 400 Bad Request"
2024-07-12 12:06:53,822 - ERROR - Error code: 400 - {'error': {'code': 'invalid_parameter_error', 'param': None, 'message': 'Range of max_tokens should be [1, 1500]', 'type': 'invalid_request_error'}, 'id': 'chatcmpl-d479aeb6-489a-9d6f-8ed9-164a106146e8'}
```
CosmosShadow commented 1 month ago

> I have upgraded to version 0.0.13; using `parse_pdf` still gives this error:
>
> ```
> 2024-07-12 12:06:53,818 - INFO - HTTP Request: POST https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions "HTTP/1.1 400 Bad Request"
> 2024-07-12 12:06:53,822 - ERROR - Error code: 400 - {'error': {'code': 'invalid_parameter_error', 'param': None, 'message': 'Range of max_tokens should be [1, 1500]', 'type': 'invalid_request_error'}, 'id': 'chatcmpl-d479aeb6-489a-9d6f-8ed9-164a106146e8'}
> ```

This error indicates that your `max_tokens` value exceeds the allowed limit.
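The 400 response says the endpoint only accepts `max_tokens` in [1, 1500], while the repro above passes a `token_limit` of 6000. A minimal client-side workaround is to clamp the value before sending the request; `clamp_max_tokens` below is a hypothetical helper, not an API from gptpdf or GeneralAgent.

```python
def clamp_max_tokens(requested: int, limit: int = 1500) -> int:
    """Clamp max_tokens into the [1, limit] range the endpoint enforces.

    The DashScope error above reports qwen-vl-max currently requires
    max_tokens in [1, 1500]; the limit may change, so it is a parameter.
    """
    return max(1, min(requested, limit))

print(clamp_max_tokens(6000))  # 1500 -- the 6000 used above would be rejected
```

Note that clamping changes behavior (responses may be truncated at 1500 tokens); the alternative is simply to configure a smaller `token_limit` for this model.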