InternLM / lmdeploy

LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
https://lmdeploy.readthedocs.io/en/latest/
Apache License 2.0

[Feature] Do multimodal models support online serving? #1762

Closed CSEEduanyu closed 6 days ago

CSEEduanyu commented 3 weeks ago

Motivation

[Feature] Do multimodal models support online serving?

Related resources

No response

Additional context

No response

lvhan028 commented 3 weeks ago

Yes, it is supported. The documentation is here: https://lmdeploy.readthedocs.io/en/latest/serving/api_server_vl.html

CSEEduanyu commented 2 weeks ago

> Yes, it is supported. The documentation is here: https://lmdeploy.readthedocs.io/en/latest/serving/api_server_vl.html

Is there a curl example that passes a base64-encoded image? I tried sending base64 but couldn't get it to work, even though the code comments say base64 is accepted.

lvhan028 commented 2 weeks ago

@AllentDan, could you provide an example?

CSEEduanyu commented 2 weeks ago

> @AllentDan, could you provide an example?

It works now. But why is /v1/chat/interactive the only endpoint that accepts image_url? And that endpoint has no max_tokens field...

AllentDan commented 2 weeks ago

> @AllentDan, could you provide an example?
>
> It works now. But why is /v1/chat/interactive the only endpoint that accepts image_url? And that endpoint has no max_tokens field...

It has request_output_len. Images can also be passed through the OpenAI-compatible interface; see the OpenAI documentation: https://platform.openai.com/docs/api-reference/chat/create
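For reference, a minimal sketch of a /v1/chat/interactive request body, assuming the field names discussed above (image_url and request_output_len, alongside a prompt field; the server address is a placeholder):

```python
import json

# Hypothetical sketch of a /v1/chat/interactive request body.
# `request_output_len` caps the number of generated tokens (this
# endpoint's counterpart to max_tokens); `image_url` points at the
# input image.
payload = {
    'prompt': 'Describe the image please',
    'image_url': 'https://raw.githubusercontent.com/open-mmlab/mmdeploy/main/tests/data/tiger.jpeg',
    'request_output_len': 512,
}
body = json.dumps(payload)
# POST `body` to http://0.0.0.0:23333/v1/chat/interactive with
# Content-Type: application/json (via curl, requests, etc.).
print(body)
```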

CSEEduanyu commented 2 weeks ago

> @AllentDan, could you provide an example?
>
> It works now. But why is /v1/chat/interactive the only endpoint that accepts image_url? And that endpoint has no max_tokens field...
>
> It has request_output_len. Images can also be passed through the OpenAI-compatible interface; see the OpenAI documentation: https://platform.openai.com/docs/api-reference/chat/create

The CompletionRequest for /v1/completions has no image_url parameter; I tried that endpoint and it doesn't work.

AllentDan commented 2 weeks ago

Please read the documentation carefully: use /v1/chat/completions.

CSEEduanyu commented 2 weeks ago

> Please read the documentation carefully: use /v1/chat/completions.

/v1/chat/completions doesn't have an image_url field either: https://github.com/InternLM/lmdeploy/blob/fbd294adaef24dd98aeaf99f9b2246167c723451/lmdeploy/serve/openai/protocol.py#L81

CSEEduanyu commented 2 weeks ago

https://github.com/InternLM/lmdeploy/blob/fbd294adaef24dd98aeaf99f9b2246167c723451/lmdeploy/serve/openai/protocol.py#L294 Of all the schemas in protocol.py, only this one endpoint has an image_url field, which doesn't match the documentation.

AllentDan commented 2 weeks ago

https://lmdeploy.readthedocs.io/en/latest/serving/api_server_vl.html#integrate-with-openai

Please read the documentation carefully; the image goes inside messages:

```python
from openai import OpenAI

client = OpenAI(api_key='YOUR_API_KEY', base_url='http://0.0.0.0:23333/v1')
model_name = client.models.list().data[0].id
response = client.chat.completions.create(
    model=model_name,
    messages=[{
        'role': 'user',
        # Multimodal content: a list mixing text parts and image_url parts.
        'content': [{
            'type': 'text',
            'text': 'Describe the image please',
        }, {
            'type': 'image_url',
            'image_url': {
                'url': 'https://raw.githubusercontent.com/open-mmlab/mmdeploy/main/tests/data/tiger.jpeg',
            },
        }],
    }],
    temperature=0.8,
    top_p=0.8)
print(response)
```

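Regarding the earlier base64 question: in the OpenAI-compatible format, a base64-encoded image is sent as a data: URL in the same image_url slot as an http URL. A minimal sketch, assuming the standard OpenAI content layout (the image bytes below are a stand-in for reading a real file):

```python
import base64

# Stand-in for open('image.jpg', 'rb').read() on a real image file.
image_bytes = b'\xff\xd8\xff\xe0fake-jpeg-bytes'
b64 = base64.b64encode(image_bytes).decode('utf-8')

# The data: URL goes in messages[...]['content'][...]['image_url']['url'],
# exactly where the http URL appears in the example above.
image_content = {
    'type': 'image_url',
    'image_url': {'url': f'data:image/jpeg;base64,{b64}'},
}
print(image_content['image_url']['url'][:30])
```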
github-actions[bot] commented 1 week ago

This issue is marked as stale because it has been marked as invalid or awaiting response for 7 days without any further response. It will be closed in 5 days if the stale label is not removed or if there is no further response.

github-actions[bot] commented 6 days ago

This issue is closed because it has been stale for 5 days. Please open a new issue if you have similar issues or you have any new updates now.