yunhaoli24 opened 1 year ago
```python
choice_data = ChatCompletionResponseStreamChoice(
    index=0, delta=DeltaMessage(content=new_text), finish_reason=None
)
chunk = ChatCompletionResponse(
    model=model_id, choices=[choice_data], object="chat.completion.chunk"
)
```
There is a problem with this code: it raises an error when the streaming endpoint returns a response, because ChatCompletionResponseStreamChoice does not match ChatCompletionResponse. A ChatCompletionResponseStream class should be added in openai_api_protocol.py and used instead:
```python
class ChatCompletionResponseStream(BaseModel):
    id: str = Field(default_factory=lambda: f"chatcmpl-{shortuuid.random()}")
    object: str = "chat.completion"
    created: int = Field(default_factory=lambda: int(time.time()))
    model: str = "chinese-llama-alpaca"
    choices: List[ChatCompletionResponseStreamChoice]
```
Hi, I modified the code so that ChatCompletionResponse is compatible with both return formats. It now runs without errors:

```python
class ChatCompletionResponse(BaseModel):
    id: str = Field(default_factory=lambda: f"chatcmpl-{shortuuid.random()}")
    object: str = "chat.completion"
    created: int = Field(default_factory=lambda: int(time.time()))
    model: str = "chinese-llama-alpaca"
    choices: List[
        Union[ChatCompletionResponseChoice, ChatCompletionResponseStreamChoice]
    ]
```
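To illustrate why the Union-typed `choices` field resolves the mismatch, here is a hypothetical, self-contained sketch of the models involved (the class names follow the snippets above, but the field definitions are assumptions; `uuid` stands in for `shortuuid` so the example needs no extra dependency). Both a full non-streaming choice and a streaming delta chunk validate against the same response model:

```python
import time
import uuid
from typing import List, Optional, Union

from pydantic import BaseModel, Field


class ChatMessage(BaseModel):
    role: str
    content: str


class DeltaMessage(BaseModel):
    # All fields optional: a streaming delta may carry only part of a message.
    role: Optional[str] = None
    content: Optional[str] = None


class ChatCompletionResponseChoice(BaseModel):
    index: int
    message: ChatMessage
    finish_reason: Optional[str] = None


class ChatCompletionResponseStreamChoice(BaseModel):
    index: int
    delta: DeltaMessage
    finish_reason: Optional[str] = None


class ChatCompletionResponse(BaseModel):
    # uuid used here instead of shortuuid purely to keep the sketch dependency-free.
    id: str = Field(default_factory=lambda: f"chatcmpl-{uuid.uuid4().hex}")
    object: str = "chat.completion"
    created: int = Field(default_factory=lambda: int(time.time()))
    model: str = "chinese-llama-alpaca"
    choices: List[
        Union[ChatCompletionResponseChoice, ChatCompletionResponseStreamChoice]
    ]


# A full (non-streaming) response validates...
full = ChatCompletionResponse(
    model="m",
    choices=[
        ChatCompletionResponseChoice(
            index=0, message=ChatMessage(role="assistant", content="hi")
        )
    ],
)

# ...and so does a streaming chunk carrying only a delta.
chunk = ChatCompletionResponse(
    model="m",
    object="chat.completion.chunk",
    choices=[
        ChatCompletionResponseStreamChoice(index=0, delta=DeltaMessage(content="hi"))
    ],
)
```

With the original single-typed `choices`, the second construction is exactly the call that failed in the streaming path.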
Thank you for your contribution to our project.
Since we are working on Chinese-LLaMA-Alpaca-2, which is still under construction, we would be very glad to accept a PR to Chinese-LLaMA-Alpaca-2. Could you please create a PR for streaming OpenAI API support against Chinese-LLaMA-Alpaca-2? Most of the code in openai_api_protocol.py and openai_api_server.py is the same in the two projects, except for some prompts and naming conventions. We really appreciate your contribution.
Once we have reviewed your code, we will merge your PRs in both repos.
ok
Description
Use transformers' TextIteratorStreamer to support streaming responses for the OpenAI API. #762
Ref: https://huggingface.co/docs/transformers/internal/generation_utils
Related Issue
None
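The server-side streaming loop described in this PR drains an iterator of generated text pieces and frames each one as a server-sent-events chunk. A hypothetical, stdlib-only sketch of that framing is below; in the real server the `pieces` iterator would be a `transformers.TextIteratorStreamer` fed by `model.generate()` running in a background thread, but a plain list stands in here so the framing logic is self-contained:

```python
import json


def sse_chunks(pieces, model_id="chinese-llama-alpaca"):
    """Yield one SSE `data:` line per generated text piece, then [DONE].

    `pieces` is any iterable of text fragments; in the actual server it
    would be a TextIteratorStreamer draining tokens from generate().
    """
    for text in pieces:
        chunk = {
            "object": "chat.completion.chunk",
            "model": model_id,
            "choices": [
                {"index": 0, "delta": {"content": text}, "finish_reason": None}
            ],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    # The OpenAI streaming protocol terminates the stream with a [DONE] sentinel.
    yield "data: [DONE]\n\n"


lines = list(sse_chunks(["Hel", "lo"]))
```

Each yielded string is what a client sees as one event in the `text/event-stream` response body.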