Closed: Koohoko closed this issue 1 year ago
Search for requests.post in request_llm/bridge_chatgpt.py.
I tested it; you can just change it to stream=False.
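The change involves two places that must agree: the `stream` field inside the JSON payload and the `stream=` keyword passed to `requests.post`. A minimal sketch of the payload side (the `build_body` helper is hypothetical; in the real `bridge_chatgpt.py` the payload is assembled inline):

```python
import json

def build_body(messages, stream=False):
    """Hypothetical helper: build the chat/completions request body.

    The 'stream' field here must stay in sync with the stream= kwarg
    that is later passed to requests.post(...).
    """
    payload = {
        "model": "gpt-35-turbo",
        "messages": messages,
        "stream": stream,
    }
    return json.dumps(payload)

body = build_body([{"role": "user", "content": "Hello!"}])
print(json.loads(body)["stream"])  # → False
```

As the follow-up comments show, changing only the `requests.post(..., stream=False)` side is not enough: if the body still says `"stream": true`, this gateway rejects the request.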
Search for requests.post in request_llm/bridge_chatgpt.py.
Thanks, but I found that even with stream=False in requests.post, as long as the payload contains "'stream': True", the response is still a 500.
Adding a screenshot.
Some additional context on the current state of our school's API:
The POST below (chunk 1) succeeds with a 200 OK response:
Chunk 1:
POST https://api.hku.hk/openai/deployments/chatgpt/chat/completions?api-version=2023-03-15-preview HTTP/1.1
Content-Type: application/json
Cache-Control: no-cache
api-key: ••••••••••••••••••••••••••••••••
{
"model": "gpt-35-turbo",
"messages": [{
"role": "user",
"content": "Hello!"
}],
"stream":false
}
But the POST below (chunk 2) fails with a 500 Internal Server Error.
Chunk 2:
POST https://api.hku.hk/openai/deployments/chatgpt/chat/completions?api-version=2023-03-15-preview HTTP/1.1
Content-Type: application/json
Cache-Control: no-cache
api-key: ••••••••••••••••••••••••••••••••
{
"model": "gpt-35-turbo",
"messages": [{
"role": "user",
"content": "Hello!"
}],
"stream":true
}
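The two requests above differ only in the `"stream"` flag. The behavior can be reproduced with a small probe; this is a hedged sketch where `post` is injected (normally it would be `requests.post`) so the logic can be dry-run without the network, and the deployment URL is the one quoted above:

```python
import json

def probe_stream_support(post, url, api_key):
    """Send the same chat request with stream false/true and collect status codes.

    Hypothetical helper: 'post' is any callable with the requests.post
    signature, injected so the probe can be tested offline.
    """
    results = {}
    for stream in (False, True):
        body = {
            "model": "gpt-35-turbo",
            "messages": [{"role": "user", "content": "Hello!"}],
            "stream": stream,
        }
        resp = post(
            url,
            headers={"Content-Type": "application/json", "api-key": api_key},
            data=json.dumps(body),
            stream=stream,
        )
        results[stream] = resp.status_code
    return results
```

Against this gateway, the expectation from the report above is `{False: 200, True: 500}`.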
This change is a bit of a hassle.
f43cea0
Thank you very much! This commit works; it's usable now.
However, features that go through predict_no_ui_long_connection still seem unusable, e.g. "Understand PDF paper content":
the API does return a reply, but after the payload's stream is set to false, predict_no_ui_long_connection fails to parse the response correctly.
Traceback (most recent call last):
File "./crazy_functions/crazy_utils.py", line 79, in _req_gpt
result = predict_no_ui_long_connection(
File "./request_llm/bridge_all.py", line 375, in predict_no_ui_long_connection
return method(inputs, llm_kwargs, history, sys_prompt, observe_window, console_slience)
File "./request_llm/bridge_chatgpt.py", line 87, in predict_no_ui_long_connection
raise RuntimeError("OpenAI拒绝了请求:" + error_msg)
RuntimeError: OpenAI拒绝了请求:{"id":"chatcmpl-7eId3pVg99BCaAP7QdelPeaLIKKSe","object":"chat.completion","created":1689839085,"model":"gpt-4","usage":{"prompt_tokens":2740,"completion_tokens":234,"total_tokens":2974},"choices":[{"message":{"role":"assistant","content":"Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) has caused millions of deaths worldwide. The intense scientific effort to understand its biology has resulted in numerous genomic sequences. This review explores mechanisms generating genetic variation in SARS-CoV-2, within-host and population-level processes, and selective forces driving the evolution of higher transmissibility and severity. The role of antigenic evolution, implications of immune escape and reinfections, and evidence for and relevance of recombination are also examined. The emergence of variants of concern (VOCs) is discussed with a focus on the chronic infection model and potential animal reservoirs. The review evaluates uncertainties and outlines potential future SARS-CoV-2 evolutionary trajectories.\n\nSARS-CoV-2 evolves rapidly and evolves on timescales comparable to the transmission and ecological dynamics of the virus. Evolution is driven by the mutation rate and acted upon by natural selection. Viral evolution involves complexity, as viruses must successfully replicate within individuals and transmit between them. The review also considers factors driving the evolution of the virus, theories for the emergence of epidemiologically important variants, and potential future evolutionary scenarios with likely public health repercussions."},"finish_reason":"stop","index":0}]}
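Note that the JSON in the traceback is actually a complete, successful `chat.completion` object, not an error: with `stream=False` the whole reply arrives in one JSON body with a `message` field, whereas the streaming path expects SSE chunks carrying a `delta` field. A hedged sketch of handling both shapes (field names follow the JSON shown above and the standard streaming chunk format; the real parsing in `bridge_chatgpt.py` differs):

```python
import json

def extract_reply(body: str) -> str:
    """Extract the assistant text from a chat completion JSON body.

    Sketch only: handles both the non-streaming shape (choices[0].message)
    and a single streaming chunk shape (choices[0].delta).
    """
    choice = json.loads(body)["choices"][0]
    if "message" in choice:                      # non-streaming response
        return choice["message"]["content"]
    return choice["delta"].get("content", "")    # streaming chunk

body = json.dumps({
    "object": "chat.completion",
    "choices": [{"message": {"role": "assistant", "content": "Hello!"},
                 "finish_reason": "stop", "index": 0}],
})
print(extract_reply(body))  # → Hello!
```

This matches the symptom: the code raised "OpenAI拒绝了请求" on a body that was in fact a valid non-streamed answer, because it only knew how to read the streaming shape.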
Just modify it following the pattern above.
In the end, I asked the school to enable the streaming feature.
Class | Type
Main program
Feature Request
Hello,
Thanks for developing this. I have a school-provided Azure API that returns a 500 Internal Server Error whenever Stream=True. I tried manually forcing every stream to False, but then the captured response does not seem to be parsed and processed correctly afterwards. Can the main features work with stream=False? How should this be configured?
A similar situation has also occurred elsewhere: https://github.com/ripperhe/Bob/issues/541
Thanks!