CosmosShadow / GeneralAgent

A Python-native agent framework
396 stars · 50 forks

chore: Compatible with service using Azure API proxies #6

Closed Okysu closed 3 months ago

Okysu commented 3 months ago

Compatible with Programs Using Azure OpenAI Proxies

I came over from gptpdf; it's a really nice project. But while debugging I kept getting the error: LLM(Large Language Model) error, Please check your key or base_url, or network

I found that I didn't have the official OpenAI interface, only Azure OpenAI, and I was frustrated with the poor ecosystem around Azure OpenAI. So I used proxy programs like One API to align Azure's interface with OpenAI's official one. However, there are slight differences in the output parameters of Azure OpenAI compared to OpenAI's official ones (sigh).

# In the _llm_inference_with_stream method of GeneralAgent/skills/llm_inference.py
# Compatible with services using Azure API proxies, such as One-API
if chunk.choices[0].delta is None:
    continue

Just adding this short guard makes it compatible. The reason is that Azure emits an empty chunk (with no delta) at the end of the stream (very annoying).
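To see where the one-line guard sits, here is a minimal, self-contained sketch of a streaming accumulation loop. The chunk objects below are hypothetical stand-ins for the SDK's `ChatCompletionChunk`, and `collect_stream` is an illustrative name, not GeneralAgent's actual function:

```python
from types import SimpleNamespace

def collect_stream(chunks):
    """Accumulate streamed text, skipping the empty trailing chunks that
    Azure-style proxies (e.g. One-API) emit at the end of a stream."""
    parts = []
    for chunk in chunks:
        # Defensive: some proxies may also send a chunk with no choices.
        if not chunk.choices:
            continue
        # The Azure-compat guard from this issue: the final chunk can
        # arrive with delta == None instead of a delta object.
        if chunk.choices[0].delta is None:
            continue
        content = chunk.choices[0].delta.content
        if content:
            parts.append(content)
    return "".join(parts)

def make_chunk(delta):
    """Build a fake streaming chunk for demonstration purposes."""
    return SimpleNamespace(choices=[SimpleNamespace(delta=delta)])

stream = [
    make_chunk(SimpleNamespace(content="Hello, ")),
    make_chunk(SimpleNamespace(content="world")),
    make_chunk(None),  # Azure-style trailing chunk with no delta
]
print(collect_stream(stream))  # -> Hello, world
```

Without the `delta is None` check, the last chunk raises an `AttributeError` when the code reaches for `delta.content`, which surfaces as the generic "LLM error" message.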

JunweiDuan commented 3 months ago

I added it but still get an error:

File "C:\ProgramData\anaconda3\Lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
ERROR:root:LLM(Large Languate Model) error, Please check your key or base_url, or network
Traceback (most recent call last):
  File "D:\Users\duanjunwei\AppData\Roaming\Python\Python311\site-packages\GeneralAgent\skills\llm_inference.py", line 108, in _llm_inference_with_stream
    response = client.chat.completions.create(messages=messages, model=model, stream=True, temperature=temperature)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Okysu commented 3 months ago

> I added it but still get an error: … (traceback quoted above)

Yours is a 404, resource not found. I saw there is another closed PR for Azure compatibility; you can take a look at that. My fix addresses the case where a proxy such as One-API aligns Azure's interface with OpenAI's official one, i.e. the proxy makes requests to Azure look identical to requests to OpenAI, but some Azure models still return output parameters and behavior that differ from OpenAI's, which causes errors.
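For context on why a 404 shows up here at all: an OpenAI-style client and an Azure OpenAI endpoint expect different URL shapes, and bridging that gap is exactly what proxies like One-API do. A rough sketch (the helper names, example hosts, and deployment names are made up for illustration):

```python
def openai_chat_url(base_url: str) -> str:
    """OpenAI-compatible servers expect base_url to already include /v1."""
    return f"{base_url.rstrip('/')}/chat/completions"

def azure_chat_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Azure OpenAI routes requests through a per-deployment path and
    requires an api-version query parameter."""
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

# Pointing an OpenAI client straight at an Azure endpoint hits the wrong
# path, so the server answers 404 "Resource not found".
print(openai_chat_url("https://api.openai.com/v1"))
# -> https://api.openai.com/v1/chat/completions
print(azure_chat_url("https://my-resource.openai.azure.com", "gpt-4o", "2024-02-01"))
```

So a 404 like the one above usually means the base_url (or the proxy's channel configuration) does not translate to a valid deployment path on the Azure side, which is a separate problem from the empty-delta chunks this issue fixes.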
