binary-husky / gpt_academic

A practical interaction interface for GPT/GLM and other large language models, specially optimized for reading, polishing, and writing papers. Modular design with custom shortcut buttons & function plugins; project analysis & self-translation for Python, C++, and other codebases; PDF/LaTeX paper translation & summarization; parallel queries to multiple LLMs; local models such as chatglm3. Integrates Tongyi Qianwen (Qwen), deepseekcoder, iFlytek Spark, ERNIE Bot, llama2, rwkv, claude2, moss, and more.
https://github.com/binary-husky/gpt_academic/wiki/online
GNU General Public License v3.0

[Feature]: Added support for llama-cpp-python server #1240

Open Lookforworld opened 9 months ago

Lookforworld commented 9 months ago

Class | Type

Large Language Model

Feature Request

Personally, I feel that the server module of llama-cpp-python is very simple and easy to use, but I have been unable to add this functionality to the library myself. Could an API for this server be added?

binary-husky commented 9 months ago

https://github.com/abetlen/llama-cpp-python

this ?

binary-husky commented 9 months ago

you can refer to https://github.com/binary-husky/gpt_academic/blob/master/request_llms/bridge_chatglm3.py

it is very easy to implement llama directly into GPT-Academic; some simple copy and paste will do. Let me know if you need any help
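For reference, the chatglm3 bridge linked above exposes two entry points that bridge_all.py wires up: `predict` (streaming, drives the web UI) and `predict_no_ui_long_connection` (blocking, used by plugins). A minimal sketch of the message-assembly step such a copied bridge would need; the helper name is hypothetical, and the flat `[user, bot, user, bot, ...]` history layout is an assumption based on how the repo's bridges read history:

```python
# Hypothetical helper for a request_llms/bridge_llama_cpp.py module,
# modeled on the entry points that bridge_chatglm3.py exposes.

def build_messages(inputs, history, system_prompt):
    """Flatten gpt_academic's flat history list ([user, bot, user, bot, ...])
    into OpenAI-style chat messages, ending with the new user input."""
    messages = [{"role": "system", "content": system_prompt}]
    for i in range(0, len(history) - 1, 2):
        messages.append({"role": "user", "content": history[i]})
        messages.append({"role": "assistant", "content": history[i + 1]})
    messages.append({"role": "user", "content": inputs})
    return messages
```

Both `predict` and `predict_no_ui_long_connection` could then share this helper and differ only in whether they stream partial output back to the UI.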

Lookforworld commented 9 months ago

> https://github.com/abetlen/llama-cpp-python
>
> this?

Yes! Their server module provides an OpenAI-like API, and I added the following code to bridge_all.py:

if "llama_cpp" in AVAIL_LLM_MODELS:   # llama_cpp
    try:
        from .bridge_llama_cpp import predict_no_ui_long_connection as llama_cpp_noui
        from .bridge_llama_cpp import predict as llama_cpp_ui
        model_info.update({
            "llama_cpp": {
                "fn_with_ui": llama_cpp_ui,
                "fn_without_ui": llama_cpp_noui,
                "endpoint": openai_endpoint,
                "max_token": 4096,
                "tokenizer": tokenizer_gpt35,
                "token_cnt": get_token_num_gpt35,
            }
        })
    except Exception:
        print(trimmed_format_exc())
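Since llama-cpp-python's server mimics the OpenAI chat API, the `"endpoint"` entry above would presumably need to point at the local server rather than `openai_endpoint`. The server is typically started with `python -m llama_cpp.server --model <model.gguf>` and listens on port 8000 by default. A sketch of the request shape it accepts; the model alias is hypothetical, and no network call is made here:

```python
import json

# llama-cpp-python's default bind address; adjust if the server was
# started with --host/--port.
ENDPOINT = "http://localhost:8000/v1/chat/completions"

# OpenAI-style chat payload; "llama-2-7b-chat" is a hypothetical alias
# for whatever model the server actually loaded.
payload = {
    "model": "llama-2-7b-chat",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello"},
    ],
    "stream": False,
}

# With the server running, the call would look like:
#   import requests
#   reply = requests.post(ENDPOINT, json=payload, timeout=60).json()
#   text = reply["choices"][0]["message"]["content"]
body = json.dumps(payload)  # the JSON body sent over HTTP
```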

Then I created a bridge_llama_cpp.py file under the request_llms path and adapted the relevant content. Simple dialogue works fine, but if I use plugins (explain the whole Python project, etc.), all kinds of unexpected errors occur! bridge_llama_cpp.zip
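One plausible source of such plugin failures could be context overflow: plugins like "explain the whole Python project" concatenate far more text than interactive chat and can easily exceed the 4096-token `max_token` registered above. A crude character-based guard for the blocking plugin path; the 4-chars-per-token ratio is only a rough assumption, since the real tokenizer lives with the model:

```python
MAX_TOKEN = 4096  # matches the model_info entry above

def clip_history(history, budget_tokens=MAX_TOKEN // 2, chars_per_token=4):
    """Drop the oldest (user, assistant) pairs from a flat history list
    until its total length fits a rough character budget."""
    budget_chars = budget_tokens * chars_per_token
    clipped = list(history)
    while clipped and sum(len(h) for h in clipped) > budget_chars:
        clipped = clipped[2:]  # discard the oldest user/assistant pair
    return clipped
```

Calling this before building the request at least rules out oversized prompts as the cause, which narrows the remaining errors down to the bridge code itself.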