InternLM / HuixiangDou

HuixiangDou: Overcoming Group Chat Scenarios with LLM-based Technical Assistance
https://openxlab.org.cn/apps/detail/tpoisonooo/huixiangdou-web

Running the web demo with the remote deepseek model in a non-Hybrid-LLM environment fails #277

Closed. SchweitzerGAO closed this issue 4 months ago

SchweitzerGAO commented 4 months ago

error log


context

Intern Studio 30% A100 with built-in Huixiangdou environment

how to reproduce

  1. Configure config.ini following the hands-on camp (实战营) tutorial (a quick sanity check is sketched after this list)
  2. Run python3 -m tests.test_query_gradio
  3. Open http://127.0.0.1:7860 and enter "茴香豆如何在微信群部署?" ("How do I deploy HuixiangDou in a WeChat group?")
  4. The error message appears in the terminal
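
For step 1, a quick way to double-check what the remote LLM settings actually resolve to is to load config.ini and print them. This is a minimal sketch only: it assumes config.ini parses as TOML with the remote settings under [llm] / [llm.server], and every key name except remote_api_key and remote_llm_model (which appear in llm_server_hybrid.py, quoted later in this thread) is an assumption that may differ between versions.

    # Minimal sketch: print the remote-LLM settings that HuixiangDou will see.
    # Assumed: config.ini is TOML with the settings under [llm] / [llm.server];
    # key names other than remote_api_key and remote_llm_model may differ.
    import tomllib  # Python 3.11+; on older interpreters, install and use "tomli"

    with open('config.ini', 'rb') as f:
        config = tomllib.load(f)

    llm = config.get('llm', {})
    server = llm.get('server', {})

    print('enable_local    :', llm.get('enable_local'))      # assumed key name
    print('enable_remote   :', llm.get('enable_remote'))     # assumed key name
    print('remote_type     :', server.get('remote_type'))    # assumed key name
    print('remote_llm_model:', server.get('remote_llm_model'))
    print('api key present :', bool(server.get('remote_api_key')))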

more

  1. Adding --standalone to step 2 above did not help either
  2. It runs fine in the Hybrid-LLM environment, although the deepseek API does not seem to be called there
tpoisonooo commented 4 months ago
  1. See the FAQ (screenshot attached)

  2. If that does not work, set a pdb breakpoint at this line: https://github.com/InternLM/HuixiangDou/blob/9f459614dc9139f6d1bd9f6ea39a6212963605a9/huixiangdou/service/llm_server_hybrid.py#L514C25-L514C36

    The remote deepseek implementation is only four lines, so it should be possible to debug it one way or another:

    def call_deepseek(self, prompt, history):
        client = OpenAI(
            api_key=self.server_config['remote_api_key'],
            base_url='https://api.deepseek.com/v1',
        )

        messages = build_messages(
            prompt=prompt,
            history=history,
            system='You are a helpful assistant')  # noqa E501

        logger.debug('remote api sending: {}'.format(messages))
        completion = client.chat.completions.create(
            model=self.server_config['remote_llm_model'],
            messages=messages,
            temperature=0.1,
        )
        return completion.choices[0].message.content
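
    Since these four lines only depend on the OpenAI-compatible client, they can also be exercised outside HuixiangDou to rule out a bad key or model name. Below is a minimal standalone sketch mirroring the method above; the API key and model name are placeholders to be replaced with the values from config.ini.

    # Standalone sketch of the same remote call, run outside HuixiangDou.
    # Assumed: the "openai" Python package (v1.x) is installed; the key and
    # model below are placeholders, not real values.
    from openai import OpenAI

    client = OpenAI(
        api_key='sk-your-deepseek-key',          # placeholder: your deepseek API key
        base_url='https://api.deepseek.com/v1',  # same endpoint as llm_server_hybrid.py
    )

    completion = client.chat.completions.create(
        model='deepseek-chat',                   # placeholder: the remote_llm_model value
        messages=[
            {'role': 'system', 'content': 'You are a helpful assistant'},
            {'role': 'user', 'content': '茴香豆如何在微信群部署?'},
        ],
        temperature=0.1,
    )
    print(completion.choices[0].message.content)

    If this script reproduces the same error, the problem is the key, quota, or model name rather than the HuixiangDou pipeline.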
tpoisonooo commented 4 months ago

Ran the current version and it works fine. (screenshot attached)

Check whether these config entries are correct:

(screenshot of the config entries attached)

SchweitzerGAO commented 4 months ago

OK, I will test the main branch. The hands-on camp tutorial had me switch to a different branch at the beginning.

tpoisonooo commented 4 months ago

> OK, I will test the main branch. The hands-on camp tutorial had me switch to a different branch at the beginning.

fixed https://github.com/InternLM/Tutorial/pull/695

SchweitzerGAO commented 4 months ago

The main branch runs fine. Thanks a lot!