lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports Multi AI Providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), Knowledge Base (file upload / knowledge management / RAG), Multi-Modals (Vision/TTS), and a plugin system. One-click FREE deployment of your private ChatGPT/Claude application.
https://chat-preview.lobehub.com

[Docs] Feedback for `self-hosting/server-database/docker-compose` #3919

Open RazorSnow001 opened 2 months ago

RazorSnow001 commented 2 months ago

📝 Reference

https://github.com/lobehub/lobe-chat/tree/main/docs/self-hosting/server-database/docker-compose.zh-CN.mdx

🐛 Feedback

Your response

Following the tutorial, I started the service locally and logged in successfully, but when I ask a question no error appears at all; the answer is just blank. (I am using a proxied OpenAI key.) The same key works perfectly on the basic version of LobeChat, but on the lobedb version the connectivity test passes while the number of available models shows as 0; normally there should be several dozen. [screenshots attached] Also, after uploading a file to the knowledge base, chunking fails with an error.

lobehubbot commented 2 months ago

👀 @RazorSnow001

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

dosubot[bot] commented 2 months ago

Hello, @RazorSnow001! I'm here to help you with your GitHub issue. I can assist with bug fixes, answer your questions, and guide you on your journey to becoming a contributor.

I found some similar discussions and issues that might help resolve your problem:

  1. Docker install: why is the response blank after the environment variables are configured? - This discussion highlights the importance of setting OPENAI_PROXY_URL with the /v1 suffix when the proxy requires it. The issue was resolved by correcting the URL format [1].

  2. File Upload / KnowledgeBase Deployment Feedback | 知识库部署问题反馈 - This issue was closed by a pull request that addressed deployment problems related to the knowledge base [2].

To address your specific issues:

Blank Responses and "0 Models Available"

  1. Check OPENAI_PROXY_URL Configuration: Ensure that the OPENAI_PROXY_URL environment variable is set correctly, including the /v1 suffix if required.

  2. Authorization Header: Verify that the Authorization header is correctly set with the Bearer token when fetching models.

  3. Base URL Configuration: Ensure that the baseURL is correctly configured to point to the proxy URL.

  4. Model Handling: Ensure the provider's models function parses the response and filters the model list correctly; an empty or malformed response will surface as "0 models available".
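The most common root cause of points 1 and 3 is a base URL without the /v1 suffix. A minimal sketch of the normalization (the helper normalizeProxyUrl is illustrative, not part of the LobeChat codebase):

```typescript
// Illustrative helper, not a LobeChat API: ensure a proxy base URL carries
// the /v1 suffix that OpenAI-compatible endpoints expect.
export function normalizeProxyUrl(raw: string): string {
  const trimmed = raw.replace(/\/+$/, ''); // strip trailing slashes
  return trimmed.endsWith('/v1') ? trimmed : `${trimmed}/v1`;
}
```

A missing /v1 typically manifests exactly as described above: the connectivity test may still pass, but answers come back blank and the model list is empty.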

Here is a snippet from LobeChat's TogetherAI provider that illustrates the expected pattern: a baseURL with the version suffix, a Bearer Authorization header when fetching models, and filtering of the model list:

// TogetherAIModel, LobeOpenAICompatibleFactory, LOBE_DEFAULT_MODEL_LIST, and
// ModelProvider are imported from lobe-chat's agent-runtime internals.
const baseURL = 'https://your-proxy-url.com';
export const LobeTogetherAI = LobeOpenAICompatibleFactory({
  baseURL: `${baseURL}/v1`,
  constructorOptions: {
    defaultHeaders: {
      'HTTP-Referer': 'https://chat-preview.lobehub.com',
      'X-Title': 'Lobe Chat',
    },
  },
  debug: {
    chatCompletion: () => process.env.DEBUG_TOGETHERAI_CHAT_COMPLETION === '1',
  },
  models: async ({ apiKey }) => {
    const data = await fetch(`${baseURL}/api/models`, {
      headers: {
        Authorization: `Bearer ${apiKey}`,
      },
    });
    if (!data.ok) {
      throw new Error(`Together Fetch Error: ${data.statusText || data.status}`);
    }

    const models: TogetherAIModel[] = await data.json();

    return models
      .filter((m) => m.display_type === 'chat')
      .map((model) => {
        return {
          description: model.description,
          displayName: model.display_name,
          enabled: LOBE_DEFAULT_MODEL_LIST.find((m) => model.name.endsWith(m.id))?.enabled || false,
          functionCall: model.description?.includes('function calling'),
          id: model.name,
          maxOutput: model.context_length,
          tokens: model.context_length,
          vision: model.description?.includes('vision') || model.name?.includes('vision'),
        };
      });
  },
  provider: ModelProvider.TogetherAI,
});

File Upload Errors in Knowledge Base

Ensure that your MinIO S3 configuration is correct. Follow these steps:

  1. Access MinIO WebUI: Open http://localhost:9001 and log in with the admin account and password configured in .env.

  2. Create Bucket: Create a bucket that matches the MINIO_LOBE_BUCKET field in your .env file (default is lobe).

  3. Set Custom Policy: Use the following policy for the bucket:

    {
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": { "AWS": ["*"] },
          "Action": ["s3:GetBucketLocation"],
          "Resource": ["arn:aws:s3:::lobe"]
        },
        {
          "Effect": "Allow",
          "Principal": { "AWS": ["*"] },
          "Action": ["s3:ListBucket"],
          "Resource": ["arn:aws:s3:::lobe"],
          "Condition": {
            "StringEquals": { "s3:prefix": ["files/*"] }
          }
        },
        {
          "Effect": "Allow",
          "Principal": { "AWS": ["*"] },
          "Action": ["s3:PutObject", "s3:DeleteObject", "s3:GetObject"],
          "Resource": ["arn:aws:s3:::lobe/files/**"]
        }
      ],
      "Version": "2012-10-17"
    }
  4. Create Access Key: Generate a new access key and fill the Access Key and Secret Key into your .env file under S3_ACCESS_KEY_ID and S3_SECRET_ACCESS_KEY.
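Before pasting the policy into MinIO, it can be worth verifying it grants every action the file upload path relies on. A small sketch (the types and the missingActions helper are illustrative, not a MinIO API):

```typescript
// Illustrative check, not a MinIO API: confirm a bucket policy contains an
// Allow statement for each S3 action LobeChat's file upload needs.
interface PolicyStatement {
  Effect: string;
  Action: string[];
  Resource: string[];
}
interface BucketPolicy {
  Version: string;
  Statement: PolicyStatement[];
}

export function policyAllows(policy: BucketPolicy, action: string): boolean {
  return policy.Statement.some(
    (s) => s.Effect === 'Allow' && s.Action.includes(action),
  );
}

export function missingActions(policy: BucketPolicy): string[] {
  const required = [
    's3:GetBucketLocation',
    's3:ListBucket',
    's3:PutObject',
    's3:GetObject',
    's3:DeleteObject',
  ];
  return required.filter((a) => !policyAllows(policy, a));
}
```

If missingActions returns a non-empty list for your policy, uploads or chunking will fail with permission errors.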

Restart LobeChat Service

After making these changes, restart the LobeChat service:

docker compose up -d

Do not use docker compose restart lobe: restart does not reload environment variables, so your .env changes would not take effect. docker compose up -d recreates the container with the new values.

For more detailed steps, refer to the self-hosting documentation [3][4].

To continue talking to Dosu, mention @dosu.

BiFangKNT commented 2 months ago

Did you click to fetch the model list? For this problem you should check your proxy API's documentation; the URL may have been entered incorrectly.

lobehubbot commented 2 months ago

Bot detected the issue body's language is not English, translated it automatically.


Did you click to get the model list? For this question, you should check the documentation of your proxy API. Maybe the URL is filled in incorrectly.