Open RazorSnow001 opened 1 month ago
👀 @RazorSnow001
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.
Hello, @RazorSnow001! I'm here to help you with your GitHub issue. I can assist with bug fixes, answer your questions, and guide you on your journey to becoming a contributor.
I found some similar discussions and issues that might help resolve your problem:
Docker installation: why is the response blank after configuring the environment variables? (Docker 安装,配置好环境变量后,为何返回值是空白?) - This discussion highlights the importance of correctly setting the `OPENAI_PROXY_URL` with the `/v1` suffix if required. The issue was resolved by ensuring the correct URL format [1].
File Upload / KnowledgeBase Deployment Feedback | 知识库部署问题反馈 - This issue was closed by a pull request that addressed deployment problems related to the knowledge base [2].
To address your specific issues:
1. Check `OPENAI_PROXY_URL` configuration: ensure that the `OPENAI_PROXY_URL` environment variable is set correctly, including the `/v1` suffix if required.
2. Authorization header: verify that the `Authorization` header is correctly set with the `Bearer` token when fetching models.
3. Base URL configuration: ensure that the `baseURL` is correctly configured to point to the proxy URL.
4. Model handling: ensure the `models` function handles the response properly and filters the models as needed.
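The first two points of the checklist above can be sketched as small helpers. `withV1Suffix` and `authHeaders` are hypothetical names for illustration, not part of the LobeChat codebase:

```typescript
// Hypothetical helpers illustrating the checklist above; these names are
// not part of the LobeChat codebase.

// Ensure the proxy URL carries the `/v1` suffix that OpenAI-compatible
// endpoints usually expect.
const withV1Suffix = (proxyUrl: string): string => {
  const trimmed = proxyUrl.replace(/\/+$/, ''); // drop trailing slashes
  return trimmed.endsWith('/v1') ? trimmed : `${trimmed}/v1`;
};

// Build the Authorization header used when fetching the model list.
const authHeaders = (apiKey: string): Record<string, string> => ({
  Authorization: `Bearer ${apiKey}`,
});
```

If the model list comes back empty, logging the result of `withV1Suffix` on your configured value is a quick way to spot a missing or doubled `/v1`.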
Here is a code snippet to help with the configuration (the import paths follow LobeChat's agent-runtime layout and may differ in your checkout):

```typescript
import { LOBE_DEFAULT_MODEL_LIST } from '@/config/modelProviders';

import { ModelProvider } from '../types';
import { LobeOpenAICompatibleFactory } from '../utils/openaiCompatibleFactory';
import { TogetherAIModel } from './type';

const baseURL = 'https://your-proxy-url.com';

export const LobeTogetherAI = LobeOpenAICompatibleFactory({
  baseURL: `${baseURL}/v1`,
  constructorOptions: {
    defaultHeaders: {
      'HTTP-Referer': 'https://chat-preview.lobehub.com',
      'X-Title': 'Lobe Chat',
    },
  },
  debug: {
    chatCompletion: () => process.env.DEBUG_TOGETHERAI_CHAT_COMPLETION === '1',
  },
  models: async ({ apiKey }) => {
    const data = await fetch(`${baseURL}/api/models`, {
      headers: {
        Authorization: `Bearer ${apiKey}`,
      },
    });
    if (!data.ok) {
      throw new Error(`Together Fetch Error: ${data.statusText || data.status}`);
    }
    const models: TogetherAIModel[] = await data.json();

    return models
      .filter((m) => m.display_type === 'chat')
      .map((model) => {
        return {
          description: model.description,
          displayName: model.display_name,
          enabled: LOBE_DEFAULT_MODEL_LIST.find((m) => model.name.endsWith(m.id))?.enabled || false,
          functionCall: model.description?.includes('function calling'),
          id: model.name,
          maxOutput: model.context_length,
          tokens: model.context_length,
          vision: model.description?.includes('vision') || model.name?.includes('vision'),
        };
      });
  },
  provider: ModelProvider.TogetherAI,
});
```
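The `enabled` lookup in the snippet above matches a provider model name against the default list by suffix. A minimal standalone sketch of that rule (the interface and function names here are illustrative, not LobeChat identifiers):

```typescript
// Minimal sketch of the suffix-matching rule used for `enabled` above:
// a provider model such as `meta-llama/llama-3-8b` matches a default-list
// entry whose id is `llama-3-8b`.
interface DefaultModel {
  enabled: boolean;
  id: string;
}

const isEnabledByDefault = (
  providerModelName: string,
  defaultList: DefaultModel[],
): boolean =>
  defaultList.find((m) => providerModelName.endsWith(m.id))?.enabled || false;
```

A model absent from the default list therefore shows up disabled, which is one reason a working proxy can still display far fewer usable models than expected.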
Ensure that your MinIO S3 configuration is correct. Follow these steps:

1. Access the MinIO WebUI: open http://localhost:9001 and log in with the default admin account and password configured in `.env`.
2. Create a bucket: create a bucket that matches the `MINIO_LOBE_BUCKET` field in your `.env` file (default is `lobe`).
3. Set a custom policy: use the following policy for the bucket:
```json
{
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": ["*"]
      },
      "Action": ["s3:GetBucketLocation"],
      "Resource": ["arn:aws:s3:::lobe"]
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": ["*"]
      },
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::lobe"],
      "Condition": {
        "StringEquals": {
          "s3:prefix": ["files/*"]
        }
      }
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": ["*"]
      },
      "Action": ["s3:PutObject", "s3:DeleteObject", "s3:GetObject"],
      "Resource": ["arn:aws:s3:::lobe/files/**"]
    }
  ],
  "Version": "2012-10-17"
}
```
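Note that the object-level statement only covers `arn:aws:s3:::lobe/files/**`, so uploads must land under the `files/` prefix; anything stored elsewhere in the bucket is denied. A one-line sketch of that rule (the helper name is hypothetical, not MinIO or LobeChat code):

```typescript
// Hypothetical check mirroring the last policy statement above: the policy
// grants s3:GetObject/PutObject/DeleteObject only on keys under the
// `files/` prefix of the `lobe` bucket.
const isObjectKeyAllowed = (key: string): boolean => key.startsWith('files/');
```

If knowledge-base uploads fail with access errors, a key that resolves outside `files/` is one of the first things to rule out.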
4. Create an access key: generate a new access key and fill the Access Key and Secret Key into your `.env` file under `S3_ACCESS_KEY_ID` and `S3_SECRET_ACCESS_KEY`.
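Putting the S3 settings together, the relevant `.env` entries look roughly like this; the values are placeholders, and only the variable names mentioned in the steps above are shown:

```env
# MinIO / S3 settings from the steps above -- values are placeholders
MINIO_LOBE_BUCKET=lobe
S3_ACCESS_KEY_ID=your-access-key-id
S3_SECRET_ACCESS_KEY=your-secret-access-key
```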
After making these changes, restart the LobeChat service:

```shell
docker compose up -d
```

Do not use `docker compose restart lobe`, as it will not reload the environment variables.
For more detailed steps, refer to the self-hosting documentation [3][4].
To continue talking to Dosu, mention @dosu.
Did you click to fetch the model list? For this problem you should check your proxy API's documentation; perhaps the URL is filled in incorrectly.
📝 Reference
https://github.com/lobehub/lobe-chat/tree/main/docs/self-hosting/server-database/docker-compose.zh-CN.mdx
🐛 Feedback
Your response
After starting locally following the tutorial, login also succeeded, but after asking a question no error appeared at all, just a blank answer. (I am using a proxied OpenAI key here.) It works perfectly fine on the basic version of LobeChat, but on the lobedb version the connectivity test passes while the number of available models shows 0; normally there should be several dozen. Also, after uploading a file to the knowledge base, chunking fails with an error.