lloydzhou opened this issue 1 month ago
Title: How to use the custom model feature of v2.13.0 (how to use 2.13.0 multi-model support)
The provider can be specified with @, and the deploy_name can be specified with =:

+gpt-3.5-turbo@openai,+gpt-3.5-turbo@azure=gpt-3.5

The configuration above will display two models, gpt-3.5-turbo(OpenAI) and gpt-3.5(Azure). When the gpt-3.5-turbo model deployed on Azure is used, the request will be sent to the deployed service with deploy_name=gpt-3.5.
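As an illustration only (this parsing sketch is mine, not the project's actual code, which is TypeScript and may differ), a shell snippet showing how one "+model@provider=deploy_name" entry decomposes:

```shell
# Decompose a CUSTOM_MODELS entry of the form "+model@provider=deploy_name".
# Illustrative sketch only; not NextChat's real parser.
entry="+gpt-3.5-turbo@azure=gpt-3.5"

model_and_provider="${entry%%=*}"     # strip "=deploy_name" suffix
deploy_name="${entry#*=}"             # keep the part after "="
provider="${model_and_provider##*@}"  # keep the part after the last "@"
model="${model_and_provider%@*}"      # strip the "@provider" suffix
model="${model#+}"                    # drop the leading "+"

echo "model=$model provider=$provider deploy_name=$deploy_name"
```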
+Doubao-lite-4k@bytedance=ep-2024xxxx-xxx

The configuration above will add a Doubao-lite-4k(ByteDance) model. When this model is selected, the request will be sent to the service with deploy-id ep-2024xxxx-xxx.
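A hedged .env sketch for this ByteDance case (the BYTEDANCE_API_KEY variable name is an assumption from common NextChat deployments, not stated in this thread; values are placeholders):

```shell
# .env sketch (assumed variable name, placeholder value)
CUSTOM_MODELS="+Doubao-lite-4k@bytedance=ep-2024xxxx-xxx"
BYTEDANCE_API_KEY=your-key-here
```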
If OpenAI releases gpt-4.5 but NextChat has not yet shipped a new version, you can configure

+gpt-4.5@OpenAI

A gpt-4.5(OpenAI) option will be added to the model list, and requests will be sent using OpenAI's message format.
CUSTOM_MODELS:

For example, a transit provider forwards gpt-3.5-turbo and claude-2.1 at the same time, both in OpenAI-compatible formats. Configure

+gpt-3.5-turbo@OpenAI,+claude-2.1@OpenAI

(using OpenAI instead of openai here means the provider is not in the built-in provider list, but merely conforms to OpenAI's message format). Two new model options, gpt-3.5-turbo(OpenAI) and claude-2.1(OpenAI), will be added, and requests will be sent to /api/openai/*. You can then configure the BASE_URL + OPENAI_API_KEY mode in .env, or sign in and configure a custom endpoint, to use this transit provider's services.
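The BASE_URL + OPENAI_API_KEY mode mentioned above could look like this in .env (a sketch with placeholder values; the provider URL is hypothetical):

```shell
# .env sketch for an OpenAI-compatible transit provider (placeholder values)
BASE_URL=https://your-transit-provider.example.com
OPENAI_API_KEY=sk-your-key
CUSTOM_MODELS="+gpt-3.5-turbo@OpenAI,+claude-2.1@OpenAI"
```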
Thanks for the help. One question: how can the models be sorted, for example putting custom models at the front?
Also, how can multiple identical custom deployments be supported, e.g. Azure? #4398 Currently it is known that different deployments of the same resource are supported, but different deployments of cross-region resources cannot be used.
In a docker compose deployment, configuring +gpt-3.5-turbo@azure=gpt-3.5 as described does not take effect. No openai parameters were configured, but requests default to openai and return an error.
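For comparison, a hedged sketch of the Azure variables such a deployment typically also needs (variable names are assumptions based on common NextChat setups; values are placeholders — if only CUSTOM_MODELS is set, nothing tells the app where the Azure service lives):

```shell
# .env / docker compose environment sketch (assumed variable names, placeholder values)
CUSTOM_MODELS="+gpt-3.5-turbo@azure=gpt-3.5"
AZURE_URL=https://your-resource-name.openai.azure.com
AZURE_API_KEY=your-azure-key
AZURE_API_VERSION=2023-08-01-preview
```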
Which variable in .env configures the custom model's URL?
Error during Q&A: Unknown parameter: 'path'. The chatgpt-next configuration: gpt-4o@openai; the proxy address is one-api, and one-api connects to the Azure API.

Request parameters sent to one-api: { "messages": [ { "role": "system", "content": "\nYou are ChatGPT, a large language model trained by OpenAI.\nKnowledge cutoff: 2023-10\nCurrent model: gpt-4o\nCurrent time: Mon Jul 22 2024 16:32:39 GMT+0800 (China Standard Time)\nLatex inline: \(x^2\) \nLatex block: $$e=mc^2$$\n\n" }, { "role": "user", "content": "Hello, gpt" } ], "stream": true, "model": "gpt-4o", "temperature": 0.5, "presence_penalty": 0, "frequency_penalty": 0, "top_p": 1, "path": "completions" }
Thank you, the third-party API problem has been solved.
A beginner question: what is the Artifacts item in the plugins for? I searched Google and Baidu but still don't understand it.
It is a plugin for Claude models; see the official announcement for details: https://www.anthropic.com/news/claude-3-5-sonnet
CUSTOM_MODELS configuration usage: