ChatGPTNextWeb / ChatGPT-Next-Web

A cross-platform ChatGPT/Gemini UI (Web / PWA / Linux / Win / MacOS). Get your own cross-platform ChatGPT/Gemini app with one click.
https://app.nextchat.dev/
MIT License
77.15k stars · 59.42k forks

Add an "Ollama (local LLM)" option to the model service provider list in [Settings] #5370

Open bingshan2024 opened 2 months ago

bingshan2024 commented 2 months ago

🥰 Feature description

Please add an "Ollama (local LLM)" entry to the model service provider list under [Settings]. The use case is a LAN with no Internet access:

- Computer A runs Ollama and has several local models pulled.
- A user on computer A can select the Ollama provider, set the API address to http://localhost:11434, and chat directly.
- Other users on the LAN install the NextChat client on their own machines, set the API address to http://<computer A's IP>:11434, and chat the same way.
- After selecting computer A's Ollama provider and endpoint, the model dropdown should list the model names available on computer A's Ollama instance (alternatively, an administrator of the NextChat deployment on computer A could be allowed to enforce a single model for all users).

🧐 Proposed solution

Same as the feature description: add an "Ollama (local LLM)" entry to the model service provider list under [Settings], for LANs with no Internet access:

- Computer A runs Ollama and has several local models pulled.
- A user on computer A can select the Ollama provider, set the API address to http://localhost:11434, and chat directly.
- Other users on the LAN install the NextChat client on their own machines, set the API address to http://<computer A's IP>:11434, and chat the same way.
- After selecting computer A's Ollama provider and endpoint, the model dropdown should list the model names available on computer A's Ollama instance (alternatively, an administrator of the NextChat deployment on computer A could be allowed to enforce a single model for all users).
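For illustration only (a sketch based on Ollama's documented HTTP API, not NextChat code), the two calls such a provider integration would need are listing models via `GET /api/tags` (to populate the model dropdown) and chatting via `POST /api/chat`:

```python
import json
import urllib.request

# Assumed default Ollama endpoint; on the LAN this would be
# http://<computer A's IP>:11434 instead.
OLLAMA_HOST = "http://localhost:11434"

def build_chat_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/chat (streaming disabled)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def list_models(host: str = OLLAMA_HOST) -> list[str]:
    """GET /api/tags returns the models pulled on that machine --
    these names are what the model dropdown would show."""
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        return [m["name"] for m in json.load(resp)["models"]]

def chat(model: str, prompt: str, host: str = OLLAMA_HOST) -> str:
    """POST /api/chat and return the assistant's reply text."""
    req = urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

As a possible stopgap: Ollama also ships an OpenAI-compatible endpoint, so pointing NextChat's existing custom OpenAI endpoint setting at http://<computer A's IP>:11434/v1 may already work without a dedicated provider entry.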

📝 Additional information

No response
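One operational detail worth noting (an assumption from Ollama's documented defaults, to be verified): Ollama binds to loopback only, so computer A would need to expose it before other LAN machines can reach port 11434:

```shell
# Bind Ollama to all interfaces so LAN clients can reach it
# (the default is 127.0.0.1 only).
OLLAMA_HOST=0.0.0.0 ollama serve

# From another machine on the LAN, list computer A's models
# (192.168.1.10 is a placeholder -- use computer A's actual IP):
curl http://192.168.1.10:11434/api/tags
```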

nextchat-manager[bot] commented 2 months ago

Please follow the issue template to update title and description of your issue.


williamwa commented 2 months ago

that's a good idea!