logancyang / obsidian-copilot

THE Copilot in Obsidian
https://www.obsidiancopilot.com/
GNU Affero General Public License v3.0

Optimize support for the Qwen (Tongyi Qianwen) API so it can stream responses (paragraph-by-paragraph answers) #641

Open XSR-WatchPioneer opened 2 months ago

XSR-WatchPioneer commented 2 months ago

I can enable the Qwen model through a third-party endpoint compatible with the OpenAI API, but CORS has to be turned on for it, otherwise it does not work properly.

However, once CORS is turned on, the model's answers are no longer streamed: nothing is displayed until the entire response has been generated.

Qwen configuration: set the URL to https://dashscope.aliyuncs.com/compatible-mode/v1
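
For context, this endpoint does stream when called directly. A minimal sketch outside Obsidian, assuming the `openai` npm package and using `qwen-plus` as an example model name:

```ts
import OpenAI from "openai";

// DashScope's OpenAI-compatible endpoint from the configuration above;
// "qwen-plus" is only an example model name.
const client = new OpenAI({
  baseURL: "https://dashscope.aliyuncs.com/compatible-mode/v1",
  apiKey: process.env.DASHSCOPE_API_KEY,
});

async function main() {
  const stream = await client.chat.completions.create({
    model: "qwen-plus",
    messages: [{ role: "user", content: "Hello" }],
    stream: true, // deliver the answer as SSE chunks instead of one final body
  });
  // Each chunk carries a small delta of the answer, which is what
  // paragraph-by-paragraph rendering relies on.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

main();
```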

Emt-lin commented 1 month ago

@XSR-WatchPioneer This cannot be resolved at this time.

See:

Say goodbye to CORS errors for both chat models and embeddings! The new model table in settings now lets you turn on "CORS" for individual chat models if you see CORS issues with them. And embedding models are immune to CORS errors by default! Caveat: this is powered by the Obsidian API's requestUrl, which does not support "streaming" of LLM responses. So streaming is disabled whenever you have CORS on in Copilot settings. Please upvote this feature request to let Obsidian know your need for streaming!
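
To illustrate the caveat: `requestUrl` resolves only after the whole response body has arrived, so there is nothing to render incrementally, while `fetch` exposes the body as a `ReadableStream` but is subject to CORS. A rough sketch of the two paths; `apiKey`, `messages`, `render`, and the `qwen-plus` model name are hypothetical placeholders:

```ts
import { requestUrl } from "obsidian";

// Hypothetical placeholders for this sketch.
declare const apiKey: string;
declare const messages: { role: string; content: string }[];
declare function render(text: string): void;

const url = "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions";
const headers = {
  "Content-Type": "application/json",
  Authorization: `Bearer ${apiKey}`,
};

// Path A: requestUrl bypasses CORS, but its promise resolves only once
// the entire response body has been downloaded, so nothing can be
// shown until the model has finished.
async function viaRequestUrl() {
  const resp = await requestUrl({
    url,
    method: "POST",
    headers,
    body: JSON.stringify({ model: "qwen-plus", messages, stream: false }),
  });
  render(resp.json.choices[0].message.content); // full answer, all at once
}

// Path B: fetch exposes the body as a ReadableStream, so SSE chunks can
// be rendered as they arrive; fetch, however, is subject to CORS.
async function viaFetch() {
  const res = await fetch(url, {
    method: "POST",
    headers,
    body: JSON.stringify({ model: "qwen-plus", messages, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    render(decoder.decode(value, { stream: true })); // raw SSE chunks
  }
}
```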