Closed gavinliu closed 6 months ago
👀 @gavinliu
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.
No plans. We are not going to use LangChain; we plan to wait until Ollama implements function calling itself and then integrate with it.
✅ @gavinliu
This issue is closed. If you have any questions, you can comment and reply.
Ollama already supports function calling.
@hl1221hl Is there any documentation?
Here: https://ollama.com/blog/tool-support
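For context, a minimal sketch of what a tool-call request to Ollama's `/api/chat` looks like per that post (the model name, port, and the weather tool schema below are illustrative, not from this thread):

```ts
// Rough sketch following the linked Ollama blog post; model name,
// endpoint, and the weather tool are just placeholders.
const res = await fetch('http://localhost:11434/api/chat', {
  method: 'POST',
  body: JSON.stringify({
    model: 'llama3.1',
    stream: false, // tool calls are currently only returned in non-streaming responses
    messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
    tools: [
      {
        type: 'function',
        function: {
          name: 'get_current_weather',
          description: 'Get the current weather for a location',
          parameters: {
            type: 'object',
            properties: {
              location: { type: 'string', description: 'City name, e.g. Paris' },
            },
            required: ['location'],
          },
        },
      },
    ],
  }),
});

const data = await res.json();
// When the model decides to call a tool, the reply carries message.tool_calls
console.log(data.message?.tool_calls);
```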
@arvinxx
@arvinxx Any plans for this?
The ModelScope community also has an article on Ollama + Qwen2 function calling: https://mp.weixin.qq.com/s/d82jUnXldJw_UPVPngZjDQ
@gavinliu I've looked into it. Streaming isn't supported yet, so it's hard to implement: https://github.com/lobehub/lobe-chat/issues/3436
> @gavinliu I've looked into it. Streaming isn't supported yet, so it's hard to implement: https://github.com/lobehub/lobe-chat/issues/3436

Didn't Anthropic implement this before streaming support came out? Was a stream faked somehow in that case?
@BrandonStudio No, Anthropic already supported streaming tools back then.
Converting non-stream mode to stream mode isn't complicated. The main problem is that a corresponding response data structure must already exist in streaming mode before a non-streaming response can be converted into a streaming one. Groq got this conversion layer because Groq's API mirrors OpenAI's structure, so its non-streaming tools could simply be converted into OpenAI's streaming tools format.
Ollama can't be handled the same way, because the streaming data structure Ollama returns is not consistent with OpenAI's. Right now Ollama only defines a streaming structure for text, not for tools. So converting Ollama's non-streaming tools into a stream would require me to define a custom streaming structure for Ollama tools, and that structure would most likely not match Ollama's future official streaming tools format, which means migration costs later.
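To make the Groq-style conversion concrete, here is a rough sketch of wrapping a complete OpenAI-format tool call into OpenAI-style streaming delta chunks (the type and function names are hypothetical, not actual lobe-chat code). This only works because OpenAI already defines the streaming shape; Ollama has no equivalent shape for tools yet:

```ts
// Illustrative only: turn a finished OpenAI-format tool call into
// OpenAI-style streaming chunks. Names here are hypothetical.
interface CompletedToolCall {
  id: string;
  type: 'function';
  function: { name: string; arguments: string };
}

function* toOpenAIStreamChunks(call: CompletedToolCall, index = 0) {
  // First chunk announces the tool call (id, type, function name).
  yield {
    choices: [
      {
        delta: {
          tool_calls: [
            { index, id: call.id, type: call.type, function: { name: call.function.name, arguments: '' } },
          ],
        },
        finish_reason: null,
      },
    ],
  };
  // Subsequent chunk(s) stream the arguments as deltas; here it's a single piece.
  yield {
    choices: [
      { delta: { tool_calls: [{ index, function: { arguments: call.function.arguments } }] }, finish_reason: null },
    ],
  };
  // Final chunk closes the stream with finish_reason: 'tool_calls'.
  yield { choices: [{ delta: {}, finish_reason: 'tool_calls' }] };
}
```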
I'm thinking about my Cloudflare one: its streaming and non-streaming structures differ, and the non-streaming one is plain JSON, which is easier to work with.
By the way, when will that PR be up for review (
@BrandonStudio The PRs that have piled up recently will all be reviewed together after the knowledge base / file upload feature ships. The big move is almost ready 😂
🥰 Feature Description
Currently only OpenAI's Function Calling is supported. It would be great if local models could support Function Calling as well.
🧐 Proposed Solution
https://js.langchain.com/docs/integrations/chat/ollama_functions
Could the Ollama Functions integration in LangChain be used to implement this feature?
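Roughly, based on those docs, it could look something like this (the import path, option names, and the weather tool below follow the linked LangChain page but are assumptions and may differ across LangChain versions):

```ts
import { OllamaFunctions } from 'langchain/experimental/chat_models/ollama_functions';
import { HumanMessage } from '@langchain/core/messages';

// Bind an OpenAI-style function definition to a local Ollama model.
const model = new OllamaFunctions({ model: 'mistral', temperature: 0.1 }).bind({
  functions: [
    {
      name: 'get_current_weather',
      description: 'Get the current weather in a given location',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string', description: 'The city, e.g. San Francisco' },
        },
        required: ['location'],
      },
    },
  ],
});

const response = await model.invoke([
  new HumanMessage({ content: "What's the weather like in Boston?" }),
]);
// Per the docs example, the chosen call is exposed via additional_kwargs.function_call
console.log(response.additional_kwargs.function_call);
```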
📝 Additional Information
No response