-
Is there a guide or tutorial on configuring Ollama with LiteLLM to work with Skyvern? How can Skyvern work with a local LLM?
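Skyvern expects an OpenAI-compatible endpoint, so one common approach is to run a LiteLLM proxy in front of Ollama. A minimal sketch of a LiteLLM `config.yaml` (the alias `local-llama`, the model `llama3`, and the default Ollama port are assumptions — adjust to whatever you have pulled):

```yaml
model_list:
  - model_name: local-llama          # alias the proxy exposes to clients
    litellm_params:
      model: ollama/llama3           # assumes `ollama pull llama3` was run
      api_base: http://localhost:11434
```

Start the proxy with `litellm --config config.yaml` and point Skyvern's OpenAI-compatible base URL at the proxy (LiteLLM's default is `http://localhost:4000`). This is a sketch, not verified against Skyvern's current settings.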
-
Hi,
I might be missing something obvious, but after I successfully execute:
```
ollama run chanwit/flux-7b:v0.2
```
And then type:
```
>>> tell me about SOPS.
```
I get the error:
```
Error: Post "ht…
-
https://github.com/ollama/ollama/tree/main/examples
https://github.com/ollama/ollama/tree/main/examples/langchain-python-rag-websummary
0. Using the gemma local LLM from a Python program, but only while gemma is already running via Ollama …
-
### What is the issue?
I tried running with the 1.7b version, and it ran successfully.
![image](https://github.com/user-attachments/assets/6074c785-cbb2-43e0-b82d-32fe74184840)
However, when runni…
-
Hi, would it be easy to add support for cloud LLMs via API (ChatGPT, Claude, Gemini, Grok)?
PS: thanks for the cool extension!
-
I was trying to use an LM Studio-hosted local server, but apparently entered the wrong endpoint. Every endpoint I attempted to enter as the server produced an error. I haven't connected an age…
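LM Studio's local server speaks the OpenAI API; the default base URL it shows on its Local Server tab is `http://localhost:1234/v1` (confirm the port in the app — this default is an assumption). A common mistake is entering the base URL without the `/v1` suffix. A stdlib-only sketch of the full chat endpoint a client will hit:

```python
# Base URL as shown in LM Studio's Local Server tab (port may differ on your machine).
base = "http://localhost:1234/v1"

# OpenAI-style clients append the path after /v1, so the request actually goes to:
endpoint = base.rstrip("/") + "/chat/completions"
print(endpoint)  # http://localhost:1234/v1/chat/completions
```

If the error persists with the correct URL, check that the server is actually started in LM Studio and that a model is loaded.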
-
Hi,
Is it possible to use tools with Ollama?
I have found this article
https://ollama.com/blog/tool-support
I have tried with OllamaChat() but it's not implemented.
I also tried with …
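Per the linked blog post, Ollama exposes tool support through the `tools` parameter of its `/api/chat` endpoint, using an OpenAI-style JSON schema. A sketch of the payload, built with the stdlib only so it can be inspected without a running server (the model name and the weather function are illustrative assumptions):

```python
import json

# Tool definition in the JSON-schema shape ollama's /api/chat accepts.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

payload = {
    "model": "llama3.1",  # assumes a tool-capable model is pulled locally
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "tools": [weather_tool],
    "stream": False,
}
print(json.dumps(payload, indent=2))
```

POST this payload to `http://localhost:11434/api/chat`; any tool invocations come back under `message.tool_calls` in the response, and not every model supports tools.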
-
I use the ReActAgent together with Llama3 via Ollama. If I ask a plain question like "Hello. How are you?", the agent crashes because it searches for a tool to use. I guess it should not throw an er…
-
> Found out that the 'OPENAI_API_TYPE' value 'llama2' does not work. I also noticed that the `llm = get_llm()` is used 3 times but not used in the code. A llm is used via `chat = get_cha…
-
**Routine checks**
[//]: # (Delete the space inside the brackets and fill in an x)
+ [x] I have confirmed that there is no similar existing issue
+ [x] I have confirmed that I have upgraded to the latest version
+ [x] I have read the project README in full, especially the FAQ section
+ [x] I understand and am willing to follow up on this issue, assisting with testing and providing feedback
+ [x] I understand and accept the above, and I understand that the maintainers' time is limited; **issues that do not follow the rules may…