-
When using an LLM for, say, a lesson, it would be nice to prime the LLM with a persistent initial "basic instruction" that never falls out of the context window, e.g.
"You're a German language instructor, I'm an…
-
**Describe the bug**
I have tried with OpenAI and R2R works fine, but when I switch to Ollama it doesn't give an answer. I have added detailed errors.
**To Reproduce**
Steps to reproduce …
-
## Temperature and seed parameters should be part of 'options'
According to [the docs](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion) temperature and seed should b…
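For reference, a sketch of the request body shape the linked docs describe: sampling parameters such as `temperature` and `seed` belong inside the `options` object, not at the top level (the model name is illustrative):

```python
import json

# Per the Ollama chat-completion API, sampling parameters go under "options".
payload = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "options": {
        "temperature": 0.2,  # nested here, not at the top level
        "seed": 42,
    },
    "stream": False,
}
body = json.dumps(payload)
```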
-
### Self Checks
- [X] This is only for bug reports; if you would like to ask a question, please head to [Discussions](https://github.com/langgenius/dify/discussions/categories/general).
- [X] I have s…
-
Hi there, with more and more models supporting function calling now, I am missing a way to pass a list of functions (tools) to the client. This is what works perfectly when calling the API directly:
```python
mode…
-
As you can see from the curl commands below from the ramalama project (toggle `x = True` in the ramalama Python script to print the curl commands), the "Accept: application/vnd.docker.distribution.manife…
-
Hi,
Is there any complete example of how to use this version 5 with litellm?
I have seen your following comment on various issues with Ollama-based models, but the documentation doesn't say ho…
-
Sorry, I don't know what I'm doing wrong but it's probably just something obvious I'm missing.
I'm on Mac, FWIW. I installed Python, pip, and all the dependencies I saw in book2text.py, but it kept…
-
Hi, can you please provide a guide or support for using local LLM models with Ollama, like Llama 3.1 8B or 70B?
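Until an official guide exists, here is a minimal sketch of targeting a local Ollama server through its OpenAI-compatible endpoint, assuming Ollama is running on the default port and the model has already been pulled (e.g. `ollama pull llama3.1:8b`; the model tag and placeholder API key are assumptions):

```python
import json
import urllib.request

def local_chat_request(prompt, model="llama3.1:8b",
                       base="http://localhost:11434"):
    # Ollama exposes an OpenAI-compatible API under /v1. The API key is
    # ignored by Ollama, but some client libraries insist on sending one,
    # so a placeholder is included.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer ollama"},
    )

# To actually send it (requires a running Ollama server):
# with urllib.request.urlopen(local_chat_request("Hallo!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```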
-
### Before submitting your bug report
- [ ] I believe this is a bug. I'll try to join the [Continue Discord](https://discord.gg/NWtdYexhMs) for questions
- [ ] I'm not able to find an [open issue](ht…