-
Hello,
I've been trying to run the framework with a model I installed via Ollama, but I haven't been able to; it may be related to the model path, but I'm not sure. Have you tried models ins…
-
I started Grok up and selected Ollama, and got this error:
"ModuleNotFoundError: No module named 'providers.ollamaprovider'" followed by a traceback.
I renamed the provider to ollamaprovider.py. …
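For anyone hitting the same import error: Python maps the dotted path `providers.ollamaprovider` directly to a file `providers/ollamaprovider.py` inside a `providers` package, so the file name must match the import exactly (and is case-sensitive on Linux). A small generic sketch for checking whether a provider module is resolvable before loading it — the helper name is mine, not part of the framework:

```python
import importlib.util

def provider_available(dotted_name: str) -> bool:
    """Return True if a dotted module path (e.g. 'providers.ollamaprovider')
    can be found on sys.path without importing it."""
    # find_spec returns None for a missing top-level module, but raises
    # ModuleNotFoundError when a parent package is absent, so catch both cases.
    try:
        return importlib.util.find_spec(dotted_name) is not None
    except ModuleNotFoundError:
        return False
```

If this returns False after the rename, the `providers` directory is likely missing an `__init__.py` or is not on `sys.path`.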
-
Wanted to know if it would be much hassle to add compatibility with Ollama so the model can be run locally.
Haven't tried, just wondering if anyone has.
-
### Skill Name
Codeigniter, Proxmox, Ollama
### Why?
Because it is still widely used, and I also need to use it.
### Reference Image
Codeigniter: https://uxwing.com/codeigniter-icon/
Proxmox: ht…
-
In the last Wagtail webinar, @tomdyson mentioned that this project can use [Ollama](https://ollama.com/) with the Llava model as a backend. Is it already possible, or is it planned for a future release? I've be…
-
### Version
1.18
### Areas for Improvement
- [ ] UI/UX
- [ ] Onboarding
- [ ] Docs
- [ ] Chat
- [ ] Commands
- [X] Context
- [ ] Response Quality
- [ ] Other
### What needs to be improved? Please …
-
I have an Ollama instance set up on an external server and think it would be a great, easy fit for this type of add-on if I could hook it in. It opens up use of many more models with better hardware, if the…
-
My organization cannot use OpenAI / Diffbot LLMs. Is there any plan to add support for Ollama local LLMs?
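Until first-class support lands, Ollama's local REST API is easy to call directly, with no external keys. A minimal sketch using only the standard library, assuming a default Ollama install listening on `localhost:11434` and an already-pulled model (the model name here is an example):

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default bind address; adjust if remote

def build_generate_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ollama_generate(model: str, prompt: str) -> str:
    """One-shot completion against a local Ollama server."""
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=build_generate_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server): ollama_generate("llama2", "Say hello.")
```

Because the payload builder is separated out, it can be unit-tested without a running server.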
-
Is there any way to limit the risk when setting OLLAMA_ORIGINS to *, given that it effectively removes all CORS protection? That would allow any computer to connect to your Ollama server, potential…
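One mitigation is to list only the origins that actually need browser access instead of `*`, and to keep the server bound to loopback unless remote access is required. A sketch using Ollama's documented environment variables (the hostnames below are placeholders):

```shell
# Allow only specific browser origins instead of every site (values are examples).
export OLLAMA_ORIGINS="https://app.example.com,http://localhost:3000"

# Keep Ollama listening on loopback so only local processes can reach it at all.
export OLLAMA_HOST="127.0.0.1:11434"
```

Note that CORS only constrains browser-initiated requests; any non-browser client that can reach the port can still talk to the server, so the bind address and firewall rules matter as much as OLLAMA_ORIGINS.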
-
### Discussed in https://github.com/langflow-ai/langflow/discussions/1804
Originally posted by **VinojRaj** April 30, 2024
I am new to Langflow and I was trying to use Llama2 through Ollama as…