-
### What happened?
I ran:

```shell
fabric --listmodels --remoteOllamaServer http://127.0.0.1:11434
```

and got:

```
Error: Client error '404 Not Found' for url 'http://127.0.0.1:11434/v1/…
```
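If it helps triage, here is a small probe (a sketch, assuming a local Ollama on the default port) that checks both path families: Ollama's native API lives under `/api/…`, while `/v1/…` is its OpenAI-compatible layer, which older Ollama builds don't serve at all.

```python
import urllib.error
import urllib.request

OLLAMA_BASE = "http://127.0.0.1:11434"  # assumed local default port

def probe(base, path):
    """Return the HTTP status for GET base+path, or None if unreachable."""
    try:
        with urllib.request.urlopen(base + path, timeout=3) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # server answered, but with an error status
    except OSError:
        return None      # connection refused / timed out

if __name__ == "__main__":
    # /api/tags answering while /v1/models 404s would point at an Ollama
    # build without the OpenAI-compatible layer.
    for path in ("/api/tags", "/v1/models"):
        print(path, "->", probe(OLLAMA_BASE, path))
```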
-
### Is there an existing issue for this feature?
- [X] Yes, I have searched the existing issues and it doesn't exist.
### Feature Description
It would be awesome to be able to use [Ollama](https://…
-
Hi, I'm interested in building a native Go client library for the Ollama REST API, along the lines of the existing Python and JS ones.
I can start on it myself, but first want to ask: is someone already working on this?
If it's not taken, …
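For context, the surface such a client would wrap is small; here is a minimal Python sketch of one of the documented endpoints (`/api/generate`), with the request split out so the JSON shape is visible:

```python
import json
import urllib.request

def build_generate_request(model, prompt, stream=False):
    """Return (path, JSON body) for a completion call, per the documented API."""
    return "/api/generate", {"model": model, "prompt": prompt, "stream": stream}

def send(base_url, path, body):
    """POST the JSON body and decode the single JSON response (stream=False)."""
    req = urllib.request.Request(
        base_url + path,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())

# Usage (assuming a local server):
#   path, body = build_generate_request("llama3", "Why is the sky blue?")
#   print(send("http://127.0.0.1:11434", path, body)["response"])
```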
-
**Describe the bug**
On a fresh install, I got this message:

```
Could not fetch Ollama models. Make sure the Ollama base URL is accessible with RAGapp.
```

A popup also says "Failed to fetch Ollama models".
…
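One common cause (an assumption, since the report doesn't say how RAGapp is deployed): when RAGapp runs in a container, `localhost` in the base URL resolves to the container itself, not to the host running Ollama. A quick reachability sketch:

```python
import urllib.error
import urllib.request

# Candidate base URLs to try from inside the container; the second is
# Docker Desktop's alias for the host machine (an assumption about the setup).
CANDIDATE_BASE_URLS = [
    "http://localhost:11434",
    "http://host.docker.internal:11434",
]

def reachable(base_url):
    """True if GET base_url/api/tags gets any HTTP answer at all."""
    try:
        urllib.request.urlopen(base_url + "/api/tags", timeout=3)
        return True
    except urllib.error.HTTPError:
        return True   # the server answered, even if with an error status
    except OSError:
        return False  # refused, timed out, or name didn't resolve

if __name__ == "__main__":
    for url in CANDIDATE_BASE_URLS:
        print(url, "reachable:", reachable(url))
```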
-
### Feature request
Many open-source models now support Ollama deployment. Is there a tutorial for deploying GLM4 with Ollama? If not, could you share the prompt template and stop token so I can try the conversion myself?
### Motivation
A tutorial for deploying GLM4 with Ollama would be helpful.
### Your contribution
Ollama reference…
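In the meantime, here is a sketch of the Modelfile shape Ollama expects (the special tokens and file path below are illustrative guesses, not confirmed GLM4 values — the real prompt template and stop tokens have to come from GLM4's tokenizer/chat-template config):

```
# FROM points at a local GGUF conversion of the model (hypothetical path)
FROM ./glm4-chat.gguf

# TEMPLATE uses Ollama's Go-template syntax; replace the markers below with
# the tokens from GLM4's actual chat template.
TEMPLATE """{{ if .System }}<|system|>
{{ .System }}{{ end }}<|user|>
{{ .Prompt }}<|assistant|>
"""

# Stop tokens keep generation from running into the next turn (illustrative).
PARAMETER stop "<|user|>"
PARAMETER stop "<|endoftext|>"
```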
-
When I run this code:

```python
from scrapegraphai.graphs import SmartScraperGraph
import nest_asyncio

graph_config = {
    "llm": {
        "model": "ollama/mistral",
        "temperature": 0,
…
```
-
### Describe the question.
I followed Ollama's [Modelfile docs](https://github.com/ollama/ollama/blob/main/docs/modelfile.md),
but I hit a small problem: the TEMPLATE format is wrong, and the model never produces correct output.
![image](https://github.com/InternLM/InternLM/assets/28780269…
-
### Template for?
CVE-2024-37032
### Details:
PoC: https://www.wiz.io/blog/probllama-ollama-vulnerability-cve-2024-37032
-
The Ollama llama3 integration doesn't work at the moment.
Changes are needed in:
- the Dockerfile
- the Middleware folder
-
# Bug Report
## Description
**Bug Summary:**
Making an HTTP POST to `/ollama/v1/chat/completions` worked in v0.3.5, but it now returns 405 Method Not Allowed.
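For reference, a minimal reproduction of the request shape that worked on v0.3.5 (a sketch; the base URL and API key are placeholders, and the payload follows the OpenAI-style chat-completions convention that the `/ollama/v1/…` route proxies):

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, content):
    """Build the POST that used to return 200 and now gets 405."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }).encode()
    return urllib.request.Request(
        base_url + "/ollama/v1/chat/completions",
        data=body,  # supplying data makes urllib issue a POST
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,  # placeholder key
        },
    )

if __name__ == "__main__":
    req = build_chat_request("http://localhost:8080", "sk-placeholder",
                             "llama3", "Hello")
    print(req.get_method(), req.full_url)
```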
**Steps to Reproduce:**
Using the do…