-
Hello.
I'd like to try out your tool, but I get this error message:
```shell
❯ ollama-commit
ℹ AI PROVIDER: ollama
ℹ AI MODEL: mistral
ℹ API HOST: http://localhost:11434
ℹ LANGUAGE: en
◐ T…
```
-
There are already some issues about this, and it seems possible in theory, but it would really make Clio a great tool.
-
```shell
C:\Users\reikairen\Desktop\ollama-ebook-summary-main>python book2text.py "article.pdf"
Traceback (most recent call last):
  File "C:\Users\reikairen\Desktop\ollama-ebook-summary-main\book2text.py", …
```
-
This is just what I've been looking for. Is it possible to use a remote host for Ollama? I tried changing localhost:11434 to my remote IP:11434, but I receive an error when I try to chat or pull a …
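
A likely cause, for reference: the Ollama server listens only on 127.0.0.1 by default, so it has to be started with `OLLAMA_HOST=0.0.0.0` (and the port opened in the firewall) before it will accept remote connections. Below is a minimal Python sketch for checking a remote host over Ollama's HTTP API; the IP address and model name are placeholders.

```python
import requests

# Hypothetical remote host; replace with your server's address.
OLLAMA_URL = "http://192.168.1.50:11434"

# The server must be started with OLLAMA_HOST=0.0.0.0 (and the port
# open in the firewall) for this to work from another machine.
try:
    # /api/tags lists locally available models; a cheap reachability check.
    tags = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5).json()
    print("Reachable, models:", [m["name"] for m in tags.get("models", [])])

    # A minimal non-streaming generation request.
    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={"model": "mistral", "prompt": "Hello", "stream": False},
        timeout=120,
    )
    print(resp.json()["response"])
except requests.ConnectionError as e:
    print("Cannot reach the server - check OLLAMA_HOST and firewall:", e)
```

If the reachability check already fails, the problem is on the server side (bind address or firewall), not in the client tool's configuration.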
-
### Is your feature request related to a problem? Please describe
I see the latest nightly has pull and list available, like Ollama itself. Awesome, that lets me use ollama list/pull. Any chance to trigge…
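
For context, both operations are plain HTTP calls against Ollama's API, so a client could trigger them programmatically. A minimal sketch, assuming the default local endpoint (the model name is a placeholder):

```python
import json
import requests

OLLAMA_URL = "http://localhost:11434"

def list_models() -> list[str]:
    """GET /api/tags returns the models available locally."""
    r = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    r.raise_for_status()
    return [m["name"] for m in r.json().get("models", [])]

def pull_model(name: str) -> None:
    """POST /api/pull streams JSON progress lines while downloading."""
    with requests.post(
        f"{OLLAMA_URL}/api/pull", json={"name": name}, stream=True, timeout=None
    ) as r:
        r.raise_for_status()
        for line in r.iter_lines():
            if line:
                print(json.loads(line).get("status", ""))

print(list_models())
pull_model("mistral")
```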
-
- Add models
- Add dev info on LLM ops
- Add push notifications for dropped Ollama servers (see the sketch below)
- Add nightly batches with Ollama servers
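
On the dropped-server notifications: one simple approach would be polling each server's /api/tags endpoint and alerting when a host stops answering. A minimal sketch, with the server list and the notification hook as placeholder assumptions:

```python
import time
import requests

# Hypothetical server list; replace with your fleet.
SERVERS = ["http://ollama-1:11434", "http://ollama-2:11434"]

def notify(msg: str) -> None:
    # Placeholder: swap in email, Slack, a webhook, etc.
    print("ALERT:", msg)

def is_up(base_url: str) -> bool:
    """A server that answers /api/tags is considered alive."""
    try:
        return requests.get(f"{base_url}/api/tags", timeout=5).ok
    except requests.RequestException:
        return False

down: set[str] = set()
while True:
    for url in SERVERS:
        if not is_up(url):
            if url not in down:        # notify once per outage, not every poll
                notify(f"Ollama server dropped: {url}")
                down.add(url)
        else:
            down.discard(url)
    time.sleep(30)                     # poll interval
```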
-
```shell
python lightrag_ollama_demo.py
Traceback (most recent call last):
  File "/LightRAG/examples/lightrag_ollama_demo.py", line 14, in <module>
    rag = LightRAG(
          ^^^^^^^^^
TypeError: LightRAG._…
```
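
The truncated TypeError suggests the demo passes the LightRAG constructor a keyword argument the installed version no longer accepts (the constructor signature has changed across releases). A version-agnostic way to check, sketched below, is to inspect the signature of the installed class and compare it with the call at line 14 of the demo:

```python
import inspect
from lightrag import LightRAG

# Print the keyword arguments your installed LightRAG version accepts,
# then compare them against what lightrag_ollama_demo.py passes.
for name, param in inspect.signature(LightRAG.__init__).parameters.items():
    if name != "self":
        print(name, "=", param.default)
```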
-
1. Double-click ipex-llm-ollama-installer-20240918.exe to install; by default it installs to C:\Users\OPS17\ipex-llm-ollama.
2. Copy the ipex-llm-ollama folder to C:\aipc\.
3. At this point, running start.bat inside ipex-llm-ollama works fine, and everything is accessible as expected.
4. At this point, write a Python program and package it as …
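
Since step 4 is cut off, this is only a guess at its shape: a minimal sketch of a Python program that launches the copied start.bat, with the path taken from step 2 and the packaging tool (e.g. PyInstaller) assumed:

```python
import subprocess
from pathlib import Path

# Path from step 2; adjust if the folder was copied elsewhere.
START_BAT = Path(r"C:\aipc\ipex-llm-ollama\start.bat")

# Launch the batch file from its own directory so relative paths inside
# it resolve correctly even when this script is packaged as an .exe.
proc = subprocess.Popen(
    ["cmd", "/c", str(START_BAT)],
    cwd=START_BAT.parent,
)
proc.wait()
```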
-
It just writes in the chat on the left, but not in the code, and I see this:
_Unfortunately, I'm a large language model, I don't have direct control over the layout of your HTML document. However, I can provide…
-
When running the minicpm-v model with ollama, I found that calling the LLM with text alone runs on the iGPU as expected. But when using an image and text together, the LLM ends up running on the CPU.
```shell
ollama run minicpm-v:latest
```
Test prompt (asking what the image shows):
```json
{
  "model": "minicpm-v:latest",
  "prompt": "图片讲了什么内容?",
  "images":[…
```