-
**I'm submitting a ...**
[x] bug report
**Summary**
I'm getting errors with:
Anthropic
```
@ax-llm/ax/build/module/src/util/apicall.js:39
throw new Error(`API Error: ${apiUrl.hre…
-
**Describe the bug**
I installed the VS Code plugin v3.7.19 and a local Ollama in Docker using the [official container](https://hub.docker.com/r/ollama/ollama) from Docker Hub.
Via `curl` it seems it somehow …
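The same check can be done from Python instead of `curl`. A minimal sketch, assuming the Ollama server is on its default port 11434 and exposes the standard `GET /api/tags` endpoint for listing pulled models (`ollama_reachable` is a hypothetical helper, not part of any of the tools above):

```python
import json
import urllib.error
import urllib.request


def ollama_reachable(base_url: str = "http://localhost:11434", timeout: float = 3.0):
    """Return the list of locally pulled model names if the Ollama API
    answers, or None if the server is unreachable."""
    try:
        # GET /api/tags lists the models the Ollama server has pulled.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            data = json.load(resp)
        return [m.get("name") for m in data.get("models", [])]
    except (urllib.error.URLError, TimeoutError, json.JSONDecodeError):
        return None


print(ollama_reachable())  # None when no server is listening on 11434
```

If this returns None while `curl http://localhost:11434/api/tags` works, the problem is likely in how the plugin resolves the host, not in Ollama itself.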
-
### System Info
- `transformers` version: 4.36.2
- Platform: Linux-4.18.0-193.6.3.el8_2.v1.4.x86_64-x86_64-with-glibc2.29
- Python version: 3.8.10
- Huggingface_hub version: 0.19.4
- Safetensors …
-
### How are you running AnythingLLM?
AnythingLLM desktop app
### What happened?
I have been successfully trialing the AnythingLLM Desktop app v1.6.4 (macOS version).
- On a MacBook Pro (Intel), 32…
-
### Issue description
The constructor was changed to private, so we can no longer instantiate it from TypeScript.
### Expected Behavior
I should be able to follow docs:
```
const model = new LlamaModel({
m…
-
### Version
v1.14.0
### Describe the bug
When enabling the experimental feature to use a model hosted on Ollama, Cody can only talk to localhost:11434 - we can't configure a different URL - as we …
-
I tried to run it on macOS in local mode but failed, I want to delete the already downloaded model but I don't know the file path.
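Since the app's storage path isn't documented here, one way to locate the weights is to scan the cache directories that local LLM tools commonly use. A minimal sketch; the directory list is an assumption (typical Ollama, Hugging Face, and macOS Application Support defaults), not the app's confirmed location, and `find_model_files` is a hypothetical helper:

```python
from pathlib import Path

# Common locations where local LLM tools cache downloaded model weights.
# These are typical defaults, not confirmed for this particular app.
CANDIDATE_DIRS = [
    Path.home() / ".ollama" / "models",
    Path.home() / ".cache" / "huggingface",
    Path.home() / "Library" / "Application Support",
]


def find_model_files(min_size_mb: int = 100):
    """List large files under the candidate cache dirs, biggest first."""
    hits = []
    for root in CANDIDATE_DIRS:
        if not root.is_dir():
            continue
        for p in root.rglob("*"):
            try:
                if p.is_file() and p.stat().st_size >= min_size_mb * 1024 * 1024:
                    hits.append((p.stat().st_size, p))
            except OSError:
                continue  # skip unreadable entries
    return [p for _, p in sorted(hits, reverse=True)]


for path in find_model_files():
    print(path)
```

Anything over ~100 MB in these directories is very likely a model file and safe to inspect before deleting.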
-
Ubuntu 20.04
LLAMA_CUBLAS=on pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir
(CodeLlama) developer@ai:~$ python --version
Python 3.10.12
>>> from llama_cpp import Llama
…
-
### How are you running AnythingLLM?
Docker (local)
### What happened?
I started Ollama with docker:
`docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama`
I then load…
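Loading a model into a containerized Ollama can also be driven over its HTTP API instead of `docker exec`. A minimal sketch, assuming the container maps port 11434 as in the `docker run` line above and that the server accepts `POST /api/pull` with a `name` field (`pull_model` is a hypothetical helper):

```python
import json
import urllib.error
import urllib.request


def pull_model(name: str, base_url: str = "http://localhost:11434"):
    """Ask the Ollama server to pull a model (POST /api/pull).
    Returns the last streamed status string, or None if unreachable."""
    req = urllib.request.Request(
        f"{base_url}/api/pull",
        data=json.dumps({"name": name}).encode(),
        headers={"Content-Type": "application/json"},
    )
    last = None
    try:
        with urllib.request.urlopen(req) as resp:
            # The endpoint streams one JSON status object per line.
            for line in resp:
                last = json.loads(line).get("status")
    except urllib.error.URLError:
        return None
    return last


print(pull_model("llama2"))  # last status once done; None if the server is unreachable
```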
-
Llama only supports messages with the 'system', 'user', and 'assistant' roles in a fixed order: 'system' first, then 'user', then alternating (u/a/u/a/u...).
continue.dev sends two user-role messages in a…
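A common client-side workaround is to merge adjacent same-role messages before sending, so the sequence can satisfy the s/u/a alternation. A minimal sketch; `merge_consecutive_roles` is a hypothetical helper, not part of continue.dev:

```python
def merge_consecutive_roles(messages):
    """Merge adjacent messages that share a role, so the result can fit
    Llama's expected order: optional 'system' first, then strictly
    alternating 'user'/'assistant'."""
    merged = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Fold the duplicate-role message into the previous one.
            merged[-1]["content"] += "\n\n" + msg["content"]
        else:
            merged.append({"role": msg["role"], "content": msg["content"]})
    return merged


chat = [
    {"role": "system", "content": "You are helpful."},
    {"role": "user", "content": "Context: some file."},
    {"role": "user", "content": "Now answer my question."},  # second user message in a row
]
print(merge_consecutive_roles(chat))  # the two user messages are folded into one
```

This only fixes duplicated roles; a sequence that starts with 'assistant' or interleaves roles in some other illegal way would still need reordering upstream.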