-
- **I don't know what Sourcegraph version I am using; how can I find out?**
- **Platform information: Brave browser, Windows 10 Pro**
#### Steps to reproduce:
1. The issue started just today (06/0…
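For reference, one way to check the running version is to query the instance's GraphQL API. The sketch below is a rough illustration only; the instance URL is a placeholder and it assumes a personal access token is available in the environment:

```python
import os
import requests

# Placeholder instance URL; replace with your own Sourcegraph URL.
SOURCEGRAPH_URL = "https://sourcegraph.example.com"
TOKEN = os.environ["SRC_ACCESS_TOKEN"]  # personal access token (assumed to be set)

# Ask the GraphQL API for the product version of the running instance.
resp = requests.post(
    f"{SOURCEGRAPH_URL}/.api/graphql",
    headers={"Authorization": f"token {TOKEN}"},
    json={"query": "query { site { productVersion } }"},
)
resp.raise_for_status()
print(resp.json()["data"]["site"]["productVersion"])
```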
-
I imported the private key for the 0th-index account into MetaMask, but I am still logged in as a voter. Could you please explain how you were able to solve that issue?
-
The following error occurs: `Error: 404 This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?`
Is there a setting I can edit somewhere to update the endpoint?
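For context, this error typically means a chat model is being sent to the legacy `v1/completions` endpoint. A minimal sketch of the intended call against `v1/chat/completions`, using the `openai` Python client (the model name here is only an example, not necessarily the one configured in the tool):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Chat models must go through the chat completions endpoint
# (v1/chat/completions), not the legacy v1/completions endpoint.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example chat model
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```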
-
Got this funny message when using text-gen plugin:
```
File "/home/user/workspace/other/llamaindex_rag/chat_server.py", line 81, in chat_with_data
chat_engine = index.as_chat_engine(
…
```
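For comparison, a minimal working `as_chat_engine` setup looks roughly like the sketch below (this assumes `llama-index` >= 0.10, an `OPENAI_API_KEY` in the environment, and an illustrative `data/` directory; it is not the plugin's actual code):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Load documents from an example directory and build a vector index.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Build a chat engine over the index; the chat_mode is illustrative.
chat_engine = index.as_chat_engine(chat_mode="condense_plus_context")
response = chat_engine.chat("What do these documents cover?")
print(response)
```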
-
### What is this about?
We encountered an issue in our CircleCI pipeline where the command to rerun failed tests could not find any failed tests, resulting in an exit status of 1. The process beg…
-
# Use case
**What problem are you trying to solve?**
I would like to be able to set up Maddy (along with Caddy and some other glue code) as a [chatmail](https://delta.chat/en/chatmail) server th…
-
### System Info
- `transformers` version: 4.45.2
- Platform: Linux-5.4.0-163-generic-x86_64-with-glibc2.31
- Python version: 3.11.10
- Huggingface_hub version: 0.26.0
- Safetensors version: 0.4.5…
-
Currently the ollama plugin only supports chat/generate models; however, Ollama supports embedding models as well: https://ollama.com/blog/embedding-models
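For reference, a minimal sketch of what calling Ollama's embeddings endpoint looks like (host, model name, and prompt are examples in the spirit of the linked blog post):

```python
import requests

# Ollama exposes an embeddings endpoint alongside chat/generate.
resp = requests.post(
    "http://localhost:11434/api/embeddings",  # default local Ollama host
    json={
        "model": "mxbai-embed-large",  # example embedding model
        "prompt": "Llamas are members of the camelid family",
    },
)
resp.raise_for_status()
embedding = resp.json()["embedding"]
print(len(embedding))  # dimensionality of the returned vector
```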
-
**Problem description**
In Xinference I switched the model to the 7B qwen2.5-instruct model. It works fine in the web UI, but when accessed through the API it returns {'detail': "Only ['qwen1.5-chat', 'qwen1.5-moe-chat', 'qwen2-instruct', 'qwen2-moe-instruct', 'glm4-chat', 'glm4-chat-1m'] …
-
ADDENDUM 20/08/24
--------------------------
Having posted the issue, I've been working on the agent output in the `ui.Chat()` component in my own application. The "support for agents", which I or…