-
Some applications may not want to proxy all LLM calls through a backend server, which is a current limitation.
Specifically, the useChat hook in React assumes making a fetch call to the s…
-
https://huggingface.co/deepseek-ai/DeepSeek-V2-Lite-Chat/tree/main
-
### Describe the bug
I'm writing to report an error in Microcks when using an AsyncAPI spec with multiple URI_PARTS in the examples array.
Taking the following AsyncAPI spec as an example:
```
as…
-
Support for the [MiniMessage format](https://docs.advntr.dev/minimessage/). It should ideally include a function that takes a string in the MiniMessage format and converts it into the data needed to d…
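A rough, hypothetical sketch (in Python, purely for illustration; `Segment` and `parse_minimessage` are invented names, not an existing API) of the kind of conversion function this could provide:
```
import re
from dataclasses import dataclass, field

# Matches opening/closing MiniMessage-style tags such as <red>, </bold>, <#ff00ff>.
TAG_RE = re.compile(r"<(/?)([a-z_#0-9]+)>")

@dataclass
class Segment:
    text: str
    styles: list = field(default_factory=list)

def parse_minimessage(source: str) -> list[Segment]:
    """Convert a MiniMessage-style string into styled text segments (simplified sketch)."""
    segments, stack, pos = [], [], 0
    for match in TAG_RE.finditer(source):
        if match.start() > pos:
            segments.append(Segment(source[pos:match.start()], list(stack)))
        closing, name = match.groups()
        if closing:
            # Simplified closing rule; real MiniMessage semantics are richer.
            if name in stack:
                stack.remove(name)
        else:
            stack.append(name)
        pos = match.end()
    if pos < len(source):
        segments.append(Segment(source[pos:], list(stack)))
    return segments

# e.g. segments ("Hello ", ["red"]) and ("world", ["red", "bold"])
print(parse_minimessage("<red>Hello <bold>world</bold></red>"))
```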
-
**Describe the feature you'd like**
Currently, I don't see support for connecting to Azure OpenAI services that sit behind a proxy layer. This is a limitation because many companies prefer this setu…
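For reference, a minimal sketch of the general pattern with the official openai Python SDK and httpx; the endpoint, deployment name, and proxy URL are placeholders, and the library this request concerns would need to expose something equivalent:
```
import httpx
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="...",                                            # placeholder credentials
    api_version="2024-02-01",
    azure_endpoint="https://my-resource.openai.azure.com",    # placeholder endpoint
    # Route all traffic through the corporate proxy layer.
    # `proxy=` on recent httpx releases; older releases use `proxies=`.
    http_client=httpx.Client(proxy="http://corporate-proxy.example.com:8080"),
)

response = client.chat.completions.create(
    model="my-gpt4o-deployment",                              # Azure deployment name (placeholder)
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```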
-
### Feature Description
OpenSearch support as a chat store
### Reason
_No response_
### Value of Feature
Expands architecture options. Like Postgres, OpenSearch is an open-source vector store wh…
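A hypothetical sketch of what an OpenSearch-backed chat store could look like, using the opensearch-py client; the class name, index name, and message shape are illustrative assumptions, not an existing API:
```
from opensearchpy import OpenSearch

class OpenSearchChatStore:
    """Illustrative chat store keyed by conversation id, backed by an OpenSearch index."""

    def __init__(self, client: OpenSearch, index: str = "chat_messages"):
        self.client = client
        self.index = index

    def add_message(self, key: str, role: str, content: str) -> None:
        # Each chat message is stored as one document tagged with its conversation key.
        self.client.index(
            index=self.index,
            body={"key": key, "role": role, "content": content},
            refresh=True,
        )

    def get_messages(self, key: str) -> list[dict]:
        # Fetch all messages for a conversation (relies on the default .keyword subfield).
        result = self.client.search(
            index=self.index,
            body={"query": {"term": {"key.keyword": key}}, "size": 1000},
        )
        return [hit["_source"] for hit in result["hits"]["hits"]]

store = OpenSearchChatStore(OpenSearch(hosts=[{"host": "localhost", "port": 9200}]))
store.add_message("user-42", "user", "Hello")
print(store.get_messages("user-42"))
```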
-
**Is your feature request related to a problem? Please describe.**
Add support for replying to messages in direct chats and group chats.
**Describe the solution you'd like**
Users should be able to repl…
-
https://cogsworth.readthedocs.io/en/latest/
-
# Feature Request
The OpenAI API supports applications including an anonymized `user` identifier to aid moderation and abuse detection. See more info in the API spec for chat completions.
> A unique identifier repres…
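A minimal sketch of passing the `user` field with the official openai Python SDK; the hashing scheme and model name are illustrative choices, not prescribed by the API:
```
import hashlib
from openai import OpenAI

client = OpenAI()

def anonymized_user_id(internal_id: str) -> str:
    # Hash the internal account id so no raw identifier is sent to OpenAI.
    return hashlib.sha256(internal_id.encode()).hexdigest()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
    user=anonymized_user_id("account-1234"),  # the `user` field from the API spec
)
print(response.choices[0].message.content)
```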
-
Hi,
I tested the new langchain-ibm version, and it works well for mistral-large.
However, with llama-3.1-70B I got the following error:
```
ibm_watsonx_ai.wml_client_error.ApiRequestFailure: Failu…