-
### Feature Description
For OpenAI, on the stream event `thread.message.created`, the response is a [`message` object](https://platform.openai.com/docs/api-reference/messages/object), which includes …
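Picking the `message` object out of the event stream can be sketched as below. The event shapes are simplified dicts standing in for the SDK's typed event objects (an assumption for illustration, not the exact openai-python types):

```python
def messages_created(events):
    """Yield the `message` payload of each `thread.message.created` event.

    `events` is any iterable of event dicts with an ``event`` name and a
    ``data`` payload (a simplified stand-in for the SDK's event objects).
    """
    for event in events:
        if event.get("event") == "thread.message.created":
            yield event["data"]  # the `message` object

# usage with fake events
stream = [
    {"event": "thread.run.created", "data": {"id": "run_1"}},
    {"event": "thread.message.created",
     "data": {"id": "msg_1", "role": "assistant", "content": []}},
]
created = list(messages_created(stream))
print(created[0]["id"])  # msg_1
```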
-
### **User Story:**
As a **Developer considering using Meshrabiya**, I want to see the **Chat window** so that I can **see Meshrabiya working for myself**
### **Background info:**
- Need to have a…
-
### Bug Description
Imagine an instantiated Vertex AI model (LLM). I'm trying to access the `achat` interface to chat with the Gemini model (gemini-1.5-flash-002),
and I'm getting the following e…
-
Zulip works best when most of the messages you receive are not muted. When we load a user's personal message history for the combined feed, we should display a `navbar_alerts` banner if >90% of th…
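The >90% threshold check could be sketched as below. The function and parameter names are hypothetical, not Zulip's actual internals:

```python
MUTED_BANNER_THRESHOLD = 0.9  # show the banner when >90% of fetched messages are muted

def should_show_muted_banner(messages, is_muted):
    """Return True if strictly more than 90% of the user's fetched
    personal message history is muted (hypothetical helper, not
    Zulip's real implementation)."""
    if not messages:
        return False
    muted = sum(1 for m in messages if is_muted(m))
    return muted / len(messages) > MUTED_BANNER_THRESHOLD

# usage: 19 of 20 messages muted -> 95% > 90%, so the banner shows
msgs = list(range(20))
print(should_show_muted_banner(msgs, lambda m: m != 0))  # True
```

Note the strict inequality: at exactly 90% muted, no banner is shown.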
-
I want the naming in `source_name` to remain unchanged in the final answer. For example, if the content of `source_name` is '1-Book', in the final answer
it changes to 'found in book no 1', and the…
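One way to catch this kind of paraphrasing is a post-processing check that the source name appears verbatim in the final answer. This is a hypothetical guard for illustration, not part of any library:

```python
def source_name_preserved(answer: str, source_name: str) -> bool:
    """Check that `source_name` appears verbatim in the final answer
    (a hypothetical post-processing guard)."""
    return source_name in answer

print(source_name_preserved("The quote is found in 1-Book.", "1-Book"))     # True
print(source_name_preserved("The quote is found in book no 1.", "1-Book"))  # False
```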
-
Related to #1776.
Protein ID: Q96BV4 (https://www.glygen.org/protein/Q96BV4)
redirects to [P99999-1](https://www.glygen.org/protein/P99999-1).
But the history section on this page does not h…
-
### Question Validation
- [X] I have searched both the documentation and discord for an answer.
### Question
Hi, I'm working on LlamaIndex with FastAPI and Streamlit.
I created an API, chat,
…
-
## Context
Using `LangchainProcessor` as LLM wrapper, and `InMemoryChatMessageHistory` (extends `BaseChatMessageHistory`) as message store.
## Current behavior
```
system_message = [
{
…
-
Hi,
I'm trying to get the `model.generate` function's output to be returned within a Gradio-called function.
I wanted to make a minimal Gradio app with a single input textbox and a single response …
-
### Question Validation
- [X] I have searched both the documentation and discord for an answer.
### Question
How can I use `response_gen` without getting the thoughts?
In my example:
he…
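One approach is to filter the token stream and only emit text after the final-answer marker. The `Answer:` marker and the ReAct-style output format are assumptions about the agent's output, not a documented LlamaIndex API:

```python
def strip_thoughts(token_gen, marker="Answer:"):
    """Yield only the text after the final-answer marker from a streamed
    ReAct-style response (marker and format are assumptions, not a
    documented LlamaIndex API)."""
    buffer = ""
    emitting = False
    first = True
    for token in token_gen:
        if emitting:
            # strip leading whitespace from the first emitted chunk only
            out = token.lstrip() if first else token
            if out:
                first = False
                yield out
            continue
        buffer += token  # accumulate until the marker appears
        if marker in buffer:
            emitting = True
            tail = buffer.split(marker, 1)[1].lstrip()
            if tail:
                first = False
                yield tail

# usage with a fake stream mixing thoughts and the answer
tokens = ["Thought: I should ", "look this up.\n", "Answer:", " 42"]
print("".join(strip_thoughts(tokens)))  # 42
```

Buffering the pre-marker text handles the case where the marker itself is split across streamed tokens.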