-
### Your current environment
```text
The output of `python collect_env.py`
```
### How would you like to use vllm
I used llama3-8b and 70b to fine-tune a model. When testing the model, I …
-
### Self Checks
- [X] I have searched for existing issues [search for existing issues](https://github.com/langgenius/dify/issues), including closed ones.
- [X] I confirm that I am using English to su…
-
**Is your feature request related to a problem? Please describe.**
Our current AI assistant implementation lacks an intelligent memory layer, which limits its ability to provide personalized interact…
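As a rough illustration of what such a memory layer might look like, here is a minimal, library-free sketch; the `MemoryStore` class, its method names, and the prompt format are assumptions for illustration only, not the project's actual API:

```python
class MemoryStore:
    """Toy per-user memory: remembers facts and prepends them to prompts.
    Hypothetical illustration, not the project's real implementation."""

    def __init__(self):
        self._facts = {}  # user_id -> list of remembered facts

    def remember(self, user_id, fact):
        # Append a fact to the user's memory, creating the list on first use
        self._facts.setdefault(user_id, []).append(fact)

    def personalize(self, user_id, prompt):
        # Prepend any remembered facts as context; pass through unchanged
        # for users we know nothing about
        facts = self._facts.get(user_id, [])
        if not facts:
            return prompt
        context = "; ".join(facts)
        return f"[User context: {context}]\n{prompt}"


store = MemoryStore()
store.remember("u1", "prefers concise answers")
p = store.personalize("u1", "Explain HTTP caching")
```

A real memory layer would persist facts across sessions and decide what is worth remembering, but the core idea is the same: retrieve per-user context and inject it into the prompt before the model call.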
-
### URL
https://python.langchain.com/v0.2/docs/tutorials/chatbot/
### Checklist
- [X] I added a very descriptive title to this issue.
- [X] I included a link to the documentation page I am referrin…
-
I am working on this project and want to use Bedrock as my chat service of choice. I tested this library with OpenAI GPT-4, and it works perfectly.
Here is the code I am testing to connect…
-
The ChatMaritalk model does not have the same integrations that the `ChatOpenai` model has. The community is making heavy use of LangGraph and LCEL, and a GPT4 -> Sabia3 swap is very difficult without this integr…
-
### Feature Description
Would love to be able to use `useAssistant`, but I really need file upload support. Are there plans to provide an abstraction for this?
### Use Case
_No response_
### Addi…
-
### Here's an example of a Gradio chat where the user gives a prompt
```python
with gr.Blocks() as demo:
    chatbot = gr.Chatbot(value=[[None, "Hello, how can I assist you today?"]])
    msg = gr.Textbox()
    c…
```
-
### Describe the issue
Since the new token- and message-limiting feature has not been released yet, I am trying to use `transform_history` in my code. Here is the code:
```python
import autogen
from a…
-
Hello,
I wanted to update the template to use OpenAI Assistants + Threads. The comments say it's supported, but I can't figure out where to update this.