Closed v-byte-cpu closed 2 months ago
🤖
It's great to see that you've identified a potential issue and a possible solution related to the `stream()` method in LangChain. Your observation about adding a `pending=True` argument to the decorator seems like a step in the right direction.
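For context, here is a minimal sketch of what a deprecation decorator with a `pending` flag might look like. The names and behavior below are illustrative only, not LangChain's actual `langchain_core` implementation; the point is that `pending=True` downgrades the warning to a pending one instead of raising or warning about an already-deprecated API.

```python
import functools
import warnings


def deprecated(reason: str, pending: bool = False):
    # Illustrative sketch of a deprecation decorator (not LangChain's real one).
    # With pending=True, callers get a PendingDeprecationWarning instead of a
    # DeprecationWarning, signaling the API still works but will change later.
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            category = PendingDeprecationWarning if pending else DeprecationWarning
            warnings.warn(
                f"{fn.__name__} is deprecated: {reason}", category, stacklevel=2
            )
            return fn(*args, **kwargs)
        return inner
    return wrap


@deprecated("use astream() instead", pending=True)
def stream(prompt: str) -> str:
    # Toy stand-in for a streaming method.
    return prompt.upper()
```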
If you're interested in contributing to LangChain, I would encourage you to go ahead and create a pull request with your proposed fix. The maintainers will review your changes and determine if it aligns with the project's goals.
Thank you for your interest in improving LangChain!
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
I'm getting the same issue in my app. Chainlit offers a generic langchain provider that streams the model backend request provided by an llm input. When I add support for updating slider inputs in a playground request like so:
```python
result = await make_async(self.llm.stream)(
    input=messages,  # user prompt
    **request.prompt.settings,  # user inputs like temperature, top_k, etc.
)
```
I get the error above.
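For reference, the pattern above (running a blocking stream in a worker thread) can be sketched with the standard library alone. Here `sync_stream` is a hypothetical stand-in for `llm.stream`, and the wrapper below is not Chainlit's actual `make_async` helper, just the same idea:

```python
import asyncio
from typing import Iterator, List


def sync_stream(prompt: str, temperature: float = 0.7) -> Iterator[str]:
    # Hypothetical stand-in for a blocking llm.stream() call:
    # yields the response one token at a time.
    for token in prompt.split():
        yield token


async def astream(prompt: str, **settings) -> List[str]:
    # Run the blocking generator in a worker thread and collect its
    # output, mirroring the make_async-style wrapper used above.
    def collect() -> List[str]:
        return list(sync_stream(prompt, **settings))
    return await asyncio.to_thread(collect)


if __name__ == "__main__":
    tokens = asyncio.run(astream("streaming works here", temperature=0.2))
    print(tokens)  # ['streaming', 'works', 'here']
```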
When I downgrade to v0.0.350 I also receive the same error, so I am unsure if the issue is in the specific PR you mentioned.
The out-of-the-box GenericLangchainProvider in chainlit for v0.0.352 doesn't work because of this issue.
Edit: On followup testing, it appears that v0.0.349 is the release that is working for me.
https://github.com/langchain-ai/langchain/assets/43555799/827dfcfd-d66f-42ec-aa47-c4ae1df29bbf
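For anyone else hitting this, a temporary workaround based on the comment above is to pin the last release that worked in this thread (v0.0.349; verify against your own setup before relying on it):

```shell
# Temporary workaround, not a fix: pin the last known-good release noted above.
pip install "langchain==0.0.349"
```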
The issue still persists in the latest version. Please provide guidance on how to handle streaming in ChatOllama. @v-byte-cpu do you have any updates?
Issue was fixed in #15104
I'm facing this issue for GooglePalm in the latest version
System Info
langchain version: v0.0.352
python version: 3.11
Hi there! After PR https://github.com/langchain-ai/langchain/pull/14713 was merged, I started getting errors in the stream() method. I guess this decorator must have a pending=True argument.
Who can help?
@hwchase17 @agola11
Information
Related Components
Reproduction
Expected behavior
Successful streaming output from the LLM.