Closed marcovirgolin closed 1 year ago
Hi! Thanks for the bug report! I will look into this.
@marcovirgolin I don't think there's a bug in the code, but likely a small error in how you are using `BaseLangchainStreamingResponse`. In case of multiple inputs, your FastAPI application should look something like:
```python
from fastapi import Depends, FastAPI
# imports added for completeness; exact module paths may differ by version
from fastapi_async_langchain.responses import LLMChainStreamingResponse
from langchain.chains import ConversationChain

app = FastAPI()

@app.post("/chat")
async def chat(
    request: QueryRequest,
    chain: ConversationChain = Depends(conversation_chain),
) -> LLMChainStreamingResponse:
    return LLMChainStreamingResponse.from_chain(
        chain, request.dict(), media_type="text/event-stream"
    )
```
Here `request.dict()` is a dictionary: `{"number-one": "<value>", "number-two": "<value>"}`, and `QueryRequest` is a Pydantic model that defines the request body parameters.
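For reference, a minimal sketch of such a `QueryRequest` model (the field names come from the example above; mapping the hyphenated keys via aliases is my assumption, since hyphens are not valid in Python attribute names):

```python
from pydantic import BaseModel, Field

class QueryRequest(BaseModel):
    # hyphenated request keys mapped to valid Python attribute names
    number_one: str = Field(alias="number-one")
    number_two: str = Field(alias="number-two")

# request.dict(by_alias=True) then yields
# {"number-one": "<value>", "number-two": "<value>"}
```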
Regarding your second question about `self.background`: it is supposed to be an instance of `BackgroundTask`. You can learn more about them here: https://fastapi.tiangolo.com/tutorial/background-tasks/
We use it to run some operation after the chain execution is complete. It is useful in situations where you might want to update a database with new input and chain output.
Thank you, it's very possible that I am doing something wrong. Thanks also for explaining `background`.
I know I'm passing a dict correctly. Have you also tried with `LLMChain` instead of `ConversationChain`? Is it crucial to pass the chain as a dependency instead of creating it within the body of the function? (I tried both.)
I will try more tomorrow.
If possible, can you share your script? It will be helpful for me to debug this error.
@ajndkr my bad, I was missing the `.from_chain` (the previous version did not need that).
I tried now with the most recent version, using `StreamingResponse`, and it works fine too.
Thank you very much <3
I am trying the library with an `LLMChain` instead of a `ConversationChain`. My chain has custom inputs. Say, for example, the prompt is
and thus
This leads to the error:
from `BaseLangchainStreamingResponse(StreamingResponse)`.
LIB VERSION
langchain==0.0.157
fastapi_async_langchain==0.4.3
MY TRACEBACK
This originates at line 43 of `BaseLangchainStreamingResponse`:
which gives the exception:
Basically, my input variables are being ignored/lost.
What happens next is that we end up in the except block below; however, `self.background` (whose purpose I don't know) has no attribute `kwargs`.
SUMMARY
I do not know why, but since the recent changes, passing of input variables seems to be broken when using `LLMChain` (I don't know about `ConversationChain`). It was working fine before. Moreover, it is assumed that `background` has `kwargs`, but that might not always be the case.
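A minimal sketch of the defensive check I have in mind (the `Response` class here is hypothetical, standing in for the library's response class, not its actual code): only touch `background.kwargs` when a task is set and the attribute actually exists.

```python
class Response:
    """Hypothetical stand-in for the library's streaming response class."""

    def __init__(self, background=None):
        self.background = background

    def finalize(self, outputs: dict) -> None:
        # guard both "no background task" and "task without kwargs"
        if self.background is not None and hasattr(self.background, "kwargs"):
            self.background.kwargs.update(outputs)

Response().finalize({"text": "done"})  # no AttributeError without a task
```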