Chainlit / cookbook

Chainlit's cookbook repo
https://github.com/Chainlit/chainlit

how to write the @cl.on_chat_resume with retrieval? #56

Open wangpengfei2048 opened 8 months ago

wangpengfei2048 commented 8 months ago
I used the code from the example for `@cl.on_chat_resume`, and the chat works.

But I want to use retrieval. Can you give an example?

I tried the code from the LangChain example, but it fails:

```python
template = """Answer the question based only on the following context: {context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
model = ChatOpenAI()

retrieval_chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | model
    | StrOutputParser()
)
```
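For context, the first step of this chain fans the raw chain input out to every value in the dict, so the chain expects a plain string as its input. A plain-Python sketch of that fan-out (no LangChain; `fake_retriever` and `parallel_step` are illustrative names, not LangChain APIs):

```python
# Sketch of the chain's first step: every value in the dict receives the
# SAME chain input, so the retriever sees whatever you pass to astream().
def fake_retriever(query):
    # A real retriever tokenizes/embeds the query, so it must be a string.
    if not isinstance(query, str):
        raise TypeError("expected string or buffer")
    return [f"doc about {query}"]

def parallel_step(chain_input):
    # Mirrors {"context": retriever, "question": RunnablePassthrough()}
    return {
        "context": fake_retriever(chain_input),
        "question": chain_input,
    }

# Passing a plain string works; passing a dict would send the whole dict
# to the retriever and raise, which is the failure mode discussed below.
print(parallel_step("what is Chainlit?"))
```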

wangpengfei2048 commented 8 months ago

I use this example:

```python
prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful chatbot"),
        MessagesPlaceholder(variable_name="history"),
        ("human", "{question}"),
    ]
)

runnable = (
    RunnablePassthrough.assign(
        history=RunnableLambda(memory.load_memory_variables) | itemgetter("history")
    )
    | prompt
    | model
    | StrOutputParser()
)
cl.user_session.set("runnable", runnable)
```
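For reference, `RunnablePassthrough.assign(...)` keeps the incoming dict and adds the computed key. A rough plain-Python equivalent (the `load_memory_variables` stub stands in for the real memory object; not LangChain code):

```python
# Sketch of RunnablePassthrough.assign(history=...): the input dict is kept
# as-is and a "history" key computed from it is added.
def load_memory_variables(_inputs):
    # Stub for memory.load_memory_variables; returns a dict holding "history".
    return {"history": ["human: hi", "ai: hello"]}

def assign_history(chain_input):
    out = dict(chain_input)  # keep the original keys
    out["history"] = load_memory_variables(chain_input)["history"]
    return out

print(assign_history({"question": "how are you?"}))
```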
willydouhard commented 8 months ago

Here is an example with a retriever. You can combine this with the resume example.
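A rough sketch of how the two could be combined, assuming a hypothetical `build_retriever()` helper that returns your vector store retriever (all names here are illustrative, not the cookbook's exact code; import paths may differ by LangChain version):

```python
import chainlit as cl
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_openai import ChatOpenAI

def build_chain():
    # build_retriever() is a hypothetical helper, e.g. returning
    # Chroma(...).as_retriever().
    retriever = build_retriever()
    prompt = ChatPromptTemplate.from_template(
        "Answer the question based only on the following context: {context}\n\n"
        "Question: {question}"
    )
    return (
        {"context": retriever, "question": RunnablePassthrough()}
        | prompt
        | ChatOpenAI()
        | StrOutputParser()
    )

@cl.on_chat_start
async def on_chat_start():
    cl.user_session.set("runnable", build_chain())

@cl.on_chat_resume
async def on_chat_resume(thread):
    # The chain and its retriever are not serialized with the thread, so
    # rebuild them here; the restored message history takes care of itself.
    cl.user_session.set("runnable", build_chain())
```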

wangpengfei2048 commented 8 months ago

I did it as in the example, but there is an error when executing:

```python
async for chunk in runnable.astream(
    {"question": message.content},
    config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
):
    await res.stream_token(chunk)
```

```
Traceback (most recent call last):
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "/usr/local/project/rag-app-3/main.py", line 130, in onMessage
    await app.question_anwsering(message.content, False)
  File "/usr/local/project/rag-app-3/app.py", line 277, in question_anwsering
    async for chunk in runnable.astream(
  [... repeated streaming frames in langchain_core/runnables/base.py elided ...]
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/langchain_core/retrievers.py", line 127, in ainvoke
    return await self.aget_relevant_documents(
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/langchain/retrievers/merger_retriever.py", line 57, in _aget_relevant_documents
    merged_documents = await self.amerge_documents(query, run_manager)
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/langchain_core/vectorstores.py", line 676, in _aget_relevant_documents
    docs = await self.vectorstore.asimilarity_search(
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/langchain_community/vectorstores/chroma.py", line 348, in similarity_search
    docs_and_scores = self.similarity_search_with_score(query, k, filter=filter)
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/langchain_community/vectorstores/chroma.py", line 432, in similarity_search_with_score
    query_embedding = self._embedding_function.embed_query(query)
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/langchain_community/embeddings/openai.py", line 696, in embed_query
    return self.embed_documents([text])[0]
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/langchain_community/embeddings/openai.py", line 470, in _get_len_safe_embeddings
    token = encoding.encode(
  File "/usr/local/project/rag-app-3/venv/lib/python3.10/site-packages/tiktoken/core.py", line 116, in encode
    if match := _special_token_regex(disallowed_special).search(text):
TypeError: expected string or buffer
```
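The final frame is the clue: tiktoken runs a regex search over the query text, but the retriever received the whole `{"question": ...}` dict and forwarded it as the query. The same failure shape can be reproduced in plain Python with the stdlib `re` module (which raises a near-identical TypeError on non-string input):

```python
import re

# Handing a dict to a regex search, as tiktoken effectively did with the
# chain's dict input, raises the same kind of TypeError as in the traceback.
try:
    re.compile("endoftext").search({"question": "hi"})
except TypeError as exc:
    print(type(exc).__name__)
```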

wangpengfei2048 commented 8 months ago

This code works:

```python
async for chunk in runnable.astream(
    message.content,
    config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
):
    await res.stream_token(chunk)
```

willydouhard commented 8 months ago

Maybe the issue is that you were not passing message.content directly?
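For reference, if the chain should accept a dict input instead, the usual LCEL pattern is to extract the string in each branch, e.g. `{"context": itemgetter("question") | retriever, "question": itemgetter("question")}`. A plain-Python sketch of that shape (`fake_retriever` is an illustrative stand-in, not a LangChain API):

```python
from operator import itemgetter

def fake_retriever(query):
    # Stand-in for a real retriever; only ever sees a plain string here.
    return [f"doc about {query}"]

def parallel_step(chain_input):
    # Each branch extracts the "question" string before using it, mirroring
    # {"context": itemgetter("question") | retriever,
    #  "question": itemgetter("question")}
    question = itemgetter("question")(chain_input)
    return {"context": fake_retriever(question), "question": question}

print(parallel_step({"question": "what is Chainlit?"}))
```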