MarcSkovMadsen opened 11 months ago
Maybe the basic example above works as expected because it is only expected to add the ChatMessage. But this agent does not show chain of thought either:
```python
from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, streaming=True, callbacks=[callback_handler])
tools = load_tools(["ddg-search"], callbacks=[callback_handler])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True, callbacks=[callback_handler]
)
instance.servable()
```
Streamlit version
```python
from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)
```
The Math Assistant example in panel-chat-examples uses the `PanelCallbackHandler`. Maybe it shows me something like chain of thought. But it's really, really hard for me to know, as the user experience is nowhere near that of the `StreamlitCallbackHandler`.
Examples not working because https://github.com/holoviz-topics/panel-chat-examples/pull/67 isn't merged yet.
> Lots of methods not implemented

Need clarification. It has all the methods listed on BaseCallbackHandler: https://python.langchain.com/docs/modules/callbacks/
> Examples not working because holoviz-topics/panel-chat-examples#67 isn't merged yet.
I don't understand 👍. I'm reporting issues with the `main` branch of Panel, primarily?
> Inherits object when langchain not installed.

This is done to prevent import errors. This will eventually be migrated to LangChain, I think.
> Lots of methods not implemented
>
> Need clarification. It has all the methods listed on BaseCallbackHandler https://python.langchain.com/docs/modules/callbacks/

Thanks. I'm referring to the picture posted above. Maybe it's because right now I don't see any chain of thought, so it's hard for me to imagine how this should work. But in that picture lots of the methods don't do anything to the Panel `ChatInterface`. They just call `super()`.
Could you share a video and code of the `PanelCallbackHandler` working @ahuang11? That would help me a lot to understand how you would expect this to work. Thanks.
There is no chain of thought because there's no agents / tools involved in this. This is simply a wrapper of OpenAI generation.
```python
import panel as pn
from langchain.llms import OpenAI

pn.extension()

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, callbacks=[callback_handler])
instance.servable()
```
Thanks. What about the "duck duck go" and "maths assistant" examples I'm referring to?
I assume there's no chain of thought there because it did not need to use the tool.
Try https://github.com/holoviz/panel/blob/main/examples/reference/chat/PanelCallbackHandler.ipynb
import panel as pn
from langchain.agents import AgentType, load_tools, initialize_agent
from langchain.llms import OpenAI
pn.extension()
async def callback(contents, *args):
await agent.arun(contents)
instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, callbacks=[callback_handler], streaming=True)
tools = load_tools(["serpapi", "llm-math"], llm=llm, callbacks=[callback_handler])
agent = initialize_agent(
tools,
llm,
agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
callbacks=[callback_handler],
)
instance.servable()
I believe this issue is mostly feature enhancements, and I appreciate the issue report. I don't think this should block the release of 1.3.0.
Please feel free to create PRs to improve the Langchain integration.
What is the `serpapi` for? I don't have a key. Is it necessary?
I believe it's a search tool. You could probably use `duckduckgo` instead.
How would you explain that the "duck duck go" example above (https://github.com/holoviz/panel/issues/5679#issuecomment-1769110035) shows chain of thought for Streamlit but not for Panel?
Unfortunately, I don't have enough time to investigate, but would like to understand it better too!
I can also get chain of thought with the Maths Assistant. But the example at panel-chat-examples has issues: https://github.com/holoviz-topics/panel-chat-examples/issues/68.
For me the main issues are:

- Naming of the `PanelCallbackHandler`: rename it to `pn.chat.langchain.PanelCallBackhandler` or alternatively `pn.chat.LangChainCallbackHandler` to make it clear the `PanelCallbackHandler` is for LangChain. This is very hard to change later.
- Some working examples will help.
- Some videos of the examples working in the Reference Notebook will help.

> Some working examples will help

I believe all the examples within the reference gallery + panel-chat-examples are working.

> Some videos of the examples working in the Reference Notebook will help

Perhaps a link to the panel-chat-examples langchain directory.

> This is very hard to change later.

I imagine it'll be in LangChain so it can simply be `PanelCallbackHandler`, like Streamlit's is `StreamlitCallbackHandler`. https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py#L225
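For readers following the discussion, the mechanism both handlers implement can be sketched in plain Python: the LLM/agent fires events, and the handler turns them into chat messages. All names below (`ChatLog`, `SketchCallbackHandler`) are illustrative stand-ins, not the real Panel or LangChain API; only the hook names `on_tool_start` and `on_llm_new_token` mirror LangChain's `BaseCallbackHandler`.

```python
class ChatLog:
    """Stand-in for a chat UI such as pn.chat.ChatInterface (illustrative only)."""

    def __init__(self):
        self.messages = []  # list of (user, text) tuples

    def stream(self, token, user):
        # Append the token to the last message from `user`, or start a new message.
        if self.messages and self.messages[-1][0] == user:
            self.messages[-1] = (user, self.messages[-1][1] + token)
        else:
            self.messages.append((user, token))


class SketchCallbackHandler:
    """Turns LLM/agent events into chat messages. The hook names mirror
    LangChain's BaseCallbackHandler; everything else is a sketch."""

    def __init__(self, log):
        self.log = log

    def on_tool_start(self, tool_name, input_str):
        self.log.stream(f"Using tool {tool_name} on {input_str!r}\n", user="Tool")

    def on_llm_new_token(self, token):
        self.log.stream(token, user="LangChain")


log = ChatLog()
handler = SketchCallbackHandler(log)
handler.on_tool_start("ddg-search", "weather today")
for tok in ["It ", "is ", "sunny."]:
    handler.on_llm_new_token(tok)
# log.messages now holds one "Tool" message and one streamed "LangChain" message
```

The chain-of-thought display everyone is comparing is just this: how many of those event hooks actually write something into the chat UI.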
To address the missing visual aids, I added links in https://github.com/holoviz/panel/pull/5681
Sounds good. I will give the reference notebook an iteration.
I got the duck duck go example working to some extent by fixing the code:
```python
from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True,
)

async def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    await agent.arun(contents, callbacks=[callback_handler])

pn.chat.ChatInterface(callback=callback).servable()
```
But it does not show me the full search results. That is what Streamlit does, right?
Hi. When making a retrieval augmented generation app in Panel, where the LLM queries a vector database created from some documents, using `PanelCallbackHandler` displays the whole chain of thought process, which includes:

Is there a way to not display 1 and 3? I just want to be able to stream the response from the LLM, and name the chatbot "Bot".
Here is my code, if it is relevant.
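One possible approach, sketched below with stand-in classes, is to subclass the handler and override the intermediate-event hooks with no-ops, so only streamed LLM tokens reach the chat window. `VerboseHandler` and `ResponseOnlyHandler` are invented for illustration; whether the real `PanelCallbackHandler` tolerates this depends on its implementation, and only the hook names mirror LangChain's `BaseCallbackHandler`.

```python
class VerboseHandler:
    """Stand-in base handler that displays every chain-of-thought event."""

    def __init__(self):
        self.displayed = []

    def on_chain_start(self, *args, **kwargs):
        self.displayed.append("chain started")

    def on_tool_start(self, *args, **kwargs):
        self.displayed.append("tool started")

    def on_llm_new_token(self, token, **kwargs):
        self.displayed.append(token)


class ResponseOnlyHandler(VerboseHandler):
    """Override the intermediate events with no-ops so only streamed
    LLM tokens are shown."""

    def on_chain_start(self, *args, **kwargs):
        pass

    def on_tool_start(self, *args, **kwargs):
        pass


h = ResponseOnlyHandler()
h.on_chain_start()
h.on_tool_start()
h.on_llm_new_token("Hello")
# h.displayed contains only the streamed token
```

Renaming the bot would presumably need a separate configuration knob, which ties into the "Cannot configure" point raised earlier in this issue.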
I'm running the latest `main` branch of Panel, testing out the `PanelCallbackHandler`.

- `PanelCallBackhandler` leads to bad practice code. (To be solved by #5682)
- `PanelCallbackHandler` does not show the final response, only the final output from the last tool. (To be solved by #5682)
- Not clear that `PanelCallbackHandler` is for langchain. (To be solved by #5682)
- `PanelCallBackHandler` inherits `object` when LangChain not installed.
- Output from `PanelCallbackHandler` not clear.
- `AsyncPanelCallbackHandler`.

## Basic Example not working

I would expect it to give me chain of thought similar to how Streamlit does it. But it does not show me anything that I could not see just using the `ChatInterface`.

https://github.com/holoviz/panel/assets/42288570/46315581-230b-4bae-b8b2-a3d05417c38d
## Cannot configure `PanelCallbackHandler`

I have no args for configuring the `PanelCallbackHandler`. Compare this to the `StreamlitCallbackHandler`: https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py.

Besides the above, Streamlit also outputs the chain of thought to stdout, which is really helpful because it can help me create a log of what happens that I can analyze later. I would like that as an option too.
## Lots of methods not implemented

Looking at the `PanelCallBackHandler` I see so many of the methods not implemented: they just call the `super().xyz` method. To me it's a signal that only selected functionality is implemented? Compare this to https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py
## Inherits `object` when `langchain` not installed

Pyright complains that the `PanelCallbackHandler` inherits `object` when `langchain` is not installed. Maybe we can live with this, but it is theoretically a problem that we run lots of `super().xyz` methods that do not exist on `object`.
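The pattern under discussion is, I believe, the fallback-base-class idiom sketched below: if `langchain` cannot be imported, the base class becomes plain `object`, so importing the module never hard-fails. The module name `_langchain_missing_demo`, the class `CallbackHandlerSketch`, and the guarded `super()` lookup are illustrative, not Panel's actual code.

```python
try:
    # Stand-in for the real langchain import; this module does not exist,
    # so the fallback branch below runs.
    from _langchain_missing_demo import BaseCallbackHandler
except ImportError:
    # Fallback so the module can always be imported; the cost is that the
    # class then inherits from plain `object`.
    BaseCallbackHandler = object


class CallbackHandlerSketch(BaseCallbackHandler):
    def on_llm_start(self, *args, **kwargs):
        # With the fallback base, `object` has no `on_llm_start`, so an
        # unconditional super().on_llm_start(...) would raise AttributeError
        # at runtime and upsets Pyright statically. Guarding the lookup is
        # one way to keep both happy:
        parent = getattr(super(), "on_llm_start", None)
        if parent is not None:
            parent(*args, **kwargs)
        return "handled"
```

This shows why the `super().xyz` calls are only theoretically safe: they work solely because the handler's methods are never invoked when `langchain` is absent.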
## `PanelCallBackhandler` leads to bad practice code

The below code is the first example in the reference guide and illustrates some issues:

- In the `callback` the `llm` is not defined. This is bad practice and could lead to issues.
- The `ChatInterface` instance and the `PanelCallbackHandler` reference each other. This is quirky and hard to remember.

Compare this to the Streamlit code or the reference example. I like this Panel version better. It also makes it easier for users to declare the `llm`
once, for example in a separate module, or by using caching.

## Not clear that `PanelCallbackHandler` is for langchain

As long as the callback handler is part of the Panel code base, I think it's better to make LangChain more explicit: either via `langchain`, as in `from panel.chat.langchain import PanelCallbackHandler`, or via `LangChainCallbackHandler`, i.e. `pn.chat.LangChainCallbackHandler`.

## Reference Notebook does not explain or show what to expect from the code
The reference notebook only contains code. It does not contain any text or videos that help the user understand how these examples should work, which led to confusion for me as I had other expectations.
## Output from CallbackHandler not clear

What does the below mean? Is it really correctly formatted?