holoviz / panel

Panel: The powerful data exploration & web app framework for Python
https://panel.holoviz.org
BSD 3-Clause "New" or "Revised" License

PanelCallbackHandler feature enhancements #5679

Open MarcSkovMadsen opened 11 months ago

MarcSkovMadsen commented 11 months ago

I'm running the latest main branch of Panel testing out the PanelCallbackHandler.

Basic Example not working

I would expect it to give me chain of thought similar to how Streamlit does it. But it does not show me anything that I could not see just using the ChatInterface.

[screenshot]

https://github.com/holoviz/panel/assets/42288570/46315581-230b-4bae-b8b2-a3d05417c38d

import panel as pn
from langchain.llms import OpenAI

pn.extension()

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, callbacks=[callback_handler])

instance.servable()

Cannot configure PanelCallbackHandler

There are no arguments for configuring the PanelCallbackHandler.

[screenshot]

Compare this to the StreamlitCallbackHandler https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py.

[screenshot]

Besides the above, Streamlit also outputs the chain of thought to stdout, which is really helpful because it lets me create a log of what happens that I can analyze later. I would like that as an option too.
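A minimal pure-Python sketch of that option (the class and method names here are made up for illustration; neither Panel nor LangChain exposes this exact API):

```python
import io
import sys

class StdoutMirrorHandler:
    """Toy handler that appends events to a sink and optionally mirrors
    them to a stream, emulating the requested log-to-stdout option."""

    def __init__(self, sink, stream=None):
        self.sink = sink                    # e.g. a chat interface; a list here
        self.stream = stream or sys.stdout  # where the log copy goes

    def on_llm_new_token(self, token):
        self.sink.append(token)
        print(token, end="", file=self.stream)

# usage: collect tokens in the UI sink while keeping a plain-text log
log = io.StringIO()
sink = []
handler = StdoutMirrorHandler(sink, stream=log)
for tok in ["Hello", ", ", "world"]:
    handler.on_llm_new_token(tok)
```

The same two-destination idea could be exposed as a constructor flag on PanelCallbackHandler.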

Lots of methods not implemented.

Looking at the PanelCallbackHandler, I see that many of the methods are not implemented; they do nothing but call the super().xyz method. To me that's a signal that only selected functionality is implemented?

[screenshot]

Compare this to https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py

Inherits from object when LangChain is not installed.

Pyright complains that the PanelCallbackHandler inherits from object when LangChain is not installed.

Maybe we can live with this, but it is theoretically a problem that we call lots of super().xyz methods that do not exist on object.

[screenshot]
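The guard pattern could look roughly like this (a simplified sketch of the optional-dependency idea, not Panel's actual implementation):

```python
# Simplified sketch (an assumption, not Panel's actual code) of the
# optional-dependency base-class pattern Pyright is complaining about.
try:
    from langchain.callbacks.base import BaseCallbackHandler as _Base
except ImportError:
    _Base = object  # fallback when langchain is not installed

class PanelCallbackHandler(_Base):
    def on_llm_start(self, *args, **kwargs):
        # Guard the super() call: object has no on_llm_start, so an
        # unguarded super().on_llm_start(...) would raise AttributeError
        # whenever langchain is missing.
        parent = super()
        if hasattr(parent, "on_llm_start"):
            return parent.on_llm_start(*args, **kwargs)

handler = PanelCallbackHandler()
result = handler.on_llm_start({}, ["prompt"], run_id=None)
```

With a guard like this the super() calls are at least safe at runtime, though a typing stub would be needed to fully satisfy Pyright.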

PanelCallbackHandler leads to bad-practice code.

The code below is the first example in the reference guide and illustrates some issues.

import panel as pn
from langchain.llms import OpenAI

pn.extension()

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, callbacks=[callback_handler])

instance.servable()

Compare this to the Streamlit code

from langchain.callbacks import StreamlitCallbackHandler
from langchain.llms import OpenAI
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = llm.predict(prompt, callbacks=[st_callback])
        st.write(response)

or the reference example

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)

I like this Panel version better.

import panel as pn
from langchain.llms import OpenAI

pn.extension()

llm = OpenAI(temperature=0)

def callback(contents, user, instance):
    callback_handler = pn.chat.PanelCallbackHandler(instance)
    llm.predict(contents, callbacks=[callback_handler])

instance = pn.chat.ChatInterface(callback=callback)
instance.servable()

It also makes it easier for users to declare the llm once, for example in a separate module, or by using caching.
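The "declare once" idea can be sketched with caching. Panel's own pn.cache decorator serves this role in apps; plain functools.lru_cache shows the mechanics (get_llm is a made-up name, and the returned object stands in for something expensive like OpenAI(temperature=0)):

```python
from functools import lru_cache

CONSTRUCTIONS = 0

@lru_cache(maxsize=1)
def get_llm():
    # Stand-in for an expensive client such as OpenAI(temperature=0);
    # with caching, construction runs once per process no matter how
    # many chat callbacks ask for it.
    global CONSTRUCTIONS
    CONSTRUCTIONS += 1
    return object()

first = get_llm()
second = get_llm()
```

Every callback invocation can then call get_llm() freely without paying the construction cost again.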

Not clear that PanelCallbackHandler is for LangChain

As long as the callback handler is part of the Panel code base, I think it's better to make the LangChain association more explicit.

Reference Notebook does not explain or show what to expect from the code.

The reference notebook only contains code. It does not contain any text or videos that could help the user understand how these examples should work. This leads to confusion, including for me, as I had other expectations.

Output from CallbackHandler not clear

What does the below mean? Is it really correctly formatted?

[screenshot]

MarcSkovMadsen commented 11 months ago

Duck Duck Go Agent not showing chain of thought

Maybe the basic example above works as expected because it is only expected to add the ChatMessage. But this agent does not show the chain of thought either.

[screenshot]

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, streaming=True, callbacks=[callback_handler])
tools = load_tools(["ddg-search"], callbacks=[callback_handler])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True, callbacks=[callback_handler]
)

instance.servable()

Streamlit version

[screenshot]

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
import streamlit as st

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)

MarcSkovMadsen commented 11 months ago

Maths Assistant not showing chain of thought either

The Math Assistant example in panel-chat-examples uses the PanelCallbackHandler.

Maybe it shows me something like a chain of thought. But it's really, really hard for me to know, as the user experience is nowhere near that of the StreamlitCallbackHandler.

[screenshot]

ahuang11 commented 11 months ago

Examples not working because https://github.com/holoviz-topics/panel-chat-examples/pull/67 isn't merged yet.

ahuang11 commented 11 months ago

> Lots of methods not implemented

Need clarification. It has all the methods listed on BaseCallbackHandler: https://python.langchain.com/docs/modules/callbacks/

MarcSkovMadsen commented 11 months ago

> Examples not working because holoviz-topics/panel-chat-examples#67 isn't merged yet.

I don't understand 👍. I'm reporting issues with the main branch of Panel, primarily?

ahuang11 commented 11 months ago

> Inherits object when langchain not installed.

This is done to prevent import errors. This will eventually be migrated to LangChain, I think.

MarcSkovMadsen commented 11 months ago

> Lots of methods not implemented
>
> Need clarification. It has all the methods listed on BaseCallbackHandler https://python.langchain.com/docs/modules/callbacks/

Thanks. I'm referring to the picture posted above. Maybe it's because right now I don't see any chain of thought, so it's hard for me to imagine how this should work.

But in that picture lots of the methods don't do anything to the Panel ChatInterface. They just call super().

MarcSkovMadsen commented 11 months ago

Could you share a video and code of the PanelCallbackHandler working @ahuang11? That would help me a lot to understand how you would expect this to work. Thanks.

ahuang11 commented 11 months ago

There is no chain of thought because there's no agents / tools involved in this. This is simply a wrapper of OpenAI generation.

import panel as pn
from langchain.llms import OpenAI

pn.extension()

def callback(contents, user, instance):
    llm.predict(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)

llm = OpenAI(temperature=0, callbacks=[callback_handler])

instance.servable()
MarcSkovMadsen commented 11 months ago

> There is no chain of thought because there's no agents / tools involved in this. This is simply a wrapper of OpenAI generation.
>
> import panel as pn
> from langchain.llms import OpenAI
>
> pn.extension()
>
> def callback(contents, user, instance):
>     llm.predict(contents)
>
> instance = pn.chat.ChatInterface(callback=callback)
> callback_handler = pn.chat.PanelCallbackHandler(instance)
>
> llm = OpenAI(temperature=0, callbacks=[callback_handler])
>
> instance.servable()

Thanks. What about the "duck duck go" and "maths assistant" examples I'm referring to?

ahuang11 commented 11 months ago

I assume there's no chain of thought there because it did not need to use the tool.

Try https://github.com/holoviz/panel/blob/main/examples/reference/chat/PanelCallbackHandler.ipynb

import panel as pn
from langchain.agents import AgentType, load_tools, initialize_agent
from langchain.llms import OpenAI

pn.extension()

async def callback(contents, *args):
    await agent.arun(contents)

instance = pn.chat.ChatInterface(callback=callback)
callback_handler = pn.chat.PanelCallbackHandler(instance)
llm = OpenAI(temperature=0, callbacks=[callback_handler], streaming=True)
tools = load_tools(["serpapi", "llm-math"], llm=llm, callbacks=[callback_handler])
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    callbacks=[callback_handler],
)

instance.servable()
[screenshot]

ahuang11 commented 11 months ago

I believe this issue is mostly feature enhancements, and I appreciate the issue report. I don't think this should block the release of 1.3.0.

Please feel free to create PRs to improve the Langchain integration.

MarcSkovMadsen commented 11 months ago

What is the serpapi for? I don't have a key. Is it necessary?

ahuang11 commented 11 months ago

I believe it's a search tool. You could probably use DuckDuckGo.

MarcSkovMadsen commented 11 months ago

How would you explain that the "duck duck go" example above (https://github.com/holoviz/panel/issues/5679#issuecomment-1769110035) shows the chain of thought for Streamlit but not for Panel?

ahuang11 commented 11 months ago

Unfortunately, I don't have enough time to investigate, but would like to understand it better too!

MarcSkovMadsen commented 11 months ago

I can also get chain of thought with the Maths Assistant.

[screenshot]

But the example at panel-chat-examples has issues https://github.com/holoviz-topics/panel-chat-examples/issues/68.

MarcSkovMadsen commented 11 months ago

> I believe this issue is mostly feature enhancements, and I appreciate the issue report. I don't think this should block the release of 1.3.0.
>
> Please feel free to create PRs to improve the Langchain integration.

For me the main issues are

ahuang11 commented 11 months ago

> Some working examples will help

I believe all the examples within the reference gallery + panel-chat-examples are working.

> Some videos of the examples working in the Reference Notebook will help

Perhaps a link to the panel-chat-examples langchain directory.

> This is very hard to change later.

I imagine it'll be in Langchain so it can simply be PanelCallbackHandler, like Streamlit's is StreamlitCallbackHandler. https://github.com/langchain-ai/langchain/blob/master/libs/langchain/langchain/callbacks/streamlit/streamlit_callback_handler.py#L225

ahuang11 commented 11 months ago

To address the missing visual aids, I added links in https://github.com/holoviz/panel/pull/5681

MarcSkovMadsen commented 11 months ago

Sounds good. I will give the reference notebook an iteration.

MarcSkovMadsen commented 11 months ago

I got the duck duck go example working to some extent by fixing the code:

from langchain.llms import OpenAI
from langchain.agents import AgentType, initialize_agent, load_tools
import panel as pn

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(
    tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True, 
)

async def callback(contents, user, instance):
    callback_handler = pn.chat.langchain.PanelCallbackHandler(instance)
    await agent.arun(contents, callbacks=[callback_handler])

pn.chat.ChatInterface(callback=callback).servable()

[screen recording: panel-callback-handler-agent]

But it does not show me the full search results; Streamlit does that, doesn't it?

[screenshot]

PrashantSaikia commented 6 months ago

Hi. When making a retrieval-augmented generation app in Panel, where the LLM queries a vector database created from some documents, using PanelCallbackHandler displays the whole chain-of-thought process, which includes:

  1. Showing the relevant documents fetched, by "LangChain (retriever)"
  2. Streaming the response from the LLM, by "LangChain (gpt-4-1106-preview)"
  3. And once the streaming is finished, the same response is copied and displayed again, by "Assistant".

Is there a way to not display 1 and 3? I just want to be able to stream the response from the LLM, and name the chatbot "Bot".

Here is my code, if it is relevant.
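One possible direction for suppressing messages 1 and 3 is to subclass the handler and override the hooks you want silenced. A toy sketch of that pattern (the hook names mirror LangChain's BaseCallbackHandler, but the classes below are illustrative stand-ins, not Panel's actual API):

```python
class ToyCallbackHandler:
    """Toy base emulating a callback handler that renders every event."""

    def __init__(self):
        self.rendered = []

    def on_retriever_end(self, documents, **kwargs):
        # renders the "relevant documents" message (item 1 above)
        self.rendered.append(("retriever", documents))

    def on_llm_new_token(self, token, **kwargs):
        # streams the LLM response (item 2 above)
        self.rendered.append(("token", token))

class QuietHandler(ToyCallbackHandler):
    """Subclass that silences the retriever message by overriding its hook."""

    def on_retriever_end(self, documents, **kwargs):
        pass  # drop the retrieved-documents rendering entirely

handler = QuietHandler()
handler.on_retriever_end(["doc-a", "doc-b"])
handler.on_llm_new_token("Hello")
```

In the same spirit, a subclass of the real PanelCallbackHandler could override the hooks that produce messages 1 and 3 while leaving token streaming intact; whether that covers the duplicated "Assistant" message depends on how the final response is emitted.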