Open riyavsinha opened 5 hours ago
Thanks for sharing your use case and example. It would be great to scope out what this cell might look like and why it needs to exist (what are the gaps compared to a plain Python cell?). What would the inputs be (text, images, etc.), and what would the outputs be (markdown, structured logs, charts, etc.)?
For example, I can get pretty far with two cells (instead of one) to invoke and run an agent. Would this work for a proof of concept? Could you share the agent you built with LangChain so I can try to understand what further features could be supported?
```python
import marimo

app = marimo.App(width="medium")


@app.cell
def __():
    from agents import my_agent
    return (my_agent,)


@app.cell
def __():
    import marimo as mo

    input = mo.ui.text_area(placeholder="Query").form(bordered=False)
    input
    return input, mo


@app.cell
def __(input, my_agent):
    my_agent(input.value)
    return


if __name__ == "__main__":
    app.run()
```
Description
My use case is that I have a LangChain/LangGraph agent that I would like to import and use in marimo.
I understand `mo.ai` enables LLM access, and `mo.ui.chat` renders a chat box to interface with an LLM, even with the capability to use tools. I also understand that AI cells can currently be used to write Python/SQL code.

I think marimo specifically is missing an opportunity to offer a unique interface to LLM agents. Instead of rendering a chat window, I'd like to create a "chat" cell, in the same way that a language adapter is built for constructing md/sql cells.
In a data exploration context, this could be very useful for making each LLM input+output its own cell. Using `mo.ui.chat` is insufficient: data exploration may happen in a sequential order like a chat session, but the "findings" may need to be rearranged to present a synthesized argument, which is why the notebook experience is more useful. Similarly, the AI cells are great and quite a bit closer to what would work, but they fall short in that the user's query is not saved.
Suggested solution
I would love code that looks something along the lines of the following.
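Purely as a sketch of what I mean (none of this is an existing marimo API — the `language="chat"` argument and the `mo.chat(...)` call are hypothetical, modeled on how SQL cells persist their query as the cell's source):

```python
# Hypothetical "chat" cell, analogous to marimo's sql language adapter.
# The user's natural-language query is saved in the notebook file as the
# cell's source, and the agent's response becomes the cell's output.

@app.cell(language="chat")  # hypothetical adapter name
def __(mo, my_agent):
    response = mo.chat(
        "What correlations do you see between age and churn?",
        agent=my_agent,  # hypothetical: route the query to a LangChain/LangGraph agent
    )
    return (response,)
```

The key property is that the query lives in the cell rather than in a transient chat widget, so individual input+output pairs can be reordered like any other notebook cell.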
Alternative
No response
Additional context
I'm happy to help with implementing this if we can establish a path forward!
I'm currently in the process of resolving package dependencies for the latest LangChain/LangGraph versions by helping build Pyodide/WASM-compatible versions, but I think the older LangChain 0.2 / LangGraph 0.1 versions work for now.