dexhorthy opened 6 days ago
Hey @dexhorthy! Oh man you're reading our minds.
We're trying to figure out the right representation for this -- we've been playing with low-level logs, rich printing, and something as big as a TUI à la Textual. Don't want to overdo it though.
@jlowin there's a flag for this now, right? But maybe concurrent TUI frames have been wacky or something?
Yes -- as of a week ago we printed everything, and the feedback was that it's too much, which is how we ended up in this more compressed state. I want to make this configurable. We do have an experimental TUI intended to let you expand things dynamically as they're running, but it's not 100% right now. I was going to open a tracking issue for making verbosity more configurable... never mind, that's obviously what this is.
In the meantime, set `CONTROLFLOW_TOOLS_VERBOSE=1` (or `controlflow.settings.tools_verbose = True` at runtime) to print everything.
for sure, yeah -- the runtime setting change worked (didn't try the env var yet). Thanks!
I can see what y'all mean about the verbose one being real extra loud 🙂 -- the "show me the inputs" part was great, but the "show me the outputs" part could stand to be truncated or otherwise summarized. Or maybe I just need to make leaner tools 😉
Thinking more on this, I guess what I really want is some kind of streaming or event-based interface where I can implement what happens `on_tool_call` / `on_tool_result`.
That would let me customize what I print, and also make it easy to stitch these cf flows or tasks into parts of, e.g., a FastAPI server and stream progress back to the user.
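Just to make the shape concrete, here's a minimal sketch of the kind of hook interface I mean. Everything here (`ToolHooks`, the hook names, `run_tool`) is made up for illustration -- it's not an existing ControlFlow API:

```python
from dataclasses import dataclass
from typing import Any, Callable

# Hypothetical hook container -- just the shape I'm imagining, not a real API.
@dataclass
class ToolHooks:
    on_tool_call: Callable[[str, dict], None]    # fires when a tool starts, with its params
    on_tool_result: Callable[[str, Any], None]   # fires when a tool finishes, with its result

def run_tool(name: str, params: dict, fn: Callable[..., Any], hooks: ToolHooks) -> Any:
    """Pretend framework internals: invoke a tool and fire hooks around it."""
    hooks.on_tool_call(name, params)
    result = fn(**params)
    hooks.on_tool_result(name, result)
    return result

# A consumer could then decide what to print (or stream to a client) per event:
events: list[tuple] = []
hooks = ToolHooks(
    on_tool_call=lambda name, params: events.append(("call", name, params)),
    on_tool_result=lambda name, result: events.append(("result", name, str(result)[:200])),
)
run_tool("add", {"a": 2, "b": 3}, lambda a, b: a + b, hooks)
print(events)
```

The same two callbacks could just as easily push server-sent events from a FastAPI endpoint instead of appending to a list.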
so far we're having a lot of fun over here. Had a question -- I'd imagine there's probably an easy-ish way to do this, but I haven't done a ton w/ Prefect so bear with me.
I'm interested to know not just when tool calls finish, but also when they start, and what params they were started with. This helps a lot during development as I'm trying to document the functions/tools I'm building and make sure the LLM can use them well.
I did manage to spin up the Prefect web UI, start a server in another shell, and trigger a deployment from a third shell, but I'm wondering if there's a way to just up the verbosity of the logs a little without leaving the CLI/IDE.
Here's a subset of the code, minus the tool impls:
I'd be looking for a way to opt in to something like showing the incoming params, and maybe just the first 200 chars of the response if it's real big.
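Something like this little helper is what I mean by "first 200 chars" -- a made-up sketch, not anything that exists in ControlFlow:

```python
def summarize_result(result: object, limit: int = 200) -> str:
    """Render a tool result for logging, truncating big outputs to `limit` chars."""
    text = str(result)
    if len(text) <= limit:
        return text
    # Keep the head of the output and note how much was cut.
    return text[:limit] + f"... [{len(text) - limit} more chars]"

print(summarize_result("short output"))   # short output
print(summarize_result("x" * 500)[-20:])  # tail shows the truncation note
```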
just riffing here - let me know what y'all think or if I missed a flag/option