El-Hassan-Hajbi opened this issue 2 weeks ago
Hi @El-Hassan-Hajbi, what do you mean by callbacks? Are you talking about prices?
I'm interested in catching the on_llm_start and on_llm_end callbacks for each generate answer node and integrating them with Langfuse, to see the traces of my scraping graph.
I can see an llm_custom_callback class, but I'm not sure how to use it.
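Roughly, this is the kind of handler I have in mind. This is just a sketch with standard LangChain callbacks; what I haven't figured out is how Scrapegraph-ai actually forwards callbacks into its nodes (the graph config key below is my assumption, not a confirmed API):

```python
from langchain_core.callbacks import BaseCallbackHandler


class TraceLLMCallback(BaseCallbackHandler):
    """Logs every LLM call made by a node, e.g. to forward it to Langfuse later."""

    def on_llm_start(self, serialized, prompts, **kwargs):
        # Fired right before the model is invoked; `prompts` is the list of
        # rendered prompt strings for this call. (Chat models fire
        # on_chat_model_start instead, but on_llm_end is shared.)
        print(f"LLM start: {len(prompts)} prompt(s)")

    def on_llm_end(self, response, **kwargs):
        # Fired when the model returns; response.generations holds the
        # completions and response.llm_output usually holds token usage.
        print(f"LLM end: {response.llm_output}")


# With plain LangChain you would pass it per call, e.g.
#   llm.invoke(prompt, config={"callbacks": [TraceLLMCallback()]})
# What I'm missing is where to plug this into the ScrapeGraphAI graph config.
```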
Look at https://github.com/ScrapeGraphAI/Scrapegraph-ai/blob/main/examples/openai/smart_scraper_openai.py. You should look at print(prettify_exec_info(graph_exec_info)). If you want to integrate Langfuse, we can work together if you want.
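Roughly what that example does (config keys and model name may differ depending on the version you are on):

```python
from scrapegraphai.graphs import SmartScraperGraph
from scrapegraphai.utils import prettify_exec_info

graph_config = {
    "llm": {
        "api_key": "YOUR_OPENAI_API_KEY",
        "model": "gpt-3.5-turbo",
    },
    "verbose": True,
}

smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the projects with their descriptions",
    source="https://perinim.github.io/projects",
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)

# Per-node execution stats (tokens, cost, time) collected during the run
graph_exec_info = smart_scraper_graph.get_execution_info()
print(prettify_exec_info(graph_exec_info))
```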
I'm one of the Langfuse maintainers. Met @PeriniM a while back and was really impressed by the project. Happy to help in case you have any questions while adding a Langfuse integration!
prettify_exec_info is very good, but it doesn't have the features of Langfuse. I think having a trace of all the scraping graphs is very powerful: it enables the user to monitor each node of the graph (prompts, parsed HTML, merged response, ...).
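To give a concrete idea of what I mean, something along these lines with the Langfuse Python SDK (v2-style API; the node names and values below are just placeholders, not the real node internals):

```python
from langfuse import Langfuse

# Assumes LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST are set in the env
langfuse = Langfuse()

# Placeholder values standing in for what each graph node would actually produce
user_prompt = "List me all the projects with their descriptions"
html = "<html>...</html>"
llm_prompt = "Answer the user question using this content: ..."
llm_answer = '{"projects": []}'

trace = langfuse.trace(name="smart_scraper_run", input={"prompt": user_prompt})

# One span per graph node, so each step (fetch, parse, generate) shows up in the UI
fetch_span = trace.span(name="fetch_node", input={"url": "https://example.com"})
fetch_span.end(output={"html_length": len(html)})

# LLM calls go in as generations so Langfuse can show prompt, completion and usage
generation = trace.generation(
    name="generate_answer_node",
    model="gpt-3.5-turbo",
    input=llm_prompt,
)
generation.end(output=llm_answer)

trace.update(output=llm_answer)
langfuse.flush()  # make sure everything is sent before the script exits
```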
Thanks Marc (and thank you for building Langfuse, no doubt it is a useful tool)! I modified the generate_answer_node locally and traced it in Langfuse; I will be working on the other nodes soon and will push it. Hope this will be helpful for others.
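The pattern is basically the following; this is only a sketch of the idea with a stand-in chain, not the actual code of generate_answer_node:

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langfuse.callback import CallbackHandler  # Langfuse's LangChain integration (SDK v2)

# Reads LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_HOST from the environment
langfuse_handler = CallbackHandler()

# Stand-in for the chain that the node builds internally
prompt = ChatPromptTemplate.from_template(
    "Answer the user question using only this content:\n{context}\n\nQuestion: {question}"
)
chain = prompt | ChatOpenAI(model="gpt-3.5-turbo")

# Passing the handler as a callback is what sends on_llm_start / on_llm_end
# (prompt, completion, token usage) to Langfuse for this node
answer = chain.invoke(
    {"question": "List me all the projects", "context": "<parsed html chunks here>"},
    config={"callbacks": [langfuse_handler]},
)
print(answer.content)
```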
Hi @El-Hassan-Hajbi, make the pull request when you are ready.
Thanks team for your great work.
Can you provide an example of how to use "callbacks" from ScrapeGraphAI?
And, if possible, an integration with Langfuse traces.
Thanks.