Open vincelwt opened 2 weeks ago
Great contribution! Does this also support the open-source variant of Lunary?
That is, if we install Lunary locally, can we use the integration in this PR to connect natively to the local variant, or only to the commercial one?
Hi @BigFoxMedia, yes, absolutely, this is completely compatible with the open-source version. If you install Lunary locally, you just have to set the endpoint to http://localhost:3333 in the settings :)
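For anyone wiring this up against a self-hosted instance, the whole change is pointing the integration's endpoint at the local server instead of the hosted cloud. A minimal sketch (the variable name below is illustrative, not an official setting; in Flowise the same URL goes into the Lunary settings field mentioned above):

```shell
# Hypothetical sketch: the variable name is a stand-in for the endpoint
# field in the Flowise settings UI, not a documented env var.
LUNARY_ENDPOINT="http://localhost:3333"   # locally hosted open-source Lunary
echo "Reporting to: $LUNARY_ENDPOINT"
```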
Hi!
This PR extends the Lunary integration to automatically report user conversations via the Threads feature (not just the individual LLM calls), and to report user data when lead capture is enabled.
(Lunary is an open-source LLM monitoring toolkit :) ).
This was requested by many of our mutual users.
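Conceptually, the reporting flow this PR adds can be sketched as below. The identifiers are hypothetical stand-ins, not the actual Lunary SDK surface; they only model the shape of the change: one thread per conversation, one event per message, plus the captured user attached when lead capture is on:

```javascript
// Hypothetical sketch of the reporting flow described above.
// None of these names are the real Lunary SDK API; they illustrate
// how a conversation and an optional captured lead map to events.
function reportConversation(thread, messages, lead) {
  const events = [];
  for (const msg of messages) {
    // One event per chat message, tied to the conversation thread.
    events.push({ threadId: thread.id, role: msg.role, content: msg.content });
  }
  if (lead) {
    // Lead capture enabled: attach the captured user data to the thread.
    events.push({ threadId: thread.id, type: "user", userProps: lead });
  }
  return events;
}

const events = reportConversation(
  { id: "t1" },
  [
    { role: "user", content: "Hi" },
    { role: "assistant", content: "Hello!" },
  ],
  { email: "jane@example.com" }
);
console.log(events.length);
```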
For example this is a Flowise conversation:
Now visible in Lunary:
Also with full support for tracing each assistant message:
I tested this extensively (both with and without Lunary enabled), but let me know if anything else is required!
Cheers