FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[FEATURE] Use prompts defined in Langfuse for Flowise Chatflows #1603

Open · dkindlund opened this issue 8 months ago

dkindlund commented 8 months ago

Describe the feature you'd like
Langfuse supports the ability to store and manage prompts within its UI. It would be awesome if there were an easy way to fetch those prompts dynamically into a specific chatflow within Flowise.

Additional context
Langfuse has a prompts_get API endpoint to support this functionality: https://langfuse.com/docs/api-reference#tag/Prompts
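
For reference, a rough sketch of what fetching a prompt dynamically could look like using the Langfuse JS SDK rather than the raw prompts_get endpoint (the prompt name, the {{topic}} placeholder, and the environment variable names below are made up for illustration):

```javascript
// Minimal sketch: pull a prompt managed in Langfuse and fill in its variables.
// Assumes the `langfuse` npm package plus API keys in environment variables.
const { Langfuse } = require("langfuse");

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: "https://cloud.langfuse.com", // or your self-hosted Langfuse URL
});

async function main() {
  // "chatflow-system-prompt" is a placeholder name for a text prompt stored in Langfuse.
  const prompt = await langfuse.getPrompt("chatflow-system-prompt");

  // compile() substitutes {{variables}} in the stored prompt template.
  const compiled = prompt.compile({ topic: "observability" });
  console.log(compiled);
}

main();
```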

The scope of work would be:

It seems easy enough, and I could probably write this as a custom JS node, but:

1) It would be better if someone from Langfuse supported this, and
2) There's no way for custom JS nodes to easily fetch credential info (and I really don't want to hard-code credentials in custom JS nodes anymore).

Hey @HenryHengZJ and @marcklingen, let me know your thoughts on this.

dkindlund commented 8 months ago

Side note btw, @marcklingen -- it would be great if the prompts_get endpoint supported an optional version=latest param so that you didn't have to constantly update those version numbers by hand every time.

marcklingen commented 8 months ago

> Side note btw, @marcklingen -- it would be great if the prompts_get endpoint supported an optional version=latest param so that you didn't have to constantly update those version numbers by hand every time

If you do not specify a version, the latest version marked as active in Langfuse is used. No need to update versions every time you make a change.
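
To illustrate, with the Langfuse JS SDK the same behavior falls out of getPrompt: leave out the version argument to get the latest active version, or pass one to pin it (the prompt name and version number below are placeholders):

```javascript
const { Langfuse } = require("langfuse");

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
});

async function main() {
  // No version argument: resolves to the latest version marked as active in Langfuse.
  const latest = await langfuse.getPrompt("chatflow-system-prompt");

  // An explicit version argument pins a specific prompt version (useful for reproducibility).
  const pinned = await langfuse.getPrompt("chatflow-system-prompt", 3);

  console.log("latest:", latest.version, "pinned:", pinned.version);
}

main();
```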

HenryHengZJ commented 8 months ago

One way is to specify the Langfuse credential as a variable and use $vars in the custom function for that.
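
For anyone who wants to try that approach before an official node exists, a rough sketch of a Custom JS Function body is below. The Flowise variable names (langfusePublicKey, langfuseSecretKey), the prompt name, and the exact endpoint path and response shape are assumptions -- check them against the prompts_get API reference linked above and against what your Flowise sandbox exposes (e.g. global fetch and Buffer):

```javascript
// Sketch of a Flowise Custom JS Function body (not an official node).
// Reads the Langfuse keys from Flowise Variables via $vars so nothing is hard-coded.
const publicKey = $vars.langfusePublicKey;
const secretKey = $vars.langfuseSecretKey;

// Endpoint path based on the prompts_get reference above; adjust host/path for your deployment.
const url = "https://cloud.langfuse.com/api/public/prompts?name=chatflow-system-prompt";

// Langfuse's public API uses basic auth with publicKey:secretKey.
// Assumes global fetch and Buffer are available in the sandbox (Node 18+).
const res = await fetch(url, {
  headers: {
    Authorization: "Basic " + Buffer.from(`${publicKey}:${secretKey}`).toString("base64"),
  },
});

if (!res.ok) {
  throw new Error(`Langfuse prompt fetch failed: ${res.status}`);
}

const data = await res.json();

// For a text prompt, the `prompt` field holds the template text (verify against the response schema).
return data.prompt;
```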

petitgen commented 5 months ago

Are there any updates on this issue? Would a public PR be accepted to add this feature?

thiagolealassis commented 2 months ago

+1