baptisteArno opened this issue 3 months ago
Or maybe users would add their credentials in their workspace settings so that we don't have to worry about usage, and it matches Typebot's philosophy of letting users decide and control their own expenses on AI services. We could also let them customize the prompts, use any AI provider, etc.
That would be powerful
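A bring-your-own-key flow like the one described above could look roughly like this. This is only a sketch; all type and function names (`WorkspaceAiSettings`, `buildTitleRequest`, the model IDs, the default prompt) are hypothetical, not actual Typebot identifiers:

```typescript
// Hypothetical sketch: build a chat-completion request for a group title,
// using a system prompt the user can override in workspace settings.
type WorkspaceAiSettings = {
  provider: "openai" | "openrouter";
  apiKey: string; // user-provided, so usage is billed to the user
  titlePrompt?: string; // optional custom system prompt
};

const defaultTitlePrompt =
  "Summarize this chatbot group in a short title (max 5 words).";

const buildTitleRequest = (
  settings: WorkspaceAiSettings,
  groupContent: string
) => ({
  // Model names here are illustrative placeholders
  model: settings.provider === "openai" ? "gpt-4o-mini" : "openrouter/auto",
  messages: [
    { role: "system", content: settings.titlePrompt ?? defaultTitlePrompt },
    { role: "user", content: groupContent },
  ],
});
```

The actual HTTP call would then be made with the workspace's stored key, so costs stay entirely on the user's side.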
That sounds good to me! Do we have any similar AI-powered features in the current version of Typebot? I can take that as a reference to structure and modularise the code required for this and all subsequent AI features :)
Cool, thanks a lot! I understand your vision to a certain extent; however, I would like to spend some more time thinking through this implementation. I will comment in this thread again (very soon!) once I have a clearer idea.
Hi @baptisteArno, Sorry for the delayed response.
I've gone through the code logic for connecting groups and identified where to trigger an API call each time a group is connected to the next one (conditionally, if the user has valid LLM API credentials stored in the workspace settings). The endpoint would return the AI-generated title, which we would then update in the UI.
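The guard described above could be sketched as a small predicate that runs before the API call. This is a minimal sketch with assumed names (`Group`, `shouldGenerateTitle`, `isAutoTitled`), not Typebot's actual data model:

```typescript
// Hypothetical group shape: track whether the current title was
// auto-generated so we never overwrite a title the user typed manually.
type Group = { id: string; title: string; isAutoTitled: boolean };

// Only fire the title-generation call when the workspace has valid LLM
// credentials AND the group is untitled or still carrying an auto title.
const shouldGenerateTitle = (
  group: Group,
  hasValidLlmCredentials: boolean
): boolean =>
  hasValidLlmCredentials && (group.title === "" || group.isAutoTitled);
```

Checking `isAutoTitled` avoids a subtle annoyance: without it, reconnecting a group would clobber any title the user had already customized.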
I would like to point out the following:
Is it best practice to store users' API keys in our database? Or should we store them in the browser's local storage / session storage to enhance security?
Since there will be multiple in-app AI features in the chatbot builder, would it be a good idea to provide users with more granular control for each of these features? For example, they could switch on/off automatic block title generation / codeblock validation or other features, as well as modify the system prompts for each of them. We could dedicate an entire section for that in the workspace settings.
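The granular controls described above could be modeled as a per-feature settings object in the workspace. This is just a sketch of one possible shape; every name here (`AiFeatureSettings`, `WorkspaceAiFeatures`, the feature keys) is an assumption:

```typescript
// One toggle + optional custom system prompt per in-app AI feature
type AiFeatureSettings = { enabled: boolean; systemPrompt?: string };

type WorkspaceAiFeatures = {
  groupTitleGeneration: AiFeatureSettings;
  codeBlockValidation: AiFeatureSettings;
};

// Assumed defaults: titles on, validation off
const defaultAiFeatures: WorkspaceAiFeatures = {
  groupTitleGeneration: { enabled: true },
  codeBlockValidation: { enabled: false },
};

// Resolve the prompt for a feature, falling back to a built-in default
// when the user hasn't customized it
const resolvePrompt = (s: AiFeatureSettings, fallback: string): string =>
  s.systemPrompt ?? fallback;
```

Keeping each feature's toggle and prompt together would let the dedicated settings section render one row per feature without special-casing anything.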
I noticed that in the credentials section there is an OpenAI block where users can save their API keys for use in the chatbot flow. Should we somehow connect that to the in-app AI features, or should we keep them completely separate for the time being?
Please let me know your thoughts on these points. Thanks.
Appreciate the follow-up!
See `createCredentials.ts` for the server logic. I don't really see the point of saving them in local storage instead.
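For context on why database storage is reasonable here: credentials can be encrypted at rest before being persisted. The sketch below shows the general AES-256-GCM pattern using Node's `crypto` module; the helper names and shapes are assumptions for illustration, not the actual code in `createCredentials.ts`:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

// Encrypt a credentials payload before saving it to the database.
// The key must be exactly 32 bytes for AES-256-GCM (e.g. derived from
// an ENCRYPTION_SECRET environment variable).
const encrypt = (plain: string, key: Buffer) => {
  const iv = randomBytes(12); // fresh IV per encryption
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  return { iv, data, tag: cipher.getAuthTag() };
};

// Decrypt on the server when the credential is needed for an API call;
// the auth tag check makes tampering with stored ciphertext detectable.
const decrypt = (
  { iv, data, tag }: { iv: Buffer; data: Buffer; tag: Buffer },
  key: Buffer
) => {
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(data), decipher.final()]).toString(
    "utf8"
  );
};
```

With this pattern the raw key never needs to leave the server, whereas local storage would expose it to any script running on the page.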
@baptisteArno I’d love to contribute to this by implementing a system prompt for generating group titles using any LLM API of your choice (OpenAI, OpenRouter, or another). One challenge I foresee is controlling usage to avoid unpredictable billing costs. Do you have any specific thoughts on managing this?
We could approach it in one or both of the following ways:

- Limit usage to a weekly/monthly quota per workspace.
- Let users supply their own API keys so costs stay on their side.
For reference, Dub.co uses a similar approach for generating link names with AI, limiting usage to a certain weekly/monthly quota to prevent misuse. I've attached a screen recording for context.

Screencast of Dub.co
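A quota like the Dub.co one described above could be enforced with a simple counter per workspace. In practice the counts would live in the database and reset on a schedule; the in-memory map, the limit value, and the function name below are all assumptions for illustration:

```typescript
// Assumed weekly cap on AI title generations per workspace
const WEEKLY_LIMIT = 50;

// workspaceId -> calls made this week (database-backed in practice)
const usage = new Map<string, number>();

// Returns true and records the call if the workspace is under quota;
// returns false once the cap is reached so the caller can skip the
// LLM request (or fall back to the user's own API key).
const tryConsume = (workspaceId: string): boolean => {
  const used = usage.get(workspaceId) ?? 0;
  if (used >= WEEKLY_LIMIT) return false;
  usage.set(workspaceId, used + 1);
  return true;
};
```

Combining both ideas is also possible: serve the first N generations from a shared quota, then require the workspace's own credentials beyond that.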