Closed · garussell closed this 7 months ago
How would the chatbot work, and how is it different from what the OpenAI service already does? 1. It is built around WebSockets. 2. If the user is 'subscribed' to a chat channel, other manual functionality can go away. The chatbot can be used to create different artisan cards.
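The "subscribed to a chat channel" idea can be sketched as a minimal in-memory pub/sub hub. This is an assumption-level sketch, not the real service: the names (`ChatChannels`, `subscribe`, `publish`) are illustrative, and in practice each subscriber callback would push the message down a per-client WebSocket instead of into a local array.

```javascript
// Minimal sketch of channel subscriptions. In the real service each
// callback would write to a client's WebSocket; here it is just a function.
class ChatChannels {
  constructor() {
    this.channels = new Map(); // channel name -> Set of subscriber callbacks
  }
  subscribe(channel, onMessage) {
    if (!this.channels.has(channel)) this.channels.set(channel, new Set());
    this.channels.get(channel).add(onMessage);
    // Return an unsubscribe function so callers can clean up on disconnect.
    return () => this.channels.get(channel).delete(onMessage);
  }
  publish(channel, message) {
    const subs = this.channels.get(channel) || new Set();
    for (const onMessage of subs) onMessage(message);
    return subs.size; // how many subscribers were notified
  }
}

// Usage: a subscribed user receives chatbot output pushed over the channel.
const hub = new ChatChannels();
const received = [];
hub.subscribe('artisan-cards', (msg) => received.push(msg));
hub.publish('artisan-cards', 'draft card created');
console.log(received); // [ 'draft card created' ]
```

Once the chatbot can push results to subscribers like this, the manual request/refresh flows it replaces can be removed.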
Does this run into a limitation of using SQL? Could the chatbot create a different database table (or a MongoDB collection)?
I like the idea of using a different db service for a different AI service; however, at that point it becomes a different app. We can refine the 'artist_files' card to be a bit more informative for 'artisan' and use the chatbot feature to change the input prompts for the different prompt methods in the existing aiService.
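Letting the chatbot swap input prompts could look something like the sketch below. All names here (`promptTemplates`, `setPrompt`, `buildPrompt`, the `artisanCard` method) are hypothetical stand-ins for the existing aiService prompt methods, just to show the shape of the idea:

```javascript
// Hypothetical: one editable template per aiService prompt method.
const promptTemplates = {
  artisanCard: 'Describe the artisan named {name} in two sentences.',
};

// A chat command (e.g. "/prompt artisanCard <new template>") would call this
// to change the prompt without touching the method's code.
function setPrompt(method, template) {
  if (!(method in promptTemplates)) throw new Error(`unknown method: ${method}`);
  promptTemplates[method] = template;
}

// The normal OpenAI request path fills the template as before.
function buildPrompt(method, vars) {
  return promptTemplates[method].replace(/\{(\w+)\}/g, (_, k) => vars[k] ?? '');
}

setPrompt('artisanCard', 'Write an artisan card for {name} focusing on craft.');
console.log(buildPrompt('artisanCard', { name: 'Ada' }));
// Write an artisan card for Ada focusing on craft.
```

The point of this shape is that the chatbot only edits data (templates), so the aiService methods themselves stay unchanged.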
This was a valuable experiment because it helped me understand the need for a chatbot service in the form of a separate repo.
This repo will not house the WebSocket chatbot feature, but a new repo will. I want to experiment with Docker to containerize the different repos and deployments.
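For the containerization experiment, the new chatbot repo could start from a Dockerfile along these lines. This is a sketch under assumptions (Node base image, port 8080, `server.js` entry point); the actual service would dictate the details:

```dockerfile
# Hypothetical Dockerfile for the separate chatbot repo.
FROM node:20-alpine
WORKDIR /app

# Install dependencies first so this layer is cached between code changes.
COPY package*.json ./
RUN npm ci --omit=dev

COPY . .
EXPOSE 8080
CMD ["node", "server.js"]
```

Each repo getting its own Dockerfile like this is what lets the back end and the chatbot service be deployed independently later.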
This is going to happen!
Build the back end first, then separate issues for front-end integration.
(This is also where it starts to make sense to separate the back end with its AI services and deploy them separately (in the cloud), as well as utilize Docker, then build the front end using a framework.)