Open matteoveglia opened 1 year ago
Thanks for sharing the tool; I didn't know it.
We could handle multiple AI APIs; I need to think of the best way to do it, though 🤔.
Maybe a configuration view, where you can specify which AI APIs you want to use and the given model parameters for the query. However, the app itself should not host the model since it's a UI to interface with AI APIs. So you would run the Llama model in a separate service and make queries via the app.
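To make the idea concrete, here is a minimal sketch of what such a per-provider configuration might look like. Every field name here is hypothetical (there is no such schema in the app today), and the dalai address/port is assumed, not verified:

```javascript
// Hypothetical configuration shape for a multi-provider setup.
// None of these field names exist in the app; this is only a sketch.
const config = {
  providers: [
    {
      name: "openai",           // hosted API
      baseUrl: "https://api.openai.com/v1",
      model: "gpt-3.5-turbo",
      params: { temperature: 0.7, max_tokens: 512 },
    },
    {
      name: "dalai",            // self-hosted Llama/Alpaca via a separate dalai service
      baseUrl: "http://localhost:3000", // assumed local dalai address
      model: "llama.7B",
      params: { temperature: 0.8, n_predict: 256 },
    },
  ],
  defaultProvider: "dalai",
};

// Look up the active provider's settings before issuing a query.
function activeProvider(cfg) {
  return cfg.providers.find((p) => p.name === cfg.defaultProvider);
}
```

The point is just that the app stores connection details and query parameters per provider, while the models themselves run elsewhere.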
What do you think?
Yeah, absolutely. Both this project and Dalai would remain separate; I think treating your app as more of a 'model-agnostic' portal to AI models would be the best approach.
The dalai tool allows interfacing through a socket.io connection, and it looks like you can do quite a bit with that, so I think that's the best route.
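As a rough illustration, a client could talk to a dalai server along these lines. The event names ("request", "result") and payload fields below are assumptions about dalai's socket.io API, not a verified interface, so check the project's README before relying on them:

```javascript
// Hypothetical sketch of querying a dalai server over socket.io.
// Event names and payload fields are assumed, not verified.

// Build the request payload separately so it can be reused and tested.
function buildDalaiRequest(prompt, opts = {}) {
  return {
    prompt,
    model: opts.model || "llama.7B",           // assumed model identifier
    n_predict: opts.n_predict || 128,          // tokens to generate
    temperature: opts.temperature ?? 0.8,
  };
}

// Sending side (requires the socket.io-client package, not included here):
//
//   const io = require("socket.io-client");
//   const socket = io("http://localhost:3000"); // assumed dalai address
//   socket.emit("request", buildDalaiRequest("Hello, Llama!"));
//   socket.on("result", ({ response }) => process.stdout.write(response));
```

Keeping the payload builder separate from the socket wiring would also make it easy to swap in other providers behind the same interface.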
Hi! Seeing as this is a self-hosted project anyway, it would be great to also have the ability to connect to the dalai service created by this project: https://github.com/cocktailpeanut/dalai
The project is designed specifically for interaction with the Llama and Alpaca models, of all sizes. It has its own web interface, but that's more for testing than actual production use, unlike ai-chat-app.