agaldemas opened 3 months ago
Alain, please create a branch from feature/fiware_integration in order to try out the local model using the CB.
sorry @dncampo, @javicond3,
I did not see your message; I forked the repo to my account: https://github.com/agaldemas/Poc_LLM_CB_FIWARE and made a pull request... maybe it wasn't the best way to do it, but you can merge it on the feature/fiware_integration branch !!!
I will create a new branch for the Flowise integration...
@javicond3
Your idea of using openai.beta.threads and openai.beta.assistants from the OpenAI API was to keep the context in a thread and to use an assistant to tune the behavior of the chat provided by OpenAI, but unfortunately that is not supported by Ollama...
I had to use openai.chat.completions, which works, but we lose the context of the conversation :( ...
=> this is a feature we can introduce using Flowise (which uses LangChain under the hood)
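To make the difference concrete, here is a minimal sketch of the stateless workaround (not the PoC's actual code; it assumes Ollama running locally on its default port with a llama3 model pulled, and a dummy api_key since Ollama ignores it):

```python
from openai import OpenAI

# Ollama exposes an OpenAI-compatible endpoint; the api_key is required
# by the client but ignored by Ollama.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# A single, stateless call: nothing about this exchange is remembered
# server-side, unlike the threads/assistants API.
response = client.chat.completions.create(
    model="llama3",
    messages=[
        {"role": "system", "content": "You answer questions about FIWARE POIs."},
        {"role": "user", "content": "Which points of interest are near me?"},
    ],
)
print(response.choices[0].message.content)
```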
Yes, threads are only supported by OpenAI for now. You can send the context (or a summary of it) in every iteration, but that is not efficient... Does Flowise support it?
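To illustrate the "send the context in every iteration" idea, a rough sketch (same assumptions as above: a local Ollama with a llama3 model); the whole message history is resent on each turn, which is why it becomes inefficient as the conversation grows:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
history = [{"role": "system", "content": "You answer questions about FIWARE POIs."}]

def ask(question: str) -> str:
    # Resend the whole history on every call: chat.completions is stateless,
    # so the client has to carry the conversation itself.
    history.append({"role": "user", "content": question})
    response = client.chat.completions.create(model="llama3", messages=history)
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer
```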
you can do this in many different ways with Flowise: manage the prompt, store the context, inject embeddings prepared in a vector DB, etc....
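For what it's worth, here is a rough sketch of querying a deployed Flowise chatflow over its REST prediction endpoint; the host, chatflow id, and sessionId are placeholders, and the context-keeping behavior depends on which memory node the flow actually uses:

```python
import requests

# Placeholder URL: replace <chatflow-id> with the id of your deployed chatflow.
FLOWISE_URL = "http://localhost:3000/api/v1/prediction/<chatflow-id>"

payload = {
    "question": "Which points of interest are near me?",
    # A sessionId lets a memory node in the flow keep per-conversation context.
    "overrideConfig": {"sessionId": "user-42"},
}

reply = requests.post(FLOWISE_URL, json=payload, timeout=60)
print(reply.json()["text"])
```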
[ ] introduce Flowise in the loop
[x] use local llm (ollama + compatible model)
[ ] add vector storage to store retrieved POIs as embeddings, then use it as a retriever to augment the prompt instead of passing the context-broker results directly (see the sketch after this list)...
[ ] add an additional agent that can search the web and add some results to the response
[x] format the output better (render the returned Markdown as HTML)
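As a rough illustration of the vector-storage item above (again an assumption-level sketch, not the PoC's code): embed POI descriptions fetched from the Context Broker into a local vector store and use it as a retriever, assuming langchain-community, a Chroma backend, and an Ollama embedding model such as nomic-embed-text:

```python
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma

# Hypothetical POI descriptions, standing in for entities retrieved
# from the FIWARE Context Broker.
pois = [
    "Museo del Prado, art museum, Madrid",
    "Parque del Retiro, public park, Madrid",
]

# Embed the POIs with a local Ollama model and index them in Chroma.
store = Chroma.from_texts(
    pois,
    embedding=OllamaEmbeddings(model="nomic-embed-text"),
)

# Use the store as a retriever: the top-k matching POIs would be injected
# into the prompt instead of the raw context-broker results.
retriever = store.as_retriever(search_kwargs={"k": 2})
print([doc.page_content for doc in retriever.invoke("parks near me")])
```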