Poc_LLM_CB_FIWARE

Apache License 2.0

ideas from meeting with David and more ! #1

Open agaldemas opened 3 months ago

dncampo commented 3 months ago

Alain, please create a branch from feature/fiware_integration in order to try out the local model using the CB

agaldemas commented 3 months ago

sorry @dncampo, @javicond3,

I did not see your message; I forked the repo to my account: https://github.com/agaldemas/Poc_LLM_CB_FIWARE and made a pull request... maybe it wasn't the best way to do it, but you can merge it on the feature/fiware_integration branch!!!

Tell me if you want me to create the branch anyway...

agaldemas commented 3 months ago

I will create a new branch for the flowise integration...

agaldemas commented 3 months ago

@javicond3

Your idea of using openai.beta.threads and openai.beta.assistants from the openai API was to keep the context in a thread and use an assistant to tune the behavior of the chat, as provided by OpenAI, but unfortunately they are not supported by ollama...

I had to use openai.chat.completions, which works, but we lose the context of the conversation :( ...

=> this is a feature we can introduce using flowise (which uses langchain under the hood)
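For reference, a minimal sketch of the chat.completions workaround described above. It assumes ollama's OpenAI-compatible API is served at http://localhost:11434/v1 and that a llama3 model has been pulled locally (both assumptions, not taken from this thread); since the endpoint is stateless, the caller has to carry the message history itself and re-send it on every call:

```python
# Sketch: keeping conversation context by hand with the stateless
# chat completions endpoint (openai.beta.threads is OpenAI-only, so with
# ollama the full history must be re-sent on every request).
# Assumptions: ollama's OpenAI-compatible API at localhost:11434 and a
# locally pulled "llama3" model.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/v1/chat/completions"  # assumption

class ChatSession:
    """Carries the message history that the stateless endpoint does not keep."""

    def __init__(self, system_prompt):
        self.messages = [{"role": "system", "content": system_prompt}]

    def ask(self, question, model="llama3"):  # model name is an assumption
        self.messages.append({"role": "user", "content": question})
        body = json.dumps({"model": model, "messages": self.messages}).encode()
        req = urllib.request.Request(
            OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req) as resp:
            answer = json.load(resp)["choices"][0]["message"]["content"]
        # remember the answer so the next turn still sees the full context
        self.messages.append({"role": "assistant", "content": answer})
        return answer
```

With the official openai Python client the equivalent call is `client.chat.completions.create(model=..., messages=session.messages)` on a client constructed with `base_url="http://localhost:11434/v1"`.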

javicond3 commented 3 months ago

Yes, threads are only supported by OpenAI for now. You can send the context (or a summary) in every iteration, but it is not efficient... Does flowise support it?
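The "send a summary in every iteration" idea can be sketched as follows: once the history grows past a budget, older turns are folded into a single summary message so each request stays small. The `summarize()` below is a trivial stand-in for what would really be another LLM call (that stand-in, and the `keep_last` budget, are assumptions for illustration):

```python
# Sketch of summarizing the conversation so each request stays small.
# summarize() is a toy stand-in: a real setup would ask the model itself
# to produce the summary of the older turns.

def summarize(turns):
    # stand-in for an LLM summarization call over the dropped turns
    return "Summary of earlier turns: " + " | ".join(
        t["content"][:40] for t in turns
    )

def compact(messages, keep_last=4):
    """Keep the system prompt and the last `keep_last` turns; fold the rest
    into one summary message."""
    system, turns = messages[:1], messages[1:]
    if len(turns) <= keep_last:
        return messages
    summary = {"role": "system", "content": summarize(turns[:-keep_last])}
    return system + [summary] + turns[-keep_last:]
```

Calling `compact()` on the message list before each request bounds the tokens sent per iteration, at the cost of the extra summarization step.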

agaldemas commented 3 months ago

> Yes, threads are only supported by OpenAI for now. You can send the context (or a summary) in every iteration, but it is not efficient... Does flowise support it?

you can do this in many different ways with flowise: manage the prompt, store the context, inject embeddings prepared in a vector db, etc....
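The "inject embeddings prepared in a vector db" pattern that flowise/langchain wire together can be sketched in a few lines: embed the stored documents, embed the query, and inject the nearest documents into the prompt. The `embed()` below is a toy bag-of-words stand-in for a real embedding model (e.g. one served by ollama), and the in-memory store stands in for an actual vector db; both are assumptions for illustration:

```python
# Sketch of retrieval-based context injection: nearest documents by cosine
# similarity are placed into the prompt. embed() is a toy word-count
# embedding; a real setup would use a dense embedding model and a vector db.
import math
from collections import Counter

def embed(text):
    # toy embedding: word counts (a real model returns a dense float vector)
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class TinyVectorStore:
    def __init__(self):
        self.docs = []  # list of (embedding, text) pairs

    def add(self, text):
        self.docs.append((embed(text), text))

    def retrieve(self, query, k=2):
        q = embed(query)
        scored = sorted(self.docs, key=lambda d: cosine(q, d[0]), reverse=True)
        return [text for _, text in scored[:k]]

def build_prompt(store, question):
    # inject the retrieved context ahead of the user's question
    context = "\n".join(store.retrieve(question))
    return f"Use this context:\n{context}\n\nQuestion: {question}"
```

The resulting prompt is then sent through the usual chat.completions call; this is essentially what a flowise flow with a vector store node does under the hood.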