M340i opened this issue 4 months ago
Not without file injection at run.py time, which would hold a tokenized txt file of the conversation history. It would become part of the task_prompt file, along with a whole prompt-engineering construct much like the ones we use today, except more in-depth and with an increased ability to create rules for an agent. This is not in the package as of yet. I have started my own run.py version, with a link to the file in the first draft; see Issue #321. In the end it will inject files where we currently have to type the task, name, and org.
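For illustration only, here is a minimal sketch of what that injection step could look like at run.py time. The file name `conversation_history.txt`, the `build_task_prompt` helper, and the way task/name/org are folded in are my own placeholders, not anything that exists in the package today; tiktoken is used purely as an example tokenizer.

```python
from pathlib import Path

import tiktoken  # assumed available; any tokenizer would do

HISTORY_FILE = Path("conversation_history.txt")  # hypothetical injected file
ENCODING = tiktoken.get_encoding("cl100k_base")


def build_task_prompt(task: str, name: str, org: str, max_history_tokens: int = 2000) -> str:
    """Fold prior conversation history into the task prompt instead of retyping it."""
    history = HISTORY_FILE.read_text(encoding="utf-8") if HISTORY_FILE.exists() else ""

    # Trim the oldest part of the history so the injected block stays within a token budget.
    tokens = ENCODING.encode(history)
    if len(tokens) > max_history_tokens:
        history = ENCODING.decode(tokens[-max_history_tokens:])

    return (
        f"You are {name}, acting on behalf of {org}.\n"
        f"Prior conversation history:\n{history}\n\n"
        f"Current task:\n{task}\n"
    )


if __name__ == "__main__":
    print(build_task_prompt("Draft a project summary", "ResearchAgent", "ExampleOrg"))
```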
So will it require more tokens during task creation?
That would depend on the specificity of the task outline; more information will define the task better.
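As a rough illustration of that trade-off, a quick token count (again using tiktoken only as an example tokenizer; the outlines below are made up) shows how a more specific outline costs more tokens up front but leaves far less ambiguity:

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

terse = "Write a market report."
detailed = (
    "Write a market report for Q3 covering EV sales in the EU, "
    "broken down by country, with a two-paragraph executive summary "
    "and a table of the top five manufacturers."
)

# The detailed outline uses several times as many tokens,
# but it defines the task well enough for an agent to act on.
print(len(enc.encode(terse)))
print(len(enc.encode(detailed)))
```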
@Linuxscot1974 May I ask whether you have tried out the Redis ChatGPT memory, and if so, what are the differences between the LangChain and Redis ChatGPT memory?
> In the end it will inject files where we currently have to type the task, name, and org.
So it means that we end up having more parameters for output creation?
I noticed that there is no memory hosting the past work done by the custom multi-agents. Is it possible to integrate LangChain to store the input/output records of each agent for use in future output generation?
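Something along those lines should be possible. A minimal sketch using LangChain's buffer memory backed by Redis is below; note that import paths vary by LangChain version, and the per-agent session naming scheme is just an assumption, not part of this project.

```python
from langchain.memory import ConversationBufferMemory
from langchain_community.chat_message_histories import RedisChatMessageHistory


def agent_memory(agent_name: str) -> ConversationBufferMemory:
    """Give each agent its own persistent history, keyed by name (naming scheme assumed)."""
    history = RedisChatMessageHistory(
        session_id=f"agent:{agent_name}",
        url="redis://localhost:6379/0",
    )
    return ConversationBufferMemory(chat_memory=history, return_messages=True)


# Record one exchange now and read it back later, e.g. on the next run.
memory = agent_memory("research-agent")
memory.save_context({"input": "Summarise prior findings"}, {"output": "Findings: ..."})
print(memory.load_memory_variables({}))
```

Because the history lives in Redis rather than in process memory, each agent's past inputs and outputs survive between runs and can be injected into its next prompt.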